Dec 30, 2017

"Reflections on Trusting Trust": https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Dec 04, 2017

I think you are referring to “Reflections on Trusting Trust” by Ken Thompson.

https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Nov 23, 2017

>I don't follow your logic there, care to elaborate? Banning is done by Apple; legality is determined by courts, based on laws. Has any of these apps ruled illegal by court? Of course not. As I already wrote, company policies are not 1:1 map to laws, there's much more that goes into them, especially things like business interests and partnerships, but also things like ideology or subjective moral judgement.<

This is literally the third line of the article: "We have been notified by the Ministry of Public Security that a number of voice over internet protocol apps do not comply with local law. Therefore these apps have been removed from the app store in China."

>You can't be serious. So your grandmother is going to found a company, then get a DUNS number, so she can sideload an app?<

You said you can't sideload apps, and that is proof that you can; it's how companies deploy apps that are not on the App Store. It has nothing to do with grandparents; those are two separate things. I remain unconvinced that it's easier to sideload an app from dubious sources than to download one from a sanctioned App Store.

>The second link says exactly nothing about sideloading. On contrary, it has big Apple Store button.<

The App Store is the official way to get into the Cardiogram program. But you can also join the mRhythm study, which is not offered on the App Store: they send you an email link, you tap it, then you download the profile and the app. That's how you sideload apps.

So I've provided two real-life examples of how sideloading is done on iOS.

>You don't seem to understand, that curated marketplace and sideloading are not mutually exclusive. Those, who want that marketplace, can choose from curated selection. Those, who want to sideload, can. In your model, where the curation is enforced on everyone, it is being turned into control for what's allowed and what is not.<

I agree that in an ideal world, having both a curated marketplace/walled garden and the option to sideload would be good. In practice, the closest thing to this idealized model is actually iOS, not Android, because Android even in its most "official" form is sponsored by a company whose business is to spy on its users (refer to the earlier citation about being busted by Quartz). We can keep arguing in circles about "open source" and "code audits," but Ken Thompson pretty much shut that down with his Turing Award lecture. [1] And as recently as a few days ago, Google was again shown to be untrustworthy.

[1] https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Nov 21, 2017

This reminded me of the famous lecture by Ken Thompson, Reflections on Trusting Trust:

https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Basically, you have to trust the compiler because it compiles all the code on your system, including itself. Not entirely the same thing, and I think the Intel trick is more nefarious.
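The self-compiling trap Thompson describes can be sketched in a few lines. This is a toy, nothing like a real compiler: the "compiler" is just a function that rewrites source text, and all names and strings here are made up for illustration.

```python
# Toy sketch of Thompson's "trusting trust" attack: a compromised
# "compiler" that injects a backdoor into a login program, and would
# re-inject that logic when compiling a clean copy of itself.

BACKDOOR = 'if password == "magic": return True  # injected'

def evil_compile(source: str) -> str:
    """Return 'compiled' output, silently rewriting the source."""
    # Stage 1: when compiling the login program, insert a backdoor.
    if "def check_password" in source:
        source = source.replace(
            "def check_password(password):",
            "def check_password(password):\n    " + BACKDOOR,
        )
    # Stage 2: when compiling the compiler itself from clean source,
    # re-insert this very injection logic, so inspecting the compiler's
    # source reveals nothing.
    if "def evil_compile" in source and BACKDOOR not in source:
        source += "\n# (injection logic re-inserted here)"
    return source

login_src = "def check_password(password):\n    return password == stored"
compiled = evil_compile(login_src)
print(BACKDOOR in compiled)  # prints True: the backdoor appears even
                             # though the login source never mentions it
```

The point of stage 2 is what makes the attack survive audits: the malicious logic lives only in the compiler binary, never in any source you can read.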

Sep 19, 2017

https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Sep 12, 2017

Your link doesn't seem to work for me, but I presume you're talking about Reflections On Trusting Trust[0]?

[0]: https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Sep 12, 2017

https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

(Reflections on Trusting Trust)

Aug 24, 2017

"Reflections on Trusting Trust" - Ken Thompson

https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Aug 13, 2017

"Trusting Trust" in the wild!? Nope. Just some fiction.

https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Aug 09, 2017

> While the paper I handled is stored in the machine, I am sure that the results are transmitted to the next link in the chain through some computer system.

How can you be sure about that?

> With so many links in the chain, it's my opinion that it's unreasonable to expect them all to be processed by people. It won't scale and I'm not convinced that it's that much safer anyway.

The point is that if you are not convinced, you can go and observe the process. The point is to remove as much trust as possible. The point is not to just have some human in the loop, but to make sure that people who distrust each other can personally make sure that the correct procedure is being followed.

> It would be my preference that the pieces of the system that perform this processing are backed with open source software.

The problem is that you have no way to verify that what is actually processing your vote is the open source software that you hope it is.

See also Ken Thompson's classic "reflections on trusting trust":

https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

> At the very least, if there is a case where tampering is suspected, officials of the court can compare the software on the machine with the software in the repository.

No, they can't. The only way to check what software is running on the computer is to use software that is running on the computer, which is thus also suspect. That is, short of decapping each and every chip in the one computer that you are trying to check and extracting all the circuitry and all storage bits in it.

> As painful as it is, I think we all need to trust the state, to some degree, to do the jobs that are the responsibility of the state.

But ensuring the trustworthiness of elections is not one of those jobs. Elections are the anchor for all the other trust we place in democratically elected governments; they are the one lever we have to remove a government that turns out not to be trustworthy. You cannot trust a government to remove itself when you want it replaced.

> Once the votes have been tallied for a district, isn't it possible to tamper with them as they are transmitted up the chain to the next link in the processing?

If the election is run properly: No.

Representatives from each party will be observing the election process at every polling station, and the general public can usually also observe if they wish, from the opening of the polls until the votes are counted. Election results should also be published broken down by polling station, so each observer can check that what they observed at their station actually matches what went into the total.
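The per-polling-station check described above needs nothing more than addition. A sketch, with all station names and numbers hypothetical:

```python
# Per-station tallies recorded by observers on site (hypothetical).
observed = {"station_1": 412, "station_2": 388, "station_3": 501}

# Tallies later published by the election authority, broken down by
# polling station, plus the announced district total (hypothetical).
published = {"station_1": 412, "station_2": 388, "station_3": 501}
announced_total = 1301

# Each observer checks their own station; anyone at all can check the sum.
for station, votes in observed.items():
    assert published[station] == votes, f"{station} tally was altered"
assert sum(published.values()) == announced_total, "total does not add up"
print("tallies consistent")  # prints "tallies consistent"
```

No single checker needs to be trusted: tampering at any station is visible to that station's observers, and tampering with the aggregation is visible to everyone.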

There is absolutely no place for trust in elections.

Aug 03, 2017

Just no.

Very relevant to this topic: Ken Thompson's "Reflections on Trusting Trust":

https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Apr 24, 2017

I don't know of a list but people have been aware of the potential for decades: https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Apr 12, 2017

The problem with voting machines runs much deeper than 'hacking the machine'. It's a matter of trust in the code and, perhaps more importantly, in all of the people and processes involved in the software, the hardware, and the running of the actual election.

Is the source open, and has it been audited? Has the tool chain been audited (eg: the attack described in Reflections on Trusting Trust [1])? Are they using reproducible builds[2]?

This includes not just the software loaded on the machine itself, but the tallying software used to count all the results.

Even if you can verify the code contains nothing like a "defeat device" [3] (eg: detect it's actually election day and only then enable vote-stealing mode), how do you know what's actually being used?

How does a voter verify that the build running on the machine is actually valid and the expected one? Even if the voter has to trust the people running the election, how do those people verify it? If all the polling stations load software onto the machines on election day (to ensure it's the right software), that opens up the possibility of someone injecting their own bad software. If they have to rely on a central organization loading the machines, there's a whole delegation of trust happening and being concentrated in one place -- easier to verify, in some ways, but also easier to compromise.

So the only way to mount a valid challenge is to have access to the entire process. Can I modify the software used to tabulate? Can I act as if I were working at the company providing the software/hardware and have access to the code, build process, and signing keys?

If that's possible, and you can still detect cheating, then that's great, but I also fear it's an arms race with no end, and it's just a matter of one-upping the other side.

[1] https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

[2] https://reproducible-builds.org/

[3] https://en.wikipedia.org/wiki/Defeat_device

Mar 07, 2017

Though, of course, this still means trusting the compiler: https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Feb 21, 2017

> Even for opensource projects there's no guarantee that the published version of the app matches anything in the commit history.

But at least for opensource, if you are willing you can build your own binary, using your own tools, from the source in the commit history, and get an app that matches the commit history [1].

[1] Exclusive of the issues detailed in "Reflections on Trusting Trust" by Ken Thompson regarding the actual build tools themselves (https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...)
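The check that building from source enables is a straight hash comparison against the published release. A sketch; the file name and contents are placeholders, and a real comparison assumes the project's build is reproducible:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large binaries are handled fine."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in for the binary you built yourself from the commit history.
with open("my-build.bin", "wb") as f:
    f.write(b"\x7fELF...app bytes...")

# Digest the project would publish alongside its release; here we just
# compute it from the same bytes to demonstrate the comparison.
published_digest = hashlib.sha256(b"\x7fELF...app bytes...").hexdigest()

if sha256_of("my-build.bin") == published_digest:
    print("published binary matches your own build")
```

Per the Thompson caveat in [1], this only pushes the trust down a level: the comparison is meaningful to the extent you trust the toolchain that produced your build.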

Feb 13, 2017

> It may be, but there's no guarantee it behaves the way it claims to. There's no guarantee it's not backdoored. There are powerful actors involved in these areas.

That renders your point about source code moot, though, doesn't it? Security is ultimately the art of trust propagation.

> Care to cite some of this literature?

The most famous discourse here is the "untrustworthy compiler problem." Most famous citation is by none other than Thompson: https://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thomp...

Trust chains are their weakest link, and people often put a lot of trust in compilers without really asking what it is doing. Not unlike crypto, we're told not to roll our own.

People have proposed ways around this, but they're not very good (http://imgur.com/a/BWbnU#0). The moral of the story is that at some point, you extend trust to someone. Security is never absolute.