item 8759434


mrpdaemon | 11 years ago

The chain of trust doesn't quite stop at compiling the source: to be really sure that nothing unintended is going on, you also have to compile the compiler yourself. At the end of the day you still have to trust some bootstrap compiler binary, unless you put it together yourself in machine language.



M2Ys4U|11 years ago

Actually, you don't.

You can use two independent compilers to compile each other and compare the results, which lets you detect whether the compilation has been tampered with.

See https://www.schneier.com/blog/archives/2006/01/countering_tr...
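The linked post describes Wheeler's "diverse double-compiling": build the compiler-under-test with two independent compilers, have each stage-1 result rebuild the same source, and compare. Here is a toy Python model of that cross-check; the compiler names and string "binaries" are purely illustrative, not a real toolchain.

```python
# Toy model of diverse double-compiling (DDC). A "compiler" is a
# function from source text to a string "binary"; everything here is
# an illustrative assumption, not real compiler behavior.

COMPILER_SRC = "source of compiler T"

def compile_clean(src):
    """A trustworthy compiler: output depends only on the source."""
    return f"binary({src})"

def compile_backdoored(src):
    """A tampered compiler: injects a payload when building a compiler."""
    out = f"binary({src})"
    if "compiler" in src:
        out += "+backdoor"
    return out

def run(binary):
    """'Execute' a compiled compiler binary. A backdoored binary keeps
    propagating the backdoor (the trusting-trust attack); a clean one
    compiles faithfully."""
    return compile_backdoored if binary.endswith("+backdoor") else compile_clean

def ddc(compiler_a, compiler_b, src):
    """Build src with both compilers, let each stage-1 binary rebuild
    src, and compare the stage-2 outputs bit for bit."""
    stage2_a = run(compiler_a(src))(src)
    stage2_b = run(compiler_b(src))(src)
    return stage2_a == stage2_b

# Two honest compilers agree; an honest one exposes a tampered one.
print(ddc(compile_clean, compile_clean, COMPILER_SRC))       # True
print(ddc(compile_clean, compile_backdoored, COMPILER_SRC))  # False
```

The real technique additionally requires deterministic compilation, and compares the stage-2 build against the compiler's own self-regenerated binary; the toy only captures the cross-check idea.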

kijin|11 years ago

What if both compilers are backdoored?

It's not like you have a large choice of good compilers for any given language/platform pair.

avz|11 years ago

Good point. Given sufficient paranoia this train of suspicion can be continued even deeper down the rabbit hole: you'd need to inspect the hardware designs and make sure the hardware you've got was actually produced according to the inspected designs.

In technology as elsewhere, it seems life is ultimately based on trust in someone.

john61|11 years ago

> In technology as elsewhere, it seems life is ultimately based on trust in someone.

Trust is a function of control. With free software, trust and control are distributed. With proprietary software, trust and control are centralized.

Real life has shown that centralized control is a bad idea; that is why we invented democracy and free software.

oneeyedpigeon|11 years ago

Contrariwise, the nefarious app has to trust that I'm not feeding it a whole bunch of purposely misleading data.