inor0gu | 1 year ago
This makes that entire goal moot: eliminating trust seems impossible; you're just shifting around the things you're willing to trust, or hiding them behind an abstraction.
I think what will become more important is having enough mechanisms to categorically prove whether an entity you trust to a certain extent is acting maliciously, and to hold them accountable. If economic incentives are not enough to trust a "big guy", what remains is giving all the "little guys" a loud enough loudspeaker to voice distrust.
A few examples:
- certificate transparency logs, so your traffic is not MitM'ed
- reproducible builds, so the binary you get matches the public open-source code you expect it to (regardless of that code's quality)
- key transparency, so when you chat with someone on WhatsApp/Signal/iMessage you actually get the public keys you expect, and not the NSA's
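The mechanism underlying both certificate transparency and key transparency is a Merkle tree: the log publishes a small root hash, and anyone can check that a given certificate or key is really in the log via an inclusion proof. A minimal sketch, assuming a balanced tree and RFC 6962-style hashing (the function and variable names here are illustrative, not any log's actual API):

```python
import hashlib

def leaf_hash(data: bytes) -> bytes:
    # RFC 6962 domain-separates leaves (0x00) from interior nodes (0x01)
    # so a leaf can never be confused with an internal node.
    return hashlib.sha256(b"\x00" + data).digest()

def node_hash(left: bytes, right: bytes) -> bytes:
    return hashlib.sha256(b"\x01" + left + right).digest()

def verify_inclusion(leaf: bytes, index: int,
                     proof: list[bytes], root: bytes) -> bool:
    """Recompute the root from a leaf and its audit path (the sibling
    hashes on the way up). If it matches the published root, the leaf
    is provably in the log."""
    h = leaf_hash(leaf)
    for sibling in proof:
        if index % 2 == 1:       # we are a right child
            h = node_hash(sibling, h)
        else:                    # we are a left child
            h = node_hash(h, sibling)
        index //= 2
    return h == root
```

The point is that the proof is logarithmic in the log's size: a client can audit one entry without downloading the whole log, and the log operator cannot show different keys to different people without forking the root hash, which auditors can detect by comparing roots. (Real logs also handle unbalanced trees and consistency proofs between roots, omitted here.)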
parhamn | 1 year ago
I agree. Perhaps that's why I find discussions about nonce lengths and randomness sources almost insane (in the sense of willfully missing the forest for the trees). Intelligence agencies have managed to penetrate the most secretive and powerful organizations known to man. Why would one think Signal's supply chain is impervious? I'd assume the opposite.
inor0gu | 1 year ago
> If you're building a chip to generate prime numbers, I surely hope you know how to select randomness or write constant-time & branch-free algorithms, just like an engineer designing elevators had better know the required tensile strength of the cable they'll use. In either case, it's mumbo jumbo to me, and I just need to get on with my day.
Part of what muddies the water is our collective inability to separate the two contexts, or to empower tech communicators to do so. If we keep treating new tech as esoteric magic, no one will board the elevator.
542354234235 | 1 year ago