item 20257534


FungalRaincloud | 6 years ago

Trust, to me, is not the problem. You can build trust. Known-good certificates can be distributed physically, and require signed messages for replacement. Or we can develop schemes for distributing them digitally via validated channels. For example, each worker at a company has a particular known-good digital presence, verified by their own public key, and distribution happens with them as the source, essentially creating an expanding ring of trust around the key being distributed. Violating such a ring of trust is not going to be easy, if it is well enough built.
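The expanding ring of trust described above can be sketched roughly like this. This is a toy model, not a real PKI: an HMAC tag with a per-person secret stands in for a real private-key signature, and all names and secrets are hypothetical.

```python
# Toy sketch of an expanding ring of trust. HMAC with a per-person
# secret is a stand-in for real asymmetric signatures; in practice
# you would use something like Ed25519.
import hashlib
import hmac


def sign(secret: bytes, message: bytes) -> bytes:
    """Stand-in for signing with a private key."""
    return hmac.new(secret, message, hashlib.sha256).digest()


def verify(secret: bytes, message: bytes, tag: bytes) -> bool:
    """Stand-in for verifying against a public key."""
    return hmac.compare_digest(sign(secret, message), tag)


class TrustRing:
    """Keys become trusted only when vouched for by an already-trusted key."""

    def __init__(self, root_name: str, root_secret: bytes):
        # The root key is the known-good one, distributed physically.
        self.secrets = {root_name: root_secret}

    def introduce(self, voucher: str, voucher_sig: bytes,
                  newcomer: str, newcomer_secret: bytes) -> bool:
        """Accept a new key only if a trusted member signed the introduction."""
        if voucher not in self.secrets:
            return False
        message = f"{voucher} vouches for {newcomer}".encode()
        if not verify(self.secrets[voucher], message, voucher_sig):
            return False
        self.secrets[newcomer] = newcomer_secret
        return True


ring = TrustRing("alice", b"alice-secret")
good_sig = sign(b"alice-secret", b"alice vouches for bob")
assert ring.introduce("alice", good_sig, "bob", b"bob-secret")

# An outsider cannot vouch their way in:
bad_sig = sign(b"mallory-secret", b"mallory vouches for eve")
assert not ring.introduce("mallory", bad_sig, "eve", b"eve-secret")
```

The point the comment makes falls out of the structure: every accepted key traces back through signed introductions to the physically distributed root, so an attacker has to subvert a trusted member, not just inject a key.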

There are two issues I do see, though, and they're kind of the same issue. Right now, we have this concept of a central store of public certificates. It makes it easy for you to get a certificate for a particular entity, but it also makes the central store a target. If you can compromise a central store (or a machine attempting to access it), you probably have the resources to at least redirect the user to your own site and leave them none the wiser, and you probably have the resources to man-in-the-middle their connection entirely and snoop your heart out. So central stores of trust are a bit of an issue, and the ways around that are non-trivial to set up. A good example is probably Keybase, which lets you certify your various online presences with your private key. So if someone wants to replace your information on Keybase with their own, and they have the resources to do so, they also have to compromise all the places you've distributed that key to. Or they have to compromise one of those centralized stores of trust...
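The Keybase-style defense sketched above can be illustrated in a few lines. Again a toy model under the same simplified assumptions: an HMAC tag stands in for a real signature, and the identity statement and proof locations are hypothetical.

```python
# Toy sketch of multi-location identity proofs: the same signed
# statement is posted in several independent places, so replacing
# the key means compromising every one of them.
import hashlib
import hmac


def sign(secret: bytes, message: bytes) -> bytes:
    """Stand-in for signing with a private key."""
    return hmac.new(secret, message, hashlib.sha256).digest()


def proofs_consistent(secret: bytes, statement: bytes,
                      posted_proofs: dict) -> bool:
    """True only if every location carries a valid signature over the
    same statement by the same key."""
    expected = sign(secret, statement)
    return all(hmac.compare_digest(expected, tag)
               for tag in posted_proofs.values())


statement = b"I am alice and this is my key"
key = b"alice-secret"
proofs = {loc: sign(key, statement)
          for loc in ("twitter", "github", "personal-site")}

assert proofs_consistent(key, statement, proofs)

# An attacker who swaps the key at only one location is detected,
# because the remaining proofs no longer match:
proofs["github"] = sign(b"mallory-secret", statement)
assert not proofs_consistent(key, statement, proofs)
```

This is the comment's point in miniature: the central store stops being a single point of failure once verification cross-checks independent locations.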

The big issue with centralized stores of trust is that they build blind trust. That's the big issue with humans in general, though. We don't want to question what we're watching. And we probably don't want to be bothered with validating that the "trusted source" of the certificate used to sign this content is actually _trusted_. It's just too much mental overhead. We want it to be automatic. We want central stores of trust, because it's just _easier_. The work is going to be convincing people that _easier_ is dangerous, in this case. Or it's going to be convincing software companies to build in inconvenient technology and not make it trivial to turn off.


intended | 6 years ago

“Easier” is the whole point of the society you live in.

To be fair, the point of society is trust. It’s a way to trust information and ensure the species is safe.

The whole point of using markets and capitalism is that they generate more trustworthy results than top-down driven systems.

Until this mess, which makes it seem like a central authority will be better than a system that leaves nodes on the tree open for manipulation.

Essentially, we had a distributed decision-making society. Now we've found a hack that breaks that society's structure. The cost for such a society to manage verification is absurdly high: every person has to spend non-trivial effort verifying that they are not being manipulated.

In contrast, central decision-making societies like China will simply avoid that cost and be more competitive, beating out Western democratic systems.