LockAndLol | 5 years ago
What about websites though? Hash-summed files aren't going to save us, because resources can be loaded dynamically and the client can't know the hash before retrieval.
Reproducible builds would be a great first step. Requiring governments to use open-source software might be another.
dane-pgp | 5 years ago
It is possible for a web page to specify the expected cryptographic hash of a script or stylesheet, which the browser will verify before using the resource. This mechanism is called Subresource Integrity (SRI).[0]
Of course that still leaves the bootstrapping problem of how the page itself can be guaranteed to have a specific hash, but fortunately there is a clever hack that can be done with bookmarklets[1], or the page can just be saved and loaded/served locally.
While that works technically, the UX isn't great because the address bar won't show the domain of the remote server (although browsers seem to be hiding the address bar from the user more and more). A better solution would be for browsers to support Hashlinks[2], which would allow a bookmark to point to a remote page with fixed contents.
[0] https://developer.mozilla.org/en-US/docs/Web/Security/Subres...
[1] https://news.ycombinator.com/item?id=17776456
[2] https://github.com/w3c-ccg/hashlink