top | item 43045289


white-flame | 1 year ago

The real problem is deeper than this. The actual question to ask is:

"What if we just stopped distributing and blindly executing untrusted binary blobs?"

A trusted compiler in the OS, plus some set of intermediate representations for code distribution, would solve a massive number of security issues, increase compatibility, and allow for future performance improvements and for disallowing suspect code patterns (Spectre, Rowhammer, etc.). Specializing programs at install time for the local hardware makes far more sense than being locked into hardware machine-code compatibility.
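The model described above can be sketched as a toy in Python. Everything here is invented for illustration (the mini-IR, the op names, the "hardware" flag): programs ship as IR, and a trusted pass in the OS validates it and lowers it for the local machine at install time.

```python
# Toy sketch (not a real toolchain) of the proposed model: programs ship as
# a small intermediate representation, and a trusted "compiler" in the OS
# validates and specializes it for the local machine at install time.

ALLOWED_OPS = {"load", "add", "mul", "store"}

def verify(ir):
    """Reject IR containing ops the OS refuses to lower -- the analogue of
    disallowing suspect code patterns."""
    for op, *_ in ir:
        if op not in ALLOWED_OPS:
            raise ValueError(f"disallowed op: {op}")

def specialize(ir, has_fast_mul):
    """'Install-time' lowering: pick a code path based on local hardware."""
    out = []
    for op, *args in ir:
        if op == "mul" and not has_fast_mul:
            # Lower mul differently on hardware without a fast multiplier.
            out.append(("mul_via_adds", *args))
        else:
            out.append((op, *args))
    return out

# "Distributed" program: r0 = 6 * 7
program = [("load", "r0", 6), ("load", "r1", 7), ("mul", "r0", "r1")]

verify(program)                  # trusted validation pass
native = specialize(program, has_fast_mul=False)
print(native[2][0])              # -> mul_via_adds
```

The same IR installs differently on different machines, which is the point: the distributed artifact is not tied to one instruction set.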

tsimionescu | 1 year ago

That does nothing to fix the vast majority of security issues, which are caused by trusted but not memory-safe programs running on untrusted input.

It's also an extremely unrealistic goal. First of all, you run into a massive problem with companies and copyright. Second of all, it will be very hard to convince users that it's normal for their Chrome installation to take half an hour or more while using their CPU at 100% the whole time.
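The first point above, that a trusted, locally compiled program is still exploitable if it isn't memory safe, can be modeled as a toy in Python. The flat-memory simulation and all names here are invented for illustration:

```python
# Toy model of the memory-safety point: simulate C-style flat memory where
# a trusted routine copies untrusted input with no bounds check. How the
# binary was produced (IR, install-time compile) doesn't matter here.

memory = bytearray(16)
BUF_START, BUF_LEN = 0, 8      # 8-byte input buffer...
RETURN_SLOT = 8                # ...sitting right next to "control data"
memory[RETURN_SLOT] = 0x01     # pretend 0x01 is the legitimate return target

def trusted_copy(untrusted: bytes):
    """No bounds check -- the bug class IR distribution doesn't fix."""
    for i, b in enumerate(untrusted):
        memory[BUF_START + i] = b

trusted_copy(b"A" * 9)           # one byte too many
print(hex(memory[RETURN_SLOT]))  # -> 0x41: control data clobbered
```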

bawolff | 1 year ago

I feel like you might as well ask "why not world peace"?

There are a huge number of practical issues to solve to make that viable.

grayhatter | 1 year ago

Could you list a few of the problems you predict?

Taikonerd | 1 year ago

The VST Lab at Princeton works on this sort of problem: https://vst.cs.princeton.edu/

"The Verified Software Toolchain project assures with machine-checked proofs that the assertions claimed at the top of the toolchain really hold in the machine-language program, running in the operating-system context."

Some of the same researchers worked on TAL (typed assembly language), which sounds like it could be one of the "intermediate representations" you mentioned.
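The TAL idea can be illustrated with a toy checker in Python. The mini-assembly here is invented: each instruction carries enough type information that ill-typed code, such as jumping through a register that holds a raw integer, is rejected before it ever runs.

```python
# Toy illustration of typed assembly language: a checker tracks a type per
# register and rejects ill-typed instruction sequences. The mini-language
# (movi/movl/add/jmp) is invented for this sketch.

def typecheck(instrs):
    regtypes = {}                       # register -> "int" | "codeptr"
    for instr in instrs:
        op, *args = instr
        if op == "movi":                # movi r, n : r now holds an int
            regtypes[args[0]] = "int"
        elif op == "movl":              # movl r, label : r holds a code pointer
            regtypes[args[0]] = "codeptr"
        elif op == "add":               # add r1, r2 : both must be ints
            if regtypes.get(args[0]) != "int" or regtypes.get(args[1]) != "int":
                raise TypeError(f"add on non-int register in {instr}")
        elif op == "jmp":               # jmp r : r must hold a code pointer
            if regtypes.get(args[0]) != "codeptr":
                raise TypeError(f"jmp through non-codeptr register in {instr}")
        else:
            raise TypeError(f"unknown op {op}")

ok = [("movl", "r0", "exit"), ("jmp", "r0")]
bad = [("movi", "r0", 4096), ("jmp", "r0")]   # jumping to a raw integer

typecheck(ok)          # passes silently
try:
    typecheck(bad)     # rejected before the code would ever run
except TypeError as e:
    print(e)
```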

transpute | 1 year ago

For a while, Apple required apps to be submitted as bitcode (LLVM IR) to the App Store, where they would be converted to x86 or Arm machine code during device install. They stopped that a couple of years ago, after the migration to Apple Silicon.

refulgentis | 1 year ago

Apple used to require bitcode (LLVM IR) for App Store submissions.

The rest is interesting, but nothing was done at install time; it wasn't converted to machine code or anything like that.

In fact, in practice, it never ended up being used.

Well, that's not particularly relevant: the idea was never to do anything on device anyway.

Really excellent post here summarizing it, which I can vouch for: https://stackoverflow.com/questions/72543728/xcode-14-deprec...

arbitrandomuser | 1 year ago

How did that work, though? Isn't bitcode tied to a target triplet?

saagarjha | 1 year ago

Bitcode was never required for app submissions on macOS. It also was never portable across architectures.

watt | 1 year ago

There is a rule that if somebody poses this hypothetical with the word "just" in it, they have signed themselves up to go and implement it.

So, congratulations, take it and run with it.