top | item 40823783


extheat | 1 year ago

People simply have no idea what they're talking about. It's just jumping on the latest hype train. My first impression, going by the name, was that it was actually some sort of compiler in and of itself--i.e. programming language in, pure machine code or some other IR out. It has bits and pieces of that here and there, but that's not what it really is at all. It's more of a predictive engine for an optimizer, and not a very generalized one at that.

What would be more interesting is training a large model on pure (code, assembly) pairs, like a normal translation task. Presumably a very generalized model would be good at even doing the inverse: given some assembly, write code that will produce that assembly. Unlike human language, there is a finite set of possible correct answers here, and you have the convenience of being able to generate synthetic data for cheap. I think optimizations would arise as a natural side effect this way: if there are multiple branches of possible generations (like choosing between logits in an LLM), you could try different branches to see which is smaller in terms of byte code or faster in terms of execution.
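A minimal sketch of the branch-picking idea, using Python bytecode as a stand-in for machine code: the hand-written candidates below play the role of different decoding branches, and `compile` plus `co_code` length serves as the "smaller byte code" metric. The function names and candidates are illustrative, not from any real system.

```python
import dis  # only needed if you want to inspect the winning bytecode

# Hypothetical candidate "generations" for the same task (summing 1..n).
# In the scheme described above these would come from different decoding
# branches of a model; here they are hand-written stand-ins.
candidates = [
    "def f(n):\n"
    "    total = 0\n"
    "    for i in range(1, n + 1):\n"
    "        total += i\n"
    "    return total",
    "def f(n):\n"
    "    return sum(range(1, n + 1))",
    "def f(n):\n"
    "    return n * (n + 1) // 2",
]

def bytecode_size(src):
    # Compile the module, find the function's code object, and measure
    # the raw length of its bytecode.
    module_code = compile(src, "<candidate>", "exec")
    fn_code = next(c for c in module_code.co_consts if hasattr(c, "co_code"))
    return len(fn_code.co_code)

# Rank the branches by compiled size and keep the smallest one.
best = min(candidates, key=bytecode_size)
print(best)
```

In a real setup the scoring function would be compiled-binary size or measured runtime rather than Python bytecode length, and the candidates would be sampled from the model, but the selection loop is the same: generate several branches, compile each, keep the cheapest.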



hughleat | 1 year ago

It can emulate the compiler (IR + passes -> IR or ASM).

> What would be more interesting is training a large model on pure (code, assembly) pairs like a normal translation task.

It is that.

> Presumably a very generalized model would be good at even doing the inverse: given some assembly, write code that will produce the given assembly.

It has been trained to disassemble. It is much, much better than other models at that.

quonn | 1 year ago

> Presumably a very generalized model would be good at even doing the inverse: given some assembly, write code that will produce the given assembly.

ChatGPT does this, unreliably.