a-dub | 8 days ago

> To me that implies the input isn't deterministic, not the compiler itself

or the system upon which the compiler is built (as well as the compiler itself) has made some practical trade-offs.

the source file contents are usually deterministic. the order in which they're read and combined, and build-time metadata injections, often are not (and can be quite difficult to make so).
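A toy sketch of the metadata-injection point (hypothetical names; a stand-in "build" in Python, not a real toolchain): if the build clock is allowed to leak into the artifact, identical sources produce different bytes, which is why the reproducible-builds `SOURCE_DATE_EPOCH` convention pins it.

```python
import hashlib
import io
import os
import time
import zipfile

def build_artifact(payload: bytes) -> bytes:
    """Toy 'build': zip the payload, stamping the entry with a build time.
    Honors the reproducible-builds SOURCE_DATE_EPOCH convention if set."""
    epoch = int(os.environ.get("SOURCE_DATE_EPOCH", time.time()))
    stamp = time.gmtime(epoch)[:6]  # (year, month, day, hour, minute, second)
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        info = zipfile.ZipInfo("payload.txt", date_time=stamp)
        zf.writestr(info, payload)
    return buf.getvalue()

# pin the injected timestamp so two builds of the same source match byte-for-byte
os.environ["SOURCE_DATE_EPOCH"] = "1700000000"
a = hashlib.sha256(build_artifact(b"same source")).hexdigest()
b = hashlib.sha256(build_artifact(b"same source")).hexdigest()
assert a == b
```

Without the pinned epoch, each build would stamp the current wall-clock time and the hashes would (usually) diverge.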

overgard | 8 days ago

I mean, if you turn off incremental compilation and build in a container (or some other "clean room" environment), it should turn out the same each time. Local builds are very non-deterministic, but CI/CD shouldn't be.

Either way it's a nitpick though, a compiler hypothetically can be deterministic, an LLM just isn't? I don't think that's even a criticism of LLMs, it's just that comparing the output of a compiler to the output of an LLM is a bad analogy.

a-dub | 7 days ago

> I mean, if you turn off incremental compilation and build in a container (or some other "clean room" environment), it should turn out the same each time. Local builds are very non-deterministic, but CI/CD shouldn't be.

lol, should. i believe you have to control the clock as well, and even then non-determinism can still be introduced by scheduler noise. maybe it's better now, but it used to be very painful.
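A minimal sketch of the scheduler-noise problem (hypothetical names, threads standing in for parallel compile jobs): results arrive in whatever order the scheduler finishes them, so a deterministic build must canonicalize before combining.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor, as_completed

def compile_unit(name: str) -> bytes:
    # stand-in for invoking a real compiler on one translation unit
    return f"obj({name})".encode()

units = ["c.c", "a.c", "b.c"]
with ThreadPoolExecutor(max_workers=3) as ex:
    futures = {ex.submit(compile_unit, u): u for u in units}
    results = {}
    for fut in as_completed(futures):  # completion order varies with scheduling
        results[futures[fut]] = fut.result()

# canonicalize (here: sort by unit name) before "linking",
# so thread timing can't leak into the final artifact
binary = b"".join(results[u] for u in sorted(results))
digest = hashlib.sha256(binary).hexdigest()
```

Skipping the sort and joining in completion order is exactly the kind of entropy that's hard to squeeze out.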

> Either way it's a nitpick though, a compiler hypothetically can be deterministic, an LLM just isn't? I don't think that's even a criticism of LLMs, it's just that comparing the output of a compiler to the output of an LLM is a bad analogy.

llm inference is literally sampling from a distribution. the core distinction is real though: llms are stochastic general computation, whereas traditional programming is deterministic in spirit. llm inference can hypothetically be made deterministic too if you fix the sampling seed, although, as with non-trivial software builds on modern operating systems, squeezing out all the entropy is hard in practice. (some research labs are focused on exactly that: deterministic llm inference.)
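The fixed-seed point can be shown with a toy numpy sketch (not a real inference engine; real stacks also need deterministic kernels and batching): sampling from a softmax distribution is repeatable once the RNG is seeded.

```python
import numpy as np

def sample_tokens(logits, seed, n=5):
    # seeded RNG: the same seed yields the same draws from the distribution
    rng = np.random.default_rng(seed)
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    return [int(rng.choice(len(probs), p=probs)) for _ in range(n)]

logits = np.array([2.0, 1.0, 0.5, 0.1])  # stand-in for a model's next-token logits
run1 = sample_tokens(logits, seed=42)
run2 = sample_tokens(logits, seed=42)
assert run1 == run2  # fixing the seed makes the stochastic step repeatable
```

The sampling itself is the only source of randomness here; in a real engine, non-deterministic floating-point reduction order in the forward pass is the harder part to pin down.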