top | item 46921294

hackinthebochs | 24 days ago

Yes, there are some unknown sources of non-determinism when running production LLM architectures at full capacity. But that's completely irrelevant to the point. The core algorithm is deterministic. And you're still conflating deterministic and predictable. It's strange to have such disregard for the meaning of words and their correct usage.

rvz|24 days ago

> Yes, there are some unknown sources of non-determinism when running production LLM architectures at full capacity. But that's completely irrelevant to the point.

It is directly relevant, and it supports my whole point, which debunked your assertion that LLMs are ‘deterministic’. That determinism doesn’t exist in any fundamental sense: you can’t guarantee that the behaviour, or even the outputs, will be the same.

> The core algorithm is deterministic. And you're still conflating deterministic and predictable.

The entire LLM is still non-deterministic, and it is still considered unpredictable even if you take that into account.

> It's strange to have such disregard for the meaning of words and their correct usage.

Nope. Not only have you shown absolutely zero sources to prove the deterministic nature of LLMs to the point where they can function as a “compiler”, you ultimately conceded by agreeing with the linked paper(s), which recognise that LLMs still do not have deterministic or predictable properties at all, even if you tweak the temperature, parameters, etc.

Therefore, once again, LLMs are NOT compilers, as even feeding them adversarial inputs can mess up the entire network to the point of becoming useless.

hackinthebochs|24 days ago

>Not only have you shown absolutely zero sources to prove the deterministic nature of LLMs to the point where they can function as a “compiler”

Note that I never defended using LLMs as a compiler. In fact I argued it would be inappropriate. I simply disagreed that the reason is because they are non-deterministic. If you weren't conflating the meaning of deterministic and predictable, you wouldn't keep misreading me.

WithinReason|24 days ago

from your first source:

> As a consequence, the model is no longer deterministic at the sequence-level, but only at the batch-level

Therefore they are deterministic when the batch size is 1.
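That batch-level caveat comes down to floating-point arithmetic: addition is not associative, so a different reduction order (as happens when the same request lands in a differently composed batch) can change low-order bits. A minimal pure-Python illustration of that mechanism (an assumption about the cause, not code from any real LLM stack):

```python
# Floating-point addition is not associative: summing the same
# numbers in a different order can produce a different result.
# This is why batching/kernel scheduling can break sequence-level
# determinism even though the algorithm itself is deterministic.
vals = [1e16, 1.0, -1e16, 1.0]

# Left-to-right, as one reduction order might do:
left_to_right = ((vals[0] + vals[1]) + vals[2]) + vals[3]

# Pairwise, as a different schedule might do:
reordered = (vals[0] + vals[2]) + (vals[1] + vals[3])

print(left_to_right, reordered)  # 1.0 2.0 — same inputs, different sums
```

The small term 1.0 is lost when added to 1e16 (it falls below the rounding granularity there), but survives when the large terms cancel first, so the two orderings genuinely disagree.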

Your second source lists a large number of ways to make LLMs deterministic. The title of your third source is "Defeating Nondeterminism in LLM Inference", which also means that they can be made deterministic.
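For what it's worth, the usual recipes those sources describe are greedy (argmax) decoding, or sampling with a fixed seed. A toy sketch in pure Python (the "model" here is a hypothetical stand-in, not a real LLM):

```python
# Toy sketch: greedy (argmax) decoding is fully deterministic given
# identical logits, and seeded sampling is reproducible run to run.
import math
import random

def toy_logits(token_id):
    # Deterministic stand-in for a model's next-token logits.
    return [math.sin(token_id * (i + 1)) for i in range(5)]

def greedy_next(logits):
    # Argmax decoding: no randomness at all.
    return max(range(len(logits)), key=lambda i: logits[i])

def sampled_next(logits, seed):
    # Softmax sampling with a fixed seed: reproducible.
    rng = random.Random(seed)
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return rng.choices(range(len(exps)), weights=[e / total for e in exps])[0]

def decode(start, steps, seed=None):
    out, tok = [], start
    for t in range(steps):
        logits = toy_logits(tok)
        tok = greedy_next(logits) if seed is None else sampled_next(logits, seed + t)
        out.append(tok)
    return out

# Two runs with identical settings produce identical token sequences.
assert decode(1, 10) == decode(1, 10)
assert decode(1, 10, seed=42) == decode(1, 10, seed=42)
```

In a real serving stack you would additionally need batch-invariant kernels for this to hold end to end, which is exactly what those sources discuss.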

Every single one of your sources proves you wrong, so no more sources need to be cited.