gnatolf | 22 days ago

That is a wobbly assertion. You would certainly need to run the same compiler, and forgo any recent optimisations, architecture updates and the like, if your code has numerically sensitive parts.

You certainly can get identical results, but it's equally certain that the path frequently won't be that simple.
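A minimal illustration of the numerical-sensitivity point: floating-point addition is not associative, so a compiler that reorders operations under aggressive optimisation flags can legitimately change results in the last bits. This sketch just demonstrates the reordering effect itself, not any particular compiler's behaviour:

```python
# Floating-point addition is not associative: summing the same three
# values in a different order (as an optimiser might) gives a
# different result in the last bit.
a, b, c = 0.1, 0.2, 0.3

left_to_right = (a + b) + c   # the order written in the source
reassociated = a + (b + c)    # an order an optimiser might choose

print(left_to_right == reassociated)  # prints False
```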


AlexeyBrin | 22 days ago

> You certainly can get identical results, but it's equally certain that the path frequently won't be that simple.

But at least I know that if I need to, I can do it. With an LLM, if you don't store the original weights, all bets are off. Reproducibility of results can be a hard requirement in certain cases or industries.
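One way to make "store the original weights" a hard guarantee is to pin a checksum of the weights file alongside the code. A minimal sketch, assuming the weights live in a single file on disk (the function name and chunk size are illustrative, not from any particular framework):

```python
import hashlib

def weights_sha256(path, chunk_size=1 << 20):
    """Hash a weights file incrementally so a pinned checksum can
    later verify that the exact same weights are being reused."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage: record weights_sha256("model.bin") at training time, then
# refuse to run inference if the stored digest no longer matches.
```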

layer8|22 days ago

The more important point is that even when you don’t get identical binary output, you still get identical observable behavior as specified by the programming language, unless there’s a compiler bug. That’s not the case for LLMs: they are more like an always, randomly buggy compiler. You wouldn’t want to use such a compiler.