XenophileJKO | 19 days ago

That is because nothing in the world is deterministic; everything is just varying degrees of probability.


pyrale|19 days ago

This rings hollow to me.

When my code compiles in the evening, it also compiles the next morning. When my code stops compiling, I can usually trace the issue to a change in my build.

Sure, my laptop may die while I'm working, so the second compilation may never finish, but that's not really comparable to an LLM giving me three different answers when given the same prompt three times. Saying that nothing is deterministic buries the distinction between these two behaviours.

Deterministic tools are something the developer community has worked very hard for in the past, and it's sad to see a new tool give none of that.
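The variation pyrale describes usually comes from sampling, not from the model itself. A minimal sketch, using a made-up three-word vocabulary and made-up probabilities (nothing here models a real LLM): greedy decoding always picks the highest-probability token and so repeats itself, while sampling only repeats if you seed the RNG.

```python
import random

# Toy next-token chooser. The vocabulary and probabilities are
# invented purely for illustration.
vocab = ["yes", "no", "maybe"]
probs = [0.5, 0.3, 0.2]

def greedy():
    # Temperature-0 style decoding: always take the argmax.
    return vocab[probs.index(max(probs))]

def sample(seed=None):
    # Temperature > 0 style decoding: draw from the distribution.
    rng = random.Random(seed)
    return rng.choices(vocab, weights=probs, k=1)[0]

print(greedy() == greedy())      # True: same input, same output
print(sample(42) == sample(42))  # True: seeded runs reproduce
# Unseeded calls to sample() may return different tokens each time,
# which is the "three different answers" effect.
```

The point is that the nondeterminism lives in the decoding policy, which is a configuration choice, not an inherent property of the weights.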

latexr|19 days ago

That is called a deepity: a statement which sounds profound but is ultimately trivial and meaningless.

https://rationalwiki.org/wiki/Deepity

Determinism concerns itself with the predictability of the future from past and present states. If nothing were deterministic, you wouldn't be able to set your clock or plan when to sow and when to harvest. You wouldn't be able to drive a car or rest a glass on a table. You wouldn't be able to type the exact same code today and tomorrow and trust it to compile identically. The only reason you can debug code is determinism: you can make a prediction of what should happen and, by inspecting what did happen, deduce what went wrong several steps before.

XenophileJKO|18 days ago

Can you predict when solar radiation will hit a memory cell, or when a given server node will die? Not really, but you can model the probability of it happening. My point was that all the systems we work with have failure modes and produce non-deterministic output at some rate. That rate might be really small, but at what point does the rate of non-deterministic behavior make something "non-deterministic"? A language model can be deterministic in that you can get the same output from the same input if you so desire (again, barring systemic failures and mitigating out-of-order floating-point operations).
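The floating-point caveat is concrete: addition on floats is not associative, so summing the same numbers in a different order (as a parallel GPU reduction may do from run to run) can round to a different result. A minimal demonstration in plain Python:

```python
# Floating-point addition is not associative: grouping the same
# three numbers differently changes the rounded result.
xs = [0.1, 0.2, 0.3]

left_to_right = (xs[0] + xs[1]) + xs[2]
right_to_left = xs[0] + (xs[1] + xs[2])

print(left_to_right)                    # 0.6000000000000001
print(right_to_left)                    # 0.6
print(left_to_right == right_to_left)   # False
```

This is why "same input, same output" for a model served on parallel hardware requires pinning the reduction order, not just fixing a seed.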

I think it is philosophically interesting, because no real system is fully predictable; we just choose some arbitrary threshold of accuracy beyond which we define it as such.