top | item 47156898

josefritzishere | 5 days ago

LLMs don't really do random.


rishabhaiover | 5 days ago

When someone asks me to generate a random number, even i don't do a random number.

bogzz | 5 days ago

I used to always reflexively blurt out 67 when asked for a random number.

I'm a proto gen alpha. I 6-7'd before it was cool.

minimaxir | 5 days ago

There's some statistical nuance here. An LLM outputs a predicted probability for each candidate next token, but no modern LLM picks the next token by simply taking the highest-probability one (temperature = 0.0); instead it samples from that probability distribution (temperature = 1.0). Therefore, the output will never be truly deterministic unless the model somehow always assigns probability 1.0 to a given token in the sequence.

With advances in LLM post-training, models have gotten better at assigning a high probability to one specific token, which makes the output less random, but it's still random.
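The distinction the comment draws can be sketched in a few lines. This is a toy illustration, not any particular model's sampler: the logit values are made up, and real inference stacks add refinements like top-k/top-p filtering on top of this.

```python
import math
import random

def sample_next_token(logits, temperature):
    """Pick a next-token index from raw logits.

    temperature == 0.0 -> greedy argmax (deterministic).
    temperature > 0.0  -> sample from the softmax distribution,
                          so the output is genuinely stochastic.
    """
    if temperature == 0.0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    weights = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=weights)[0]

# Hypothetical logits over ten candidate tokens; index 6 is favored,
# mimicking a post-trained model that concentrates probability mass.
logits = [1.0] * 10
logits[6] = 4.0

greedy = sample_next_token(logits, 0.0)   # always index 6
sampled = sample_next_token(logits, 1.0)  # usually 6, sometimes not
```

At temperature 0.0 the favored token is returned every time; at temperature 1.0 it merely dominates the draw, which is why outputs stay "less random, but still random."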