top | item 44858363


PopePompus | 6 months ago

By default ChatGPT et al. will not produce the same output if identical input is fed to them multiple times. They use stochastic sampling, which means that the most probable next token is not always selected; instead, a token is drawn at random, weighted by probability. How strongly the sampling favors the most probable tokens is controlled by the temperature parameter. If the temperature is set to 0, the model will reproducibly produce the same output (assuming it's always running on the same hardware and the model weights don't get tweaked by an update). But the chatbot front ends do not have the temperature set to 0.
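To make that concrete, here is a minimal sketch of temperature sampling in Python. The logit values are made up for illustration (real models sample over tens of thousands of tokens), and treating temperature 0 as greedy argmax is the usual convention, not necessarily how any particular front end implements it:

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=random):
    """Pick a token index from raw logits.

    temperature == 0 is treated as greedy decoding (argmax), which is
    deterministic; temperature > 0 rescales logits before the softmax,
    so repeated calls can return different tokens.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [x / temperature for x in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # draw one index, weighted by the softmax probabilities
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

# Hypothetical logits for three candidate tokens
logits = [2.0, 1.5, 0.1]
greedy = [sample_token(logits, temperature=0) for _ in range(5)]    # always the same index
stochastic = [sample_token(logits, temperature=1.0) for _ in range(5)]  # can vary run to run
```

Low temperatures sharpen the distribution toward the top token; high temperatures flatten it, so less probable tokens are chosen more often.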
