item 43674952

ANewFormation | 10 months ago
LLMs are 100% deterministic. The facade of randomness is injected solely by a superfluous RNG factor. Discuss.

mdp2021 | 10 months ago
Even if there were no "temperature" tilting in the forward pass, the NN training would still be performed through different processes on different implementations, making the outputs "personal".
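The mechanism both posts refer to can be shown in a minimal sketch of temperature sampling (the function names and logits below are illustrative, not from any real model): at temperature 0 the next token is a deterministic argmax over the logits, and any run-to-run variation enters only through the RNG draw used for weighted sampling.

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Scale logits by 1/temperature, then normalize to a distribution."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def pick_token(logits, temperature, rng=None):
    """Greedy (deterministic) at temperature 0; otherwise sample via the RNG."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    probs = softmax(logits, temperature)
    return (rng or random).choices(range(len(logits)), weights=probs)[0]

logits = [2.0, 1.0, 0.5]  # hypothetical next-token logits
# Greedy decoding: same answer on every call, no RNG involved.
print(pick_token(logits, 0))
# Sampled decoding: variation comes only from the RNG draw,
# so fixing the seed makes even "random" output reproducible.
print(pick_token(logits, 0.8, random.Random(42)))
```

Note this only covers the forward pass; mdp2021's point is separate: even with the RNG removed, two independently trained or differently implemented models would still diverge because of training data, initialization, and floating-point differences across implementations.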