top | item 35035162

jpwagner | 3 years ago

At some point the bibliography needs to reference the LLM itself, which would need to be hosted indefinitely.

WolfOliver | 3 years ago

I do not think it is feasible to make sure prompts are reproducible. Considering how large LLMs are, you cannot host every version of the model indefinitely.

m3galinux | 3 years ago

ChatGPT, when asked this question, answers that its responses are probabilistic, so they aren't reproducible. I tested that myself, of course. Since it gave me two different (but broadly equivalent) answers to the same prompt, I'd have to agree.
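The non-reproducibility described above comes from sampling: an LLM picks each next token from a probability distribution, so two runs on the same prompt can diverge unless the random state is pinned. A minimal toy sketch (not ChatGPT's actual decoder; the vocabulary, weights, and `sample_response` helper are all hypothetical):

```python
import random

# Toy vocabulary and hypothetical next-token probabilities,
# standing in for a real model's output distribution.
VOCAB = ["yes", "no", "maybe"]
WEIGHTS = [0.5, 0.3, 0.2]

def sample_response(prompt, seed=None, length=5):
    # The prompt is ignored in this toy; a real model conditions on it.
    # seed=None gives a fresh random state, so repeated calls can differ.
    rng = random.Random(seed)
    return [rng.choices(VOCAB, weights=WEIGHTS)[0] for _ in range(length)]

# With a fixed seed, the "response" is reproducible across runs:
assert sample_response("same prompt", seed=42) == sample_response("same prompt", seed=42)
```

This is also why two answers to the same prompt tend to be different but "overall equivalent": each run draws from the same underlying distribution.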

jpwagner | 3 years ago

if it's not reproducible, it's not science