item 44815922

tshannon | 6 months ago

So probably another stupid question, but how do you know what it's spitting out is accurate?


tkgally | 6 months ago

One has to be aware of the possibility of hallucinations, of course. But I have not encountered any in these sorts of interactions with the current leading models. Questions like "what does 'embedding space' mean in the abstract of this paper?" yield answers that, in my experience, make sense in context and check out when compared with other sources. I would be more cautious if I were using smaller models, or if I were asking about obscure information without supporting context.

Also, most of my questions are not about specific facts but about higher-level concepts. For ML-related topics, at least, the responses check out.