LLM hallucinations actually have a positive side effect too, if you are using them to learn a subject: they make you verify the claims, and finding errors in them is very rewarding.
When trying to learn a subject, I find it helpful to be able to ask my specific questions and get specific answers back. Books tend to be laborious and, frankly, full of filler; they are often poorly indexed, and when my question isn't covered in the book I'm left with no recourse other than googling through SEO wastelands or on-topic forum questions with off-topic replies. At least an LLM always has an answer with enough truth in it to give me a direction, and when I've gone into an area with genuinely no known answer, or the thing doesn't exist, the answer is easily verified as wrong - but that process, as was pointed out above, teaches me a lot too. I actually prefer the mistakes it makes because they force me to really learn, even to the point of giving me things to look up in the index of a book.
Treating LLMs as a single source of truth and a monolithic resource is as bad an idea as excluding them as a tool in learning.