top | item 44852115

arolihas | 6 months ago

It doesn't "know" anything. Everything that comes out is a hallucination contingent on the prompt.

brokencode | 6 months ago

You could say the same about humans. Have you ever misremembered something that you thought you knew?

Sure, we typically don’t invent totally made-up names, but we certainly do make mistakes. Our memory can be quite hazy and unreliable as well.

malloryerik | 6 months ago

Humans have a direct connection to our world through sensation and valence: pleasure, pain, then fear, hope, desire, up to love. Our consciousness is animal, and as much pre-linguistic as linguistic, if not more. This grounds our symbolic language and attaches it to real life. We can feel instantly whether we know or don't know. Yes, we make errors and hallucinate, but I'm not going to make up an API out of the blue; I'll know by feeling when what I'm doing is mistaken.

mcswell | 6 months ago

Humans do many things that are not remembering. Every time a high school geometry student comes up with a proof as a homework exercise, or every time a real mathematician comes up with a proof, that is not remembering; rather, it is thinking of something they never heard. (Well, except for Lobachevsky--at least according to Tom Lehrer.) The same when we make a plan for something we've never done before, whether it's a picnic at a new park or setting up the bedroom for a new baby. It's not remembering, even though it may involve remembering about places we've seen or picnics we've had before.

arolihas | 6 months ago

Do you genuinely believe that humans just hallucinate everything? When you or I say our favorite ice cream flavor is vanilla, is that just a hallucination? If ChatGPT were to say its favorite ice cream flavor is vanilla, would you give it equal weight? Come on.

yahoozoo | 6 months ago

Can we please stop with the “same for humans!”?

BlueTemplar | 6 months ago

(Unlike machines trying to replicate visual systems) LLMs don't hallucinate: they bullshit.