barnacs | 5 months ago
Even if it can extrapolate to some degree (although that's where "hallucinations" tend to become obvious), it could never, for example, invent a game like chess or a social construct like a legal system. Those require motivations like "boredom", "being social", or having a "need for safety".
chpatrick | 5 months ago
> it could never, for example, invent a game like chess or a social construct like a legal system. Those require motivations like "boredom", "being social", having a "need for safety".
That's creativity, which is a different question from thinking.
bluefirebrand | 5 months ago
Humans invent new data; humans observe things and create new data. That's where all the material the LLMs are trained on came from.
> That's creativity which is a different question from thinking
It's not really, though. The process is the same, or similar enough, don't you think?
barnacs | 5 months ago
Yes, humans are also capable of learning in a similar fashion and imitating, even extrapolating from, a learned function. But I wouldn't call that intelligent, thinking behavior, even when performed by a human.
And no human would ever perform like that without trying to intuitively understand the motivations of the humans they learned from, and naturally intermingling the performance with their own motivations.