cel1ne | 5 years ago
We don't really know what "understanding of the world" means in humans. We just "see it when it's there".
We might be radically different from GPT-3, or we might not. Our way of learning is certainly different in some ways.
Something that came to my mind: various GPT-3 answers resemble answers given by children: mostly correct, but with some crucial point misunderstood.
In real human learning and conversation these points are easily corrected by feedback and explanation: "You see, the point is no one wears bathing suits to work."
Which would then be incorporated as new wisdom.
Maybe this feedback mechanism is what GPT-3 is missing. Maybe we should talk to it.