keepper | 1 year ago
The oddity is that LLMs are sounding... too... human... reacting to information and regurgitating it a bit too much like the average "learned person", adding a ton of extrapolation on top of the "facts", just as we would.
LLMs sound like any pundit on any random TV show, newspaper, or blog. The goal was never "fact"; it was sounding human, and "intelligence", whose working definition in this case has been... sounding human. Not being right.
Now, the question of whether they should or shouldn't do this...