garymarcus | 9 months ago
The results aren’t pretty. 0/5, no two maps alike.
“Smart” means understanding abstract concepts and combining them well, not just retrieving and analogizing in shoddy ways.
No way could a system this wonky actually get a PhD in geography. Or economics. Or much of anything else.
rvz|9 months ago
It's not there yet, it's still learning™, but a lot of progress in AI has happened recently; I'll give them that.
However, as you already point out in your newsletter, there are also plenty of misleading and dubious claims, and too much hype aimed at raising VC capital, which goes hand in hand with the overpromising in AI.
One of them is the true meaning of "AGI" (right now it is starting to look like a scam), since there are several conflicting definitions coming directly from those who stand to benefit.
What do you think it truly means given your observations?
ben_w|9 months ago
I wouldn't call LLMs "smart" either, but with a different definition than the one you use here: to me, at the moment, "smart" means being able to learn efficiently, with few examples needed to master a new challenge.
This may not be sufficient, but it does avoid any circular argument about whether a given model has any "understanding" at all.
jqpabc123|9 months ago
As should be expected, sometimes it predicts correctly and sometimes it doesn't.
It's kinda like FSD mode in a Tesla. If you're not willing to bet your life on it (and why would you?), it's really not all that useful.