I’ve been daydreaming lately about what the fundamental limits of “intelligence” could be, something like the concept of computability but for AI, or even biological brains.
What evidence did we have that LLMs would be such transformative technologies before they suddenly appeared with such surprising behavior? I'm not sure we always need to be looking for evidence for potentially surprising and disruptive tech.
They can "feel it", like people "felt" we'd have commercial space flight "soon" after we put people on the moon. It's all delusion and wishful thinking.
It's worse than that, really, because there was at least a fairly obvious _path_ there, even if the economics were, to say the least, shaky. For AGI... not so much.
dsr_|5 months ago
(I'm not denying the possibility. I'm proclaiming a lack of evidence.)
slaterbug|5 months ago
Though I will say, surely the existence of the human brain (which by definition is general intelligence), suggests that creating AGI is fundamentally possible?
clueless|5 months ago
lm28469|5 months ago
rsynnott|5 months ago