lb4r | 2 years ago
>There is no concrete definition so there is no concrete way of deciding if something is intelligent.
You say there is no concrete way of deciding if something is intelligent, yet you yourself have decided that LLMs are not intelligent.
calibas | 2 years ago
He's not saying there's no way of judging intelligence, he's just pointing out there's no universal agreement on what intelligence even is.
Edit: To add, this discussion becomes pure semantics. On one side is a strict definition of AGI, on the other side are the most generalized definitions of artificial intelligence. It gets kind of silly because technically, every "if" statement is a type of "AI" by the loosest definitions.
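To make the "loosest definition" point concrete, here is a minimal sketch of a single-rule "AI" in Python. The function name and threshold are purely illustrative, not from any real system:

```python
# A minimal "AI" under the loosest possible definition: one hard-coded
# rule. The name and threshold here are illustrative assumptions.
def thermostat_ai(temperature_c: float) -> str:
    # A single "if" statement encodes the system's entire "knowledge".
    if temperature_c < 18.0:
        return "heat on"
    return "heat off"

print(thermostat_ai(15.0))  # heat on
```

By this standard the thermostat "decides", which is exactly why the loosest definitions make the debate feel silly.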
lb4r | 2 years ago
Which is why I find it strange that he takes it upon himself to proclaim, in a definitive manner, that LLMs are not intelligent "by any stretch."
ben_w | 2 years ago
Compare "Intelligent Design" vs. the use of genetic algorithms in AI. Simple forms of intelligence can get you a long way and can seem very impressive, especially if they have a lot of experience to draw on, which DNA gets from deep time and which AI gets from transistors outpacing synapses by roughly the same ratio that a pack of wolves outpaces continental drift.
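The "simple intelligence gets you a long way" point can be illustrated with a toy genetic algorithm. This is a hedged sketch with made-up parameters (population size, mutation rate, truncation selection), not any particular library's API; it evolves a random bit string toward all ones using nothing but selection, crossover, and mutation:

```python
import random

# Toy genetic algorithm on the "onemax" problem: evolve a bit string
# toward all ones. All parameters here are illustrative assumptions.
random.seed(0)

GENOME_LEN = 16

def fitness(genome):
    # Count of ones; higher is better.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit independently with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    # Single-point crossover at a random cut.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(20)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        break
    parents = population[:10]  # truncation selection keeps the elite
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(10)
    ]

print(fitness(population[0]))
```

No component of this loop "understands" anything, yet the population reliably climbs toward the optimum, which is the sense in which blind search can look impressively purposeful.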
password54321 | 2 years ago
LLMs are lacking in fluid intelligence, and there is even a good benchmark for it: the Abstraction and Reasoning Corpus (ARC).
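For context, ARC tasks present a few input/output grid pairs and the solver must infer the underlying transformation from those examples alone. The sketch below uses a made-up toy rule (transposition) far simpler than real ARC tasks, just to show the example-based format:

```python
# ARC-style toy task: a few (input grid -> output grid) training pairs;
# the solver must find a rule consistent with all of them. The rule
# here (transpose) is an illustrative assumption, not a real ARC task.
train_pairs = [
    ([[1, 2], [3, 4]], [[1, 3], [2, 4]]),
    ([[0, 5], [5, 0]], [[0, 5], [5, 0]]),
]

def candidate_rule(grid):
    # Hypothesized program: swap rows and columns.
    return [list(row) for row in zip(*grid)]

# A hypothesis is accepted only if it fits every training example.
consistent = all(candidate_rule(x) == y for x, y in train_pairs)
print(consistent)  # True
```

The point of the benchmark is that each task requires inferring a new rule from a handful of examples, which is closer to fluid intelligence than pattern-matching over a large training set.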