I think LLMs are presenting some uncomfortable philosophical questions about how our own brains work; admitting that there is any kind of "intelligence" (even a very basic kind) in an LLM is an admission that our own brains may work in a similar way.
YeGoblynQueenne|2 years ago
For instance (I'm not trying to be mean, and I'm certainly not blaming you in particular, because I've seen this very often): the reasoning that, because LLMs can generate language and humans can generate language, not only are LLMs somehow like humans but humans are also like LLMs, is not sound.
For example: walls have ears, cats have ears, therefore walls are like cats and cats are like walls. That doesn't work, because walls' ears are not like cats' ears, and even if they were, that still wouldn't make walls cats or cats walls; it would just make them both entities with ears.
corethree|2 years ago
Nah. Nobody personifies LLMs like this. What you're laying out here is a fundamental mistake that you'd have to be extremely stupid to make. I think barely anyone is making this mistake, so it hardly even qualifies as worth mentioning.
Seriously, who here thinks that LLMs are anything like humans? That is not the claim. The claim is that LLMs understand you. Intelligence and understanding are clearly orthogonal to being "human-like".
famouswaffles|2 years ago
There is no evidence, basically none whatsoever, that general "perfect logical reasoning" is a thing that actually exists in the real world. None.
No animal we've observed does it. Humans certainly don't do it. The only realm where this idea actually works is fiction, and that was not for lack of trying. Some of the greatest minds worked on this for decades, and some people still don't seem to get it. Logic doesn't scale; it breaks at real-world relationships.
Logic systems are that guy in the stands yelling that he could've made the shot, while he's not even on the field.