
CrackerNews | 3 months ago

The keyword of that study is consciousness, which I'd consider a separate goal from "intelligence". LLM proponents are aware that their architecture lacks many parts of what constitutes a complete brain, and there are other AI researchers who disagree that LLMs will lead to either AGI or consciousness. I largely consider these tangential to the topic. A neural net simulation of a virtual reality does not need consciousness; it only has to model the consequences of agentic actions.


Marshferm | 3 months ago

It’s not a keyword, it’s the seat of intelligence. What coders don’t grasp is that nothing related to symbols, metaphors, words, or language manifests as consciousness and/or intelligence. Your field is a wash.

“We refute (based on empirical evidence) claims that humans use linguistic representations to think.” — Ev Fedorenko, Language Lab, MIT

CrackerNews | 3 months ago

When I look up that quote, it leads back only to Hacker News comments, and it is a strange way to make a citation. You make blanket statements that are easily argued against, and now you respond with this nonsense. I accuse you of being an LLM bot.