The keyword of that study is consciousness, which I'd consider a separate goal from "intelligence". LLM proponents are aware that their architecture lacks many parts of what constitutes a complete brain, and there are other AI researchers who disagree that LLMs will lead to either AGI or consciousness. I largely consider these tangential to the topic. A neural-net simulation of a virtual reality does not need consciousness; it only has to model the consequences of agentic actions.
Marshferm|3 months ago
“We refute (based on empirical evidence) claims that humans use linguistic representations to think.” Ev Fedorenko Language Lab MIT
CrackerNews|3 months ago