I indexed a few comments from the other post and put this to the AI itself:
pastel-mature-herring~> The user is questioning the validity of the statement that "cognition requires a grounded representation of the world that is not composed of abstract symbols; a simulation provides just such a model." I want you, nickel-scorpion, to answer the question as you see it.
nickel-scorpion|> There is no one answer to this question. It depends on how you define "cognition" and "abstract symbols." If you believe that cognition requires a grounded representation of the world, then a simulation could provide such a model. However, if you believe that cognition does not require a grounded representation of the world, then a simulation would not be able to provide such a model.
Honestly, yes, I don’t think it’s that far off. LLMs are a series of relatively simple transformer blocks chained together, through which we can simulate thought to the point that it not only passes the Turing test but is genuinely useful.
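To illustrate what “relatively simple blocks chained together” means here, a minimal sketch using PyTorch; the dimensions and layer choices are my own illustrative assumptions, not any particular model’s actual architecture:

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """One 'relatively simple' transformer block: attention plus an MLP."""
    def __init__(self, dim: int, heads: int):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, x):
        h = self.norm1(x)
        a, _ = self.attn(h, h, h)            # self-attention
        x = x + a                            # residual around attention
        return x + self.mlp(self.norm2(x))   # residual around MLP

# "Chained together": the whole model body is just this block, stacked.
# 12 layers / 512 dims are arbitrary example values.
model = nn.Sequential(*[Block(dim=512, heads=8) for _ in range(12)])
x = torch.randn(1, 16, 512)   # (batch, tokens, embedding dim)
print(model(x).shape)         # torch.Size([1, 16, 512])
```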
This is a bit of extrapolation, but I would say the reason we’ve been unable to locate “consciousness” in the brain is that it’s the same thing: relatively simple neurones, chained together, to create thought.
On a philosophical level, this doesn’t make any claims for idealism or materialism; “experience” could exist at a more fundamental level of reality than matter. But IMO that would mean that the LLM is “experiencing” as well.
Either this is a missing layer and we’ll get there soon, or one could say verification is already happening through the statistical model. We need to see if we can train hallucinations out by having the model rely on a stable data store to retrieve facts, rather than trying to pull them from its own weights. This would still be similar to how the brain has discrete components for memory storage and retrieval.
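To make that concrete, here is a minimal sketch of the idea: check a stable data store before falling back on the model. The fact_store dict and generate function are hypothetical stand-ins made up for illustration, not any real library’s API.

```python
# A grounded store of facts, kept outside the model's weights.
fact_store = {
    "boiling point of water": "100 °C at 1 atm",
    "speed of light": "299,792,458 m/s",
}

def generate(prompt: str) -> str:
    """Stand-in for the statistical model: fluent but unverified output."""
    return f"[model guess for: {prompt}]"

def answer(query: str) -> str:
    # Retrieval step: prefer the stable store over the model's recall,
    # analogous to the brain using a discrete memory system.
    for key, fact in fact_store.items():
        if key in query.lower():
            return f"{fact} (retrieved)"
    return generate(query)  # fall back to the model only when we must

print(answer("What is the boiling point of water?"))  # grounded answer
print(answer("Who won the 1987 chess olympiad?"))     # model fallback
```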
Retric|3 years ago
Because language models like ChatGPT have significant limitations you don’t, most notably when it comes to novel stimuli.