chordalkeyboard | 2 years ago
The experience of being human is what allows me to infer meaning from the words, phrases, sentences, etc. that a human generates. It is what lets me make the leap from text to understanding (or to a lack of understanding, incomplete understanding, confusion, or deception) with human-generated responses. This is what I have in the case of humans, which allows me to interpret their statements one way, and what I lack with machines, which means I have no basis for inferring understanding the same way I do with a human. If I were not human, I would not be able to infer meaning from the noises a human makes, except by observing correlations between those noises and their behavior. This is well understood in cognitive science and the study of animal behavior.
> To understand something means that one can create novel answers to questions about something. Those answers, however, must make sense with the rules that govern the "something" at hand. This answer must also not be "memorized" in some sort of giant query-response lookup table.
ChatGPT is functionally equivalent to a lookup table with randomization.
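
As a toy sketch of what "a lookup table with randomization" could mean here (the prompts, canned responses, and the respond helper are all invented for illustration, not anything ChatGPT actually contains):

    import random

    # Hypothetical prompts mapped to canned responses; "randomization"
    # here just means sampling one stored response per query.
    LOOKUP = {
        "emulate a bash terminal: mkdir projects": [
            "$ mkdir projects\n$ ",
            "$ mkdir projects\n$ ls\nprojects\n$ ",
        ],
        "what is 2 + 2?": ["4", "2 + 2 = 4"],
    }

    def respond(prompt: str) -> str:
        # Return one stored response at random for a known prompt.
        return random.choice(LOOKUP.get(prompt, ["I don't understand."]))

    print(respond("emulate a bash terminal: mkdir projects"))

The point of the sketch is only that varied, plausible-looking output is compatible with pure retrieval plus sampling; variety alone does not demonstrate understanding.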
> For example, if I ask chatGPT to emulate a bash terminal and create a new directory, it can do so, indicating it understands how a filesystem works. That is understanding.
It replies with text that is a probabilistic representation of what one might find on the internet in response to such a query. The emulation occurs in your mind, when you read the response and assign meaning to the words and phrases it contains.
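
A small sketch of the distinction (assuming nothing about ChatGPT's internals): an actual mkdir changes filesystem state that any process can observe, while the model's "bash emulation" is only a string shaped like a transcript.

    import os
    import tempfile

    with tempfile.TemporaryDirectory() as root:
        # A real mkdir mutates filesystem state.
        os.mkdir(os.path.join(root, "newdir"))
        print(os.path.isdir(os.path.join(root, "newdir")))  # True

    # The "emulation" is only a string that looks like a shell transcript;
    # no directory is created anywhere. The reader supplies the meaning.
    fake_transcript = "$ mkdir newdir\n$ ls\nnewdir"
    print(fake_transcript)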
> However understanding things is an aspect of being human and chatGPT captures a part of that aspect.
You have not shown that ChatGPT is anything more than a fancy lookup table with some randomization.