ninjanomnom | 1 year ago
Honestly the more I see from LLMs the more it makes me think that we are much the same. Imagine a network of many different LLMs each given different capabilities and prompts, each able to communicate with others. Now imagine splitting this network in half, wouldn't the resulting adjustment look similar to a split brain patient in humans? Are "you" possibly just the LLM that has been given control over the speech and other intentional body actions?
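The thought experiment above can be made concrete with a toy sketch (all agent names and links here are hypothetical, purely for illustration): model the "network of LLMs" as agents connected by communication links, then split the network and see which links are severed, analogous to a severed corpus callosum isolating the two hemispheres.

```python
# Toy illustration of the "network of specialized models" thought experiment.
# Agent names and links are invented for illustration, not a real architecture.

AGENTS = ["speech", "vision", "motor", "memory", "planning", "emotion"]

# Undirected communication links between agents (illustrative only).
LINKS = {
    ("speech", "memory"), ("speech", "planning"),
    ("vision", "motor"), ("vision", "memory"),
    ("planning", "motor"), ("emotion", "planning"),
}

def split(links, left):
    """Partition the network: keep only links whose endpoints share a side."""
    left = set(left)
    kept = {(a, b) for a, b in links if (a in left) == (b in left)}
    severed = links - kept
    return kept, severed

kept, severed = split(LINKS, left={"speech", "memory", "planning"})
print("still connected:", sorted(kept))
print("severed by the cut:", sorted(severed))
```

After the cut, the "speech" agent still talks to memory and planning but can no longer reach vision or motor — the half that speaks no longer knows what the other half perceives, which is the parallel being drawn to split-brain patients.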
verisimi | 1 year ago
I don't see how that can be. You have no subjective experience of anyone else. For all you know, you could be a 'brain in a jar' with inputs coming in via wires, and other people could be part of the illusion. The only thing you can say you know is your own present experience - it is infinitely special. Sure, when you consider yourself objectively, you appear to be just one of many people. But there's no getting away from the fact that your own experience is infinitely special to you. And it has meaning/value to you because of emotions. Animals seem to have emotions too, but not at the same level. Machines give no indication of emotions. AIs appear to have emotions, but this is a simulation/illusion - like those medieval wooden automata, but better.
> I can understand people entirely different to myself so I can empathize and work with them.
This is a form of projection, imo. You have no idea about what is going on inside others. You only have appearance to guide you.
> Honestly the more I see from LLMs the more it makes me think that we are much the same.
If you take a purely objective, materialistic viewpoint, why not consider it the other way - that we ourselves are a sort of AI, our hardware being wetware/bodies - just mechanistic. Do you think you are an AI? (As was illustrated in Westworld?)
ben_w | 1 year ago
Is it "projection" to think we have insight into other minds? Sure, but if you do that projection by default then you won't think that you're special — to project like that is precisely to lack that thought.
> And it has meaning/value to you because of emotions. Animals seem to, but these are not at the same level. Machines give no indication of emotions. AIs appear to have emotions, but this is a simulation/illusion - like those medieval wooden toys, but better.
Likewise for this: there are many cases where humans project meaning/value onto even inanimate things, or onto what we merely imagine other people might be doing — this is why "blasphemy" is a concept and not just a string of letters/phonemes, and also why people respond with real and violent actions to propaganda that demonises some minority or individual.
Do any AI have emotions[0], or are LLMs merely good at pretending? Both can be true at the same time (LLMs being only one of many kinds of AI), or exclusively one, or neither.
[0] regardless of if this is meant as "the qualia of emotions" or "something which has a functional influence on the network's output that is similar to the influence on human brains of hormones associated with emotions"