
boxedemp | 6 days ago

I've literally not met one person in tech who thinks LLMs will become sentient or conscious. But I always see people online claiming that there are lots of people who believe that.

Where are they?

Are we sure that's not a misunderstanding of the terminology? Artificial diamonds, such as cubic zirconia, are not diamonds, and nobody thinks they are. 'Artificial' means it's not the real thing. When will conscious, actual intelligence be called 'synthetic intelligence' instead of 'artificial'?

Incidentally, this comment was written by AI.


grogers|6 days ago

It's not your main point, but I can't help but point out that artificial diamonds ARE diamonds. Cubic zirconia is a different mineral. Usually the distinction is "natural" vs "lab grown" diamonds.

When computers have super-human level intelligence, we might be making similar distinctions. Intelligence IS intelligence, whether it comes from a machine or an organism. LLMs might not get us there, but some machine eventually will.

maest|6 days ago

I agree, but as a nit, the industry uses "earth mined" instead of "natural", presumably because it's more precise (and maybe less normative?)

windows2020|5 days ago

Well, unless intellect is immaterial.

andai|6 days ago

Interesting. Artificial does have a negative connotation to it, I never considered that.

Synthetic sounds more neutral, aside from bringing microplastics to my mind.

I guess the field of artificial life has the same issue.

As another comment pointed out, you don't necessarily need consciousness for intelligence. And you don't need either of those for goal oriented behavior.

My favorite example is the humble refrigerator. (The old one, without the microchips!) It has a goal (target temperature), it senses its environment (current temperature), and takes action based on that (turn cooling on or off).
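The fridge's sense/compare/act loop is just a bang-bang (on/off) controller with hysteresis. A minimal sketch, with illustrative target and deadband values (the function name and parameters are assumptions, not from any real appliance firmware):

```python
def cooling_decision(current_temp, cooling_on, target=4.0, deadband=1.0):
    """Return True if the compressor should run.

    Turns cooling on above target + deadband, off below target - deadband,
    and otherwise keeps the previous state. The hysteresis band avoids
    rapid on/off cycling near the setpoint.
    """
    if current_temp > target + deadband:
        return True   # too warm: start cooling
    if current_temp < target - deadband:
        return False  # cold enough: stop cooling
    return cooling_on  # inside the band: no change
```

Sense (current_temp), compare to the goal (target), act (run or stop the compressor): goal-directed behavior with no microchip required, let alone consciousness.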

A cuter example is the dandelion seed. It "wants" to fly. Obviously! So you can display goal directed behavior as the result of natural forces moving through you. (Arguably electricity and glucose also fall in that category, but... Yeah...)

LLMs, conscious or not, moved into that category this year, in a big way. (e.g. Opus and Codex routinely bypassing security restrictions in the pursuit of the goal.)

Does it really have goals, or does it merely appear to act as though it has them? Does it appear to act as though it has consciousness?

(I forget who said it: it won't really disrupt the global economic system, it will merely appear to do so ;)

Also, here I am! :)

palmotea|5 days ago

> I've literally not met one person in tech who thinks LLMs will become sentient or conscious. But I always see people online claiming that there are lots of people who believe that.

I haven't met him, but a famous (pre-ChatGPT) counterexample is Blake Lemoine:

> In June 2022, LaMDA gained widespread attention when Google engineer Blake Lemoine made claims that the chatbot had become sentient. (https://en.wikipedia.org/wiki/LaMDA).

It's also not uncommon here to see someone respond to a comment questioning the consciousness or sentience of LLMs with a question along the lines of "how do you know anyone is conscious/sentient?" They're not being direct with their beliefs (I believe as a kind of motte-and-bailey tactic), but the implication is that they think LLMs are sentient and bristle when someone suggests otherwise.

sshine|5 days ago

> When will conscious, actual intelligence be called 'synthetic intelligence' instead of 'artificial'?

One can bypass the whole sentience discussion and say that AI stands for Automated Inference.

If actual, conscious intelligence were to manifest synthetically, as in silicon-based rather than carbon-based, it is a losing battle to convince people because of the philosophical “problem of other minds.”

If there is a functional equivalence between meatspace intelligence and synthetic, it will surely have enough value to reinforce itself, philosophical debates aside.

tim333|5 days ago

AI becoming conscious is different to LLMs doing so. Maybe more people are claiming that? I think AI will but LLMs won't.

It depends a bit on what you mean by conscious, but assuming it's human-like, then it incorporates a lot of feelings, vision, sound, thoughts and the like, things that are not really language. But we do it with neurons and some chemicals, and I imagine you could do something like that with artificial neural networks and some computer version of the chemistry, but not with language alone.

mullingitover|6 days ago

> LLMs will become sentient or conscious

I've always doubted it, but then again I've also been skeptical about claims that humans have these capabilities.

rickydroll|5 days ago

An interesting parallel would be to look at what it took for humans to accept that sapience existed in non-humans, especially non-human primates.

On terminology, I would argue for non-biological intelligence. People can be awfully bioist (biological racist).

jamesfinlayson|6 days ago

> But I always see people online claiming that there are lots of people who believe that.

I saw someone on the news claiming this recently, but he ran an AI consultancy firm so I suspect he was trying to drum up business.

melagonster|6 days ago

>LLMs will become sentient or conscious.

People who declare that AGI is coming.

mattclarkdotnet|6 days ago

AGI is completely orthogonal to consciousness. Crows seem pretty conscious to me, as does my cat, but I have no way to test or prove it. They are intelligent though.

mattclarkdotnet|6 days ago

What? Nobody says cubic zirconia is an artificial diamond, it’s just a different shiny crystal. We have loads of actual artificial diamonds, so cheap you can get a cutting disc made from them for $10 at Home Depot.

And nobody working in the space either as ML/AI practitioners, or as philosophers, or as cognitive scientists, even thinks we know what consciousness is, or what is required to create it. So there would be no way to tell if an AI is conscious because we haven’t yet managed to reliably tell if humans, or dogs, or chimpanzees or whales are conscious.

The claim that is often made is that more work on the current generation of AI tech will lead to AGI at a human or better level. I agree with Yann Lecun that this is unlikely.

WalterBright|6 days ago

I'm pretty sure mammals and birds are conscious. Insects, probably not.

pllbnk|5 days ago

Lucky you. I have personally faced some cargo cult-like behavior.