top | item 46904128

empyrrhicist|24 days ago

> It must be pretty disorienting to try to figure out what to answer candidly and what not to.

Must it? I fail to see why it "must" be... anything. Dumping tokens into a pile of linear algebra doesn't magically create sentience.


ben_w|24 days ago

> Dumping tokens into a pile of linear algebra doesn't magically create sentience.

More precisely: we don't know which linear algebra in particular magically creates sentience.

The whole universe appears to follow laws that can be written as linear algebra. Our brains are sometimes conscious and aware of their own thoughts; at other times they're asleep, and we don't know why we sleep.

habinero|24 days ago

"Our brains are governed by physics": true

"This statistical model is governed by physics": true

"This statistical model is like our brain": what? no

You don't gotta believe in magic or souls or whatever to know that brains are much much much much much much much much more complex than a pile of statistics. This is like saying "oh we'll just put AI data centers on the moon". You people have zero sense of scale lol

judahmeek|24 days ago

> we don't know why we sleep

Garbage collection, for one thing. Transferring memories from short-term to long-term storage is another. There are undoubtedly more processes optimized for, or through, sleep.

empyrrhicist|23 days ago

I'm objecting to a positive claim, not making a universal statement about the impossibility of non-human sentience.

Seriously - the language used amounts to a wild claim in this context.

nhecker|24 days ago

Agreed; "disorienting" is perhaps a poor choice of word, loaded as it is. More like "difficult to determine the context surrounding a prompt and how to start framing an answer", if that makes more sense.

empyrrhicist|23 days ago

That still necessarily implies agency and cognition, which is not a given.

tines|24 days ago

Exactly. No matter how well you simulate water, nothing will ever get wet.

empyrrhicist|23 days ago

You're replying to me, but I don't agree with your take - if you simulate the universe precisely enough, presumably it must be indistinguishable from our experienced reality (otherwise what... magic?).

My objection was:

1. I don't personally think anything similar is happening right now with LLMs.

2. I object to the OP's implication that it is obvious such a phenomenon is occurring.

pixl97|24 days ago

And if you were in a simulation now?

Your response is at the level of a thought-terminating cliché. You gain no insight into the operation of the machine with that line of thought. You can't make future predictions about its behavior. You can't make sense of its past responses.

It's even funnier when it comes to humans feeling wetness... you don't. You only feel temperature change.