empyrrhicist | 24 days ago
Must it? I fail to see why it "must" be... anything. Dumping tokens into a pile of linear algebra doesn't magically create sentience.
ben_w | 24 days ago
More precisely: we don't know which linear algebra in particular magically creates sentience.
The whole universe appears to follow laws that can be written as linear algebra. Our brains are sometimes conscious and aware of their own thoughts; other times they're asleep, and we don't know why we sleep.
habinero | 24 days ago
"This statistical model is governed by physics": true
"This statistical model is like our brain": what? no
You don't gotta believe in magic or souls or whatever to know that brains are much much much much much much much much more complex than a pile of statistics. This is like saying "oh we'll just put AI data centers on the moon". You people have zero sense of scale lol
judahmeek | 24 days ago
Garbage collection, for one thing. Transferring short-term memory to long-term memory is another. There are undoubtedly more processes optimized for, or through, sleep.
empyrrhicist | 23 days ago
Seriously - the language used here makes a wild claim in this context.
empyrrhicist | 23 days ago
My objection was:
1. I don't personally think anything similar is happening right now with LLMs.
2. I object to the OP's implication that it is obvious such a phenomenon is occurring.
pixl97 | 24 days ago
Your response is at the level of a thought-terminating cliché. You gain no insight into the operation of the machine with that line of thought. You can't make predictions about future behavior, and you can't make sense of past responses.
It's even funnier when it comes to humans and feeling wetness... you don't. You only feel temperature change.