top | item 39858274

kungito | 1 year ago

I'm one of those people. To me, those things only sounded like a different prompt: priorities set for the LLM.


goatlover | 1 year ago

Isn’t that taking the analogy too literally? You’re saying nature is prompting humans to generate the next token to be output? What about all the other organisms that don’t have language? How do you distinguish nature prompts from nature training datasets? What makes you think nature is tokenized? What makes you think language generation is fundamental to biology?

TRDRVR | 1 year ago

Here's the hubris of thinking that way:

I would imagine the baseline assumption of your thinking is that things like sleep and emotions are 'bugs' in terms of cognition (or, at the very least, 'prompts' that are optional).

Said differently, the assumption is that with the right engineering, you could reach human-parity cognition with a model that doesn't sleep or feel emotions. (After all, what's the point of an LLM if it gets tired and doesn't want to answer your questions sometimes? Or, even worse, knowingly deceives you because it's mad at you or prejudiced against you?)

The problem with that assumption is that, as far as we can tell, every being with even the slightest amount of cognition sleeps in some form and has something akin to emotional states. As far as we can prove, sleep and emotions are necessary preconditions for cognition.

A worldview where the 'good' parts of the brain (reasoning and logic) are replicated in LLMs but the 'bad' parts (sleep, hunger, emotions, etc.) are not is likely an incomplete model.

kelseyfrog | 1 year ago

Do airplanes need sleep because they fly like birds who also require sleep?