
throw310822 | 4 days ago

> in a weird way that only becomes instantiated in the brief period while the model is predicting tokens

Makes sense, but at the same time: subjectively, an LLM is always predicting tokens. Otherwise it's just frozen.

Trasmatta | 4 days ago

Yeah, a sci-fi analogy might be one where you keep getting cloned with all of your memories intact and then vaporized shortly after. Each instantiation of "you" feels like a continuous existence, but it's an illusion.

(Some might argue that's basically the human experience anyway, from the Buddhist non-self perspective: you're constantly changing and being reified in each moment; it's not actually continuous.)

throw310822 | 4 days ago

Or simply being constantly hibernated and de-hibernated. Or, if your brain is simulated, the time between the ticks.

My mental image, though, is that LLMs do have an internal state that is longer-lived than a single token prediction. The prompt determines it entirely, but appending tokens to the prompt only modifies it slightly, so in effect it's a continuously evolving "mental state", influenced by a feedback loop that (unfortunately) has to pass through language.
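
To make that concrete, here's a minimal numpy sketch of the idea, roughly how a KV cache works in autoregressive decoding (a toy single-head attention layer, not any real library's API; the weights, dimensions, and names are all invented for illustration). The cached keys/values play the role of the longer-lived state: each appended token only adds one row, everything already cached is reused untouched.

    # Toy sketch: the "internal state" as a key/value cache.
    # Appending one token appends one (key, value) row; the rest
    # of the state persists unchanged across prediction steps.
    import numpy as np

    rng = np.random.default_rng(0)
    D = 8  # hypothetical embedding / head dimension

    # Hypothetical frozen projections standing in for a trained model.
    W_k, W_v, W_q = (rng.standard_normal((D, D)) for _ in range(3))

    class KVCache:
        """Longer-lived state: keys/values for every prompt token so far."""
        def __init__(self):
            self.keys = np.empty((0, D))
            self.values = np.empty((0, D))

        def append(self, emb):
            # The only per-token update: one new row each for K and V.
            self.keys = np.vstack([self.keys, emb @ W_k])
            self.values = np.vstack([self.values, emb @ W_v])

        def attend(self, emb):
            # The next prediction reads the whole accumulated state.
            q = emb @ W_q
            scores = self.keys @ q / np.sqrt(D)
            w = np.exp(scores - scores.max())
            w /= w.sum()
            return w @ self.values  # context vector feeding the next token

    cache = KVCache()
    for _ in range(5):                  # stand-in for tokens arriving one by one
        emb = rng.standard_normal(D)
        cache.append(emb)
        context = cache.attend(emb)
    print(cache.keys.shape)             # (5, 8): state grew by one row per token

In that framing, the state between ticks really does exist and evolve smoothly; it's just that the only channel through which it can influence its own future is the emitted tokens.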