top | item 45718929


frotaur | 4 months ago

This barrier is trivial to solve even today. It is not hard to put an LLM in an infinite loop of self-prompting.
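The loop the comment describes can be sketched in a few lines: each completion is fed back as the next prompt. This is a minimal illustration, not any particular framework's API; `call_llm` is a hypothetical stand-in for a real chat-completion call, stubbed here so the example runs, and the step bound exists only so the demo terminates (the comment's version would loop forever).

```python
def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM API call (hypothetical stub)."""
    return f"Thought about: {prompt[:40]}"

def self_prompting_loop(seed: str, max_steps: int = 5) -> list[str]:
    """Feed each completion back in as the next prompt.

    max_steps bounds the demo; the 'infinite loop' in the comment
    would simply drop the bound.
    """
    history = [seed]
    prompt = seed
    for _ in range(max_steps):
        completion = call_llm(prompt)
        history.append(completion)
        prompt = completion  # the model's output becomes its next input
    return history

transcript = self_prompting_loop("What should I do next?", max_steps=3)
print(len(transcript))  # → 4 (seed + 3 completions)
```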


rkomorn | 4 months ago

A self-prompting loop still seems artificial to me. It only exists because you force it to externally.

frotaur | 4 months ago

You only exist because you were forced to be birthed externally? Everything has a beginning.

In fact, what is artificial is stopping the generation of an LLM when it reaches a 'stop token'.

A more natural barrier is the context window, but at 2 million tokens an LLM can think for a long time without losing any context. And memory tools can take over for longer-horizon tasks.
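The "memory tools" idea can be sketched as summary-based compression: when the running transcript nears the context limit, fold the oldest turns into a summary and continue. Everything here is an assumption for illustration; `summarize` is a hypothetical placeholder (a real agent would ask the model to produce the summary), the tokenizer is a crude whitespace split, and the limit is deliberately tiny so the folding triggers.

```python
CONTEXT_LIMIT = 100  # tokens; deliberately tiny for the demo

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer.
    return len(text.split())

def summarize(turns: list[str]) -> str:
    # Hypothetical placeholder: a real system would call the LLM here.
    return f"SUMMARY({len(turns)} earlier turns)"

def add_turn(history: list[str], turn: str) -> list[str]:
    """Append a turn, folding the oldest entries into a summary
    whenever the transcript would exceed the context limit."""
    history = history + [turn]
    while sum(count_tokens(t) for t in history) > CONTEXT_LIMIT and len(history) > 2:
        # Replace the two oldest entries with one summary line.
        history = [summarize(history[:2])] + history[2:]
    return history

history: list[str] = []
for i in range(15):
    history = add_turn(history, f"turn {i} " + "filler " * 8)  # ~10 tokens each
```

The transcript stays under the limit indefinitely, at the cost of older turns surviving only in summarized form.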