top | item 38324689

bytefactory | 2 years ago

Completely agree. The System 1/System 2 distinction seems relevant here. As powerful as transformers are with just next-token generation and context, which can be hacked to form a sort of short-term memory, some form of real-time learning plus long-term memory storage seems like an important research direction.
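To make the distinction concrete, here is a toy sketch (the `ContextMemory` class and its token budget are hypothetical, not from any real system) of the "hack" the comment describes: recent turns are replayed in a rolling context window and silently forgotten once evicted, while an explicit long-term store survives eviction.

```python
from collections import deque

class ContextMemory:
    """Toy illustration of context-window-as-short-term-memory.

    Recent turns are replayed in the prompt, and anything that falls
    off the window is forgotten unless explicitly written to a
    separate long-term store.
    """

    def __init__(self, max_tokens=50):
        self.max_tokens = max_tokens
        self.turns = deque()   # short-term: recent dialogue turns
        self.long_term = {}    # long-term: explicit key/value store

    def _tokens(self, text):
        return len(text.split())  # crude whitespace "tokenizer"

    def add_turn(self, text):
        self.turns.append(text)
        # Evict oldest turns once the window exceeds the token budget.
        while sum(self._tokens(t) for t in self.turns) > self.max_tokens:
            self.turns.popleft()

    def remember(self, key, value):
        # Writes to long-term storage survive context eviction.
        self.long_term[key] = value

    def prompt(self):
        # What the model would actually "see" at generation time.
        return "\n".join(self.turns)
```

With a small budget, an early fact falls out of the short-term window after enough filler turns, but an explicit `remember` call persists, which is roughly the gap the comment argues real-time learning should close.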
