Completely agree. The System 1/System 2 distinction seems relevant here. As powerful as transformers are with just next-token generation and in-context conditioning, which can be hacked into a sort of short-term memory, some form of real-time learning plus long-term memory storage seems like an important research direction.
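For concreteness, here's a minimal sketch of the "context as short-term memory" hack I mean: keep a rolling buffer of recent turns and replay it in every prompt. The `llm_generate` function is a hypothetical stand-in for whatever model call you'd actually use, not any particular library's API.

```python
from collections import deque

def llm_generate(prompt: str) -> str:
    """Hypothetical stand-in for a real model call (assumption)."""
    return f"<model response conditioned on {len(prompt)} chars of prompt>"

class ShortTermMemoryChat:
    """Fakes short-term memory by replaying recent turns in the context window.

    Nothing is actually learned: once a turn falls out of the buffer it is
    gone, which is exactly the limitation the comment above points at.
    """

    def __init__(self, max_turns: int = 8):
        # Rolling buffer of (user, model) turns; old turns are evicted.
        self.history = deque(maxlen=max_turns)

    def ask(self, user_msg: str) -> str:
        # Replay the buffer as plain text so the model can condition on it.
        context = "\n".join(f"User: {u}\nModel: {m}" for u, m in self.history)
        prompt = f"{context}\nUser: {user_msg}\nModel:"
        reply = llm_generate(prompt)
        self.history.append((user_msg, reply))
        return reply
```

The point of the sketch is that the "memory" lives entirely in the prompt: real-time learning would mean updating something persistent (weights, an external store) instead of just re-reading the window.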