top | item 40923480

charlescurt123 | 1 year ago

Doing any job for more than an hour without completely forgetting its goals and tasks

sberens | 1 year ago

How long do you expect LLMs/agents to be unable to do this?

charlescurt123 | 1 year ago

Good question. I'm working on exactly this; I suppose you could call it the replacement of RAG.

It's actually not easy to achieve. I could give a very long-winded answer (don't tempt me), but suffice it to say it's a resolution problem.

All AI models have a fixed resolution at creation. Long-running tasks focus on an ever-narrowing space at each step, so the resolution required for an indefinitely long task is infinite resolution.

No 9s of error will ever fix this.

Funnily enough, small animals do this with ease, so I strongly disagree with the idea that our AIs outcompete even small mammals in every way.

nvy | 1 year ago

Personally, I think that phenomenon (along with "hallucinations") is fundamentally baked into LLMs writ large.

I think LLMs are a dead end on the path to AGI.