
kcartlidge | 4 months ago

Why are we hearing that "studies" have "uncovered the concept of context rot as the number of tokens in the context window increases"? It's obvious, and we've always known this.

Agents are stateless, hence the need for context. This means that all they know about the ongoing session is what's in that context (generally speaking). As the context grows, any particular element within it becomes a smaller and smaller percentage of the whole. The LLM is not 'losing focus'; it's being diluted with more tokens. But then I suppose anthropomorphism comes naturally to a company named Anthropic, and 'losing focus' does make it sound more human.
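The dilution point can be sketched numerically. This is a toy model, not any real system's attention: with n context tokens of roughly comparable relevance, softmax hands each one a share of attention on the order of 1/n, so a fixed instruction's share shrinks as the context grows.

```python
import math
import random

def attention_share(n, seed=0):
    """Toy sketch: share of softmax attention held by the first of
    n tokens, with random stand-in scores of comparable magnitude."""
    rng = random.Random(seed)
    scores = [rng.gauss(0, 1) for _ in range(n)]   # stand-in logits
    weights = [math.exp(s) for s in scores]
    return weights[0] / sum(weights)               # first token's share

for n in (10, 100, 1000):
    print(f"n={n:5d}  share of token 0 ~ {attention_share(n):.4f}")
```

With comparable scores the first token's share falls roughly as 1/n: nothing about the token changed, only how much else is competing with it.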

They didn't need a study and an article, but it likely contributes to the mystique. Hence the use of phrases like "this results in n² pairwise relationships for n tokens" to make it sound more erudite and revelatory.
