pkaye | 13 days ago

Above a 200k-token context they charge a premium. I think it's $10/M tokens of input.

_ink_ | 13 days ago

Interesting. Is it just because they can, or is it really more expensive for them to process a bigger context?

cube2222 | 13 days ago

Attention is, at its core, quadratic wrt context length. So I'd believe that to be the case, yeah.

pkaye | 13 days ago

I've read that compute costs for LLMs go up O(n^2) with context window size. But I think it's also a combination of limited compute availability, users' preference for Anthropic models, and Anthropic planning an IPO.
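
To make the O(n^2) point concrete, here's a rough NumPy sketch (head size and context lengths are made up for illustration): the QK^T score matrix in self-attention has n^2 entries for a length-n context, so that part of the work grows quadratically.

    import numpy as np

    def attention(Q, K, V):
        # Q, K, V: (n, d) for one head. The (n, n) score matrix is the quadratic part.
        d = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d)                      # n^2 entries to compute and store
        weights = np.exp(scores - scores.max(-1, keepdims=True))
        weights /= weights.sum(-1, keepdims=True)
        return weights @ V                                 # roughly n^2 * d more multiply-adds

    d = 128
    for n in (1_000, 200_000):
        flops = 2 * n * n * d                              # QK^T alone, per head per layer
        print(f"n={n:>7}: ~{flops:.1e} FLOPs just for the score matrix")

Going from a 1k to a 200k context is 200x more tokens but roughly 40,000x more work in that term alone, which is one plausible reason long-context input gets priced at a premium.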