(no title)

leetharris | 4 months ago

The main thing holding these Anthropic models back is context size. Yes, quality deteriorates over a large context window, but for some applications that's fine. My company now uses Grok 4 Fast, the Gemini family, and GPT-4.1 exclusively for a lot of operations, purely because of their huge 1M+ token context windows.


Tiberium | 4 months ago

Is your company Tier 4? Anthropic has had 1M context size in beta for some time now.

https://docs.claude.com/en/docs/build-with-claude/context-wi...
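For reference, opting into that beta means sending an extra beta header alongside the normal Messages API headers. Here's a minimal sketch of what such a request looks like; the beta header value and model id below are assumptions, so check the linked docs for the current values:

```python
# Sketch: building an Anthropic Messages API request that opts into the
# long-context beta. The "anthropic-beta" value and the model id are
# assumptions -- verify both against the documentation linked above.
import json

def build_long_context_request(prompt: str) -> dict:
    """Return headers and a JSON body for a long-context request."""
    headers = {
        "x-api-key": "YOUR_API_KEY",                # placeholder credential
        "anthropic-version": "2023-06-01",          # standard version header
        "anthropic-beta": "context-1m-2025-08-07",  # assumed beta flag
    }
    payload = {
        "model": "claude-sonnet-4-20250514",        # assumed model id
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }
    return {"headers": headers, "body": json.dumps(payload)}

req = build_long_context_request("Summarize this large codebase dump.")
```

Without the beta header, requests over the standard context limit are simply rejected, which is why the tier/beta distinction matters here.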

leetharris | 4 months ago

Only for Sonnet. There's no 1M option for Haiku (this new model) or Opus.

That means Gemini 2.5 Flash or Grok 4 Fast takes all the low-end business with large-context needs.

_ink_ | 4 months ago

Is it possible to get that in Claude Code on the Pro plan, or does it already use a 1M context window?