item 46864545

NetworkPerson | 28 days ago

I found Anthropic far too expensive. The entire context of the conversation is sent each time you type anything. I switched to a local model running under Ollama. It's not quite as smart as Opus, but good enough for my needs.
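A rough sketch of why resending the full history gets expensive (illustrative arithmetic only, not Anthropic's actual pricing or tokenizer): if every turn resends the whole conversation as input, billed input tokens grow roughly quadratically with the number of turns.

```python
# Illustrative: with no caching, turn k resends the entire history,
# so billed input tokens grow roughly quadratically with turn count.
def total_input_tokens(turns, tokens_per_message):
    total = 0
    history = 0
    for _ in range(turns):
        history += tokens_per_message  # user message appended to history
        total += history               # full history sent as input this turn
        history += tokens_per_message  # assistant reply appended afterwards
    return total

# 20 turns of ~500-token messages: only ~20k tokens end up in context,
# but 200k input tokens get billed across the conversation.
print(total_input_tokens(20, 500))  # → 200000
```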


CjHuber | 27 days ago

Does it not use prompt caching?
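For context, prompt caching is opt-in on the Anthropic API: you mark a stable prefix (system prompt, long documents, earlier turns) with `cache_control`, and repeat requests bill those tokens at a reduced cached rate. A minimal sketch of the request payload (model name and prompt text are placeholders; sending it requires an API key, so this only builds the body):

```python
# Sketch of an Anthropic Messages API request using prompt caching:
# the cache_control marker tells the API to cache everything up to
# and including that block, so later turns reuse it cheaply.
request = {
    "model": "claude-opus-4",  # placeholder model name
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": "Long, unchanging system prompt goes here...",
            "cache_control": {"type": "ephemeral"},  # cache boundary
        }
    ],
    "messages": [{"role": "user", "content": "latest user turn"}],
}
print(request["system"][0]["cache_control"])  # → {'type': 'ephemeral'}
```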