item 46907765

tumdum_ | 24 days ago

> On top of that, Anthropic is losing money on it.

It seems they are *not* losing money on inference: https://bsky.app/profile/steveklabnik.com/post/3mdirf7tj5s2e


byzantinegene | 23 days ago

No, and that is widely known. The actual problem is that margins at that scale are not sufficient to make up for the gargantuan cost of training their SOTA models.

JamesBarney | 20 days ago

They are large enough to cover their previous training costs but not their next gen training costs.

I.e., they made more money on 3.5 than 3.5 cost to train, but not enough money on 3.5 to fund training 4.0.
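The dynamic described above can be sketched with some arithmetic. All figures below are invented for illustration; they are not Anthropic's (or anyone's) actual numbers, and the model names are just labels:

```python
# Hypothetical figures ($M), purely illustrative -- not real financials.
train_cost_35 = 100   # assumed cost to train the "3.5" generation
revenue_35 = 300      # assumed lifetime revenue from serving 3.5
infer_cost_35 = 120   # assumed inference/serving cost over that lifetime
train_cost_40 = 500   # assumed next-gen training cost, growing much faster

# Inference itself is profitable: revenue exceeds serving cost.
gross_margin = revenue_35 - infer_cost_35  # 180

# The margin covers the *previous* generation's training bill...
covered_previous = gross_margin > train_cost_35  # True

# ...but the leftover is nowhere near the *next* generation's bill.
leftover = gross_margin - train_cost_35          # 80
covers_next = leftover >= train_cost_40          # False

print(covered_previous, leftover, covers_next)
```

So under these assumed numbers, each generation pays for itself yet still leaves the company raising outside capital to train the next one, because training costs grow faster than margins do.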

aurareturn | 23 days ago

Source on that?

Because, based on OpenAI's reports and intuition, inference revenue is outpacing training cost.

quikoa | 23 days ago

That's for the API, right? The subscriptions are still sold at a loss. I don't know which of the two is larger.