top | item 47190927


Aditya_Garg | 3 days ago

This is a common misconception.

OpenAI and others are already profitable on inference (inference is really, really cheap).

They are just investing heavily in the latest frontier models.

The biggest risk is whether they can stay cutting edge, or if open source or others will catch up quickly.

lelanthran | 3 days ago

> OpenAI and others are already profitable on inference (inference is really really cheap)

If it's that cheap I'll soon be doing it self-hosted, or switching to a local provider.

It's a race to the bottom for token providers.

jychang | 3 days ago

It is that cheap. Look at Deepseek or GLM pricing.

vasco | 3 days ago

If you need to do the latter (frontier investment) to make money on the former (inference), then you're not really making money. If the frontier investment stopped being necessary, inference margins would drop too.

parineum | 3 days ago

At the end of the day, they're still burning cash. Even if inference is cheap, it's also not hard to compete on. They aren't going to be a trillion-dollar inference company.

Eventually there will be a race to the bottom on inference price to the customer by companies that aren't trying to subsidize their GPU investments.

OpenAI is spending money because they think they need to for their business to survive. They're hoping that the next big breakthrough just requires more compute and, somehow, that'll build them a moat.

zaphar | 3 days ago

OpenAI, and quite honestly the others, think they are in a race to AGI, not a race to the bottom. That's why they aren't concerning themselves with moats or costs. This is, quite simply, a massive bet that we've already cracked AGI and that the rest is just funding the engineering to make it happen.

I personally think we haven't cracked AGI yet, but it doesn't change their calculus.

rasz | 3 days ago

> inference is really really cheap

cough Sora cough