
sjaiisba | 6 days ago

All your competitors benefit from your training costs. They’ll lose on inference pretty quickly if they stop training new models, no?


rustyhancock | 6 days ago

I don't think they will lose on inference because that assumes that compute becomes cheap for all evenly.

Their spending today has secured their compute for the near future.

If every GPU, stick of RAM, and SSD is already paid for, who can afford to sell cheaper inference?

Z.ai is trying to deal with this by using domestic silicon (basically Huawei, not Nvidia). And with their state subsidies they will do well.

Anthropic has a 50bn USD plan to build data centres for 2026.

OpenAI similarly has secured extraordinary amounts of other people's money for data centres.

All of this will be sunk costs and "other people's money" while money is easy to get hold of, but it will be a moat when R&D spending ends.

Once all the models become basically the same, who you go with will be who you're already with (mostly OpenAI), or who you end up with (say, people who use Gemini because they have a Google 2TB account).

Some upstart can run itself into the ground borrowing compute and selling at a loss, but the moment it catches up and needs to raise prices, everyone will simply leave.

ChatGPT is the most likely to remain a sustained frontier model. Maybe Claude jumps ahead a few times, and Gemini will have its moment. But it'll all be a wash, with ChatGPT puttering along as rarely the best, but never the worst.

ethbr1 | 6 days ago

> Once all the models become basically the same who you go with will be who you're already with (mostly OpenAI)

Imho, people are undervaluing the last mile connection to the customer.

The last Western megacorp to bootstrap its way there was Facebook, and control over cloud identity and data was much less centralized in the late '00s.

The real clock OpenAI is running against is creating a durable consumer last-mile connection (killer app, device, etc).

"Easy to use chat app / coding tool" doesn't even begin to approach the durability of Microsoft, Apple, Google, or Meta. And without it, OpenAI risks any one of them pulling an Apple Maps at any time.

Unless, that is, it continually plows money into R&D to maintain the lead and doesn't pull an Intel by missing a beat.

Maybe they do, but that's a lot of coin flips that need to continually come up heads, in perpetuity.