Wheatman | 1 year ago
Unless OpenAI is willing to do something desperate, the best they have right now is what LLMs are, and so the main cost would be in maintaining them. If you already paid for a bunch of H100s to train, there is little incentive to move away unless you know TPUs are going to be significantly cheaper to run, cheap enough to justify the new cost of buying them.
This is ignoring the giant bubble that has ballooned out of AI hype, which, if popped, would be disastrous for the companies most invested in the industry. Nvidia has a P/E ratio of 60-70; if they don't deliver enough future growth to justify it, they could lose a third of their share price, if not more.
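The "lose a third" figure follows from simple multiple-contraction arithmetic: if earnings stay flat, the share price scales with the P/E ratio. A minimal sketch (the 60 and 40 multiples here are illustrative assumptions, not forecasts):

```python
def price_after_pe_contraction(price: float, pe_now: float, pe_later: float) -> float:
    """With flat earnings, price scales in proportion to the P/E multiple."""
    eps = price / pe_now        # implied earnings per share at today's price
    return eps * pe_later       # same earnings, repriced at the lower multiple

# A stock at 100 with P/E 60 repriced to P/E 40 (illustrative numbers):
new_price = price_after_pe_contraction(100.0, 60.0, 40.0)
# 100 * 40/60 ≈ 66.7, i.e. roughly a one-third decline
```

The point is just that the drop comes entirely from the multiple shrinking; any earnings miss on top of that compounds it.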
jasfi | 1 year ago
There's also a lot of utility to be found in the best LLMs today. I'm working on something myself, and I've seen others pushing the boundaries in hackathons and startups. So there's a lot of innovation and value that's definitely not a bubble.