agentcoops | 1 month ago

I hear your argument, but short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon. Of course I could easily be wrong, but regardless I think the most predictable cause for a drop in the NVIDIA price would be the CHIPS Act and recent decisions by the CCP leading a Chinese firm to bring to market a CUDA-compatible, reliable GPU at a fraction of the cost. It should be remembered that NVIDIA's /current/ valuation already reflects their being locked out of their second-largest market (China), with no investor expectation of that changing. Given the current geopolitical landscape, in the hypothetical case where a Chinese firm markets such a chip, we should expect US firms to be prohibited from purchasing it, while it's less clear that European or Saudi firms would face the same restriction. Even so, if NVIDIA were not to lower their prices at all, US firms would be at a tremendous compute-cost disadvantage, while their overseas competitors would no longer have one.

All hypothetical, of course, but to me that's the most convincing bear case I've heard for NVIDIA.

reppap|1 month ago

People will want more GPUs, but will they be able to fund them? At what point do the venture capital and the loans run out? People will not keep pouring hundreds of billions into this if the returns don't start coming.

gadflyinyoureye|1 month ago

Money will be interesting over the next few years.

There is a real chance that the Japanese carry trade will unwind soon, with the BoJ seeing rates move up toward 4%. That would drain liquidity from US markets back into Japan. On the US side there is going to be a lot of inflation between money printing, refund checks, amortization changes, and a possible war footing. Who knows?

tracker1|1 month ago

It doesn't even necessarily need to be CUDA-compatible... there's OpenCL and Vulkan as well, and China will likely throw enough resources at the problem to bring the various libraries into closer alignment and to ease use/development.

I do think China is still 3-5 years from being really competitive, but even if they only hit 40-50% of NVidia's performance, then depending on pricing and energy costs they could still make significant inroads, even in the face of legal pressure, bans, etc.
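
For a sense of what the vendor-neutral route looks like, here is a minimal vector-add sketch using OpenCL through the PyOpenCL bindings. Purely illustrative: it assumes an OpenCL driver is installed, and real ML workloads need far more than this.

    # Minimal OpenCL vector add via PyOpenCL -- no CUDA required.
    # Illustrative sketch only; assumes an OpenCL-capable device is present.
    import numpy as np
    import pyopencl as cl

    a = np.random.rand(1024).astype(np.float32)
    b = np.random.rand(1024).astype(np.float32)

    ctx = cl.create_some_context()      # picks any available OpenCL device
    queue = cl.CommandQueue(ctx)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # The kernel is plain OpenCL C, portable across NVIDIA, AMD, Intel, etc.
    program = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    program.add(queue, a.shape, None, a_buf, b_buf, out_buf)

    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)
    assert np.allclose(result, a + b)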

bigyabai|1 month ago

> there's OpenCL and Vulkan as well

OpenCL is chronically undermaintained & undersupported, and Vulkan only covers a small subset of what CUDA does so far. Neither has the full support of the tech industry (though both are supported by Nvidia, ironically).

It feels like nobody in the industry wants to beat Nvidia badly enough, yet. Apple and AMD are trying to supplement raster hardware with inference silicon; both of them are afraid to implement a holistic compute architecture à la CUDA. Intel is reinventing the wheel with OneAPI, Microsoft is doing the same with ONNX, Google ships generic software and withholds their bespoke hardware, and Meta is asleep at the wheel. All of them hate each other, none of them trust Khronos anymore, and the value of a CUDA replacement has ballooned to the point that greed might be their only motivator.

I've wanted a proper, industry-spanning CUDA competitor since high school. I'm beginning to realize it probably won't happen within my lifetime.

laughing_man|1 month ago

I suspect major algorithmic breakthroughs would accelerate the demand for GPUs instead of making it fall off, since the cost to apply LLMs would go down.

nroets|1 month ago

Some changes to the algorithms and implementations will allow cheaper commodity hardware to be used.
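
One concrete example of such a change is post-training weight quantization, which is already how a lot of LLM inference gets pushed onto cheaper hardware. A toy numpy sketch (the tensor size is hypothetical, and real quantizers are more careful, e.g. per-channel scales):

    # Toy symmetric 8-bit weight quantization: store int8 values plus one
    # float scale, cutting memory 4x vs float32. Illustrative numbers only.
    import numpy as np

    weights = np.random.randn(4096, 4096).astype(np.float32)

    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

    dequant = q.astype(np.float32) * scale
    print(f"memory: {weights.nbytes / 2**20:.0f} MiB -> {q.nbytes / 2**20:.0f} MiB")
    print(f"max abs error: {np.abs(weights - dequant).max():.4f}")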

iLoveOncall|1 month ago

> short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon

Or, you know, when LLMs don't pay off.

unsupp0rted|1 month ago

Even if LLMs didn't advance at all from this point onward, there's still loads of productive work that could be optimized / fully automated by them, at no worse output quality than the low-skilled humans we're currently throwing at that work.

stingraycharles|1 month ago

Exactly: the current spend on LLMs is based on extremely high expectations, with the vendors operating at a loss. It’s very reasonable to assume that those expectations will not be met and that spending will slow down as well.

Nvidia’s valuation is based on the current trend continuing and even increasing, which I consider unlikely in the long term.

MichaelRo|1 month ago

> short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon

>> Or, you know, when LLMs don't pay off.

Heh, exactly the observation that a fanatic religious believer cannot possibly foresee. "We need more churches! More priests! Until a breakthrough in praying technique is achieved, I don't foresee less demand for religious devotion!" Nobody foresaw Nietzsche and the decline in blind faith.

But then again, just as with an atheist back in the day, the furious zealots would burn me at the stake if they could for saying this. Sadly that's no longer possible, so let the downvotes pour instead!

selfhoster11|1 month ago

They already are paying off. And the nature of LLMs means they will keep requiring expensive, fast hardware, which is a large capex.

kelseyfrog|1 month ago

Algorithmic breakthroughs (increases in efficiency) risk triggering Jevons paradox: more efficient processes make deploying them even more cost-effective, which increases demand.
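
A back-of-the-envelope sketch of that mechanism, assuming a simple constant-elasticity demand curve (all numbers illustrative): whether total GPU demand rises or falls after an efficiency gain hinges on how price-elastic the demand for compute is.

    # Jevons-paradox arithmetic under constant-elasticity demand.
    # Illustrative assumption: quantity demanded ~ price^(-elasticity).

    def gpu_hours_after_gain(baseline: float, gain: float, elasticity: float) -> float:
        """gain = 2.0 means each unit of work needs half the GPU time,
        so the effective price per unit of work halves."""
        price_ratio = 1.0 / gain                   # new price / old price
        work_ratio = price_ratio ** (-elasticity)  # units of work demanded
        return baseline * work_ratio / gain        # GPU-hours actually used

    base = 100.0  # arbitrary baseline GPU-hours
    for e in (0.5, 1.0, 1.5):
        after = gpu_hours_after_gain(base, 2.0, e)
        print(f"elasticity={e}: {base:.0f} -> {after:.0f} GPU-hours")
    # elasticity 0.5: demand falls; 1.0: unchanged; 1.5: rises (Jevons)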