
_fat_santa | 1 month ago

This article goes more into the technical analysis of the stock than into the underlying business fundamentals that would lead to a stock dump.

My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere, but eventually that will come to an end as the supply of compute goes up.

The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used, so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years, and we're already seeing this as Google and others are extending their depreciation of GPUs to something like 5-7 years.
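
To put rough numbers on the depreciation point (a toy sketch; the dollar figures are made up for illustration):

    # Straight-line depreciation of the same hypothetical GPU purchase
    # over a 3-year vs. a 6-year schedule. Stretching the schedule
    # halves the annual expense hitting the income statement.
    capex = 10_000_000_000  # hypothetical $10B GPU buildout

    for years in (3, 6):
        annual = capex / years
        print(f"{years}-year schedule: ${annual / 1e9:.2f}B/year expense")

Same cash out the door either way; the longer schedule just flatters reported earnings in the near term, which is exactly why extending it is attractive.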


agentcoops|1 month ago

I hear your argument, but short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon. Of course I could easily be wrong, but regardless I think the most predictable cause for a drop in the NVIDIA price would be that the CHIPS act/recent decisions by the CCP lead a Chinese firm to bring to market a CUDA-compatible and reliable GPU at a fraction of the cost. It should be remembered that NVIDIA's /current/ value is based on their being locked out of their second-largest market (China) with no investor expectation of that changing in the future. Given the current geopolitical landscape, in the hypothetical case where a Chinese firm markets such a chip we should expect that US firms would be prohibited from purchasing them, while it's less clear that Europeans or Saudis would be. Even so, if NVIDIA were not to lower their prices at all, US firms would be at a tremendous cost disadvantage, while their competitors would no longer have one with respect to compute.

All hypothetical, of course, but to me that's the most convincing bear case I've heard for NVIDIA.

reppap|1 month ago

People will want more GPUs, but will they be able to fund them? At what point do the venture capital and loans run out? People will not keep pouring hundreds of billions into this if the returns don't start coming.

tracker1|1 month ago

Doesn't even necessarily need to be CUDA compatible... there's OpenCL and Vulkan as well, and China will likely throw enough resources at the problem to bring various libraries into closer alignment and ease use/development.

I do think China is still 3-5 years from being really competitive, but even if they hit 40-50% of Nvidia's performance, depending on pricing and energy costs, they could still make significant inroads, helped along by legal pressure/bans, etc.

laughing_man|1 month ago

I suspect major algorithmic breakthroughs would accelerate the demand for GPUs instead of making it fall off, since the cost to apply LLMs would go down.

iLoveOncall|1 month ago

> short of major algorithmic breakthroughs I am not convinced the global demand for GPUs will drop any time soon

Or, you know, when LLMs don't pay off.

kelseyfrog|1 month ago

Algorithmic breakthroughs (increases in efficiency) risk the Jevons paradox: more efficient processes make deploying them even more cost-effective, which increases demand.

lairv|1 month ago

NVIDIA stock tanked in 2025 when people learned that Google used TPUs to train Gemini, which everyone in the community has known since at least 2021. So I think it's very likely that NVIDIA stock could crash for non-rational reasons.

edit: 2025* not 2024

readthenotes1|1 month ago

It also tanked to ~$90 when Trump announced tariffs on all goods from Taiwan except semiconductors.

I don't know if that's non-rational, or if people can't be expected to read the second sentence of an announcement before panicking.

Der_Einzige|1 month ago

Google did not use TPUs for literally every bit of compute that led to Gemini. GCP has millions of high-end Nvidia GPUs, and programming for them is an order of magnitude easier, even for Googlers.

Any claim from Google that all of Gemini (including previous experiments) was trained entirely on TPUs is a lie. What they are truthfully saying is that the final training run was done entirely on TPUs. The market shouldn't react heavily to this, but instead should react positively to the fact that Google is now finally selling TPUs externally and their fab yields are better than expected.

mnky9800n|1 month ago

I really don't understand the argument that nvidia GPUs only work for 1-3 years. I am currently using A100s and H100s every day. Those aren't exactly new anymore.

mbrumlow|1 month ago

It’s not that they don’t work. It’s how businesses handle hardware.

I worked at a few data centers on and off in my career. I got lots of hardware for free or on the cheap simply because the hardware was considered “EOL” after about 3 years, often when support contracts with the vendor ends.

There are a few things to consider.

Hardware that ages produces more errors, and those errors cost, one way or another.

Rack space is limited. A perfectly fine machine that consumes 2x the power for half the output still costs you. It's cheaper to upgrade a perfectly fine working system simply because the replacement performs better per watt in the same space.

Lastly, there are tax implications in buying new hardware that can often favor replacement.

linkregister|1 month ago

The common factoid raised in financial reports is that GPUs used in model training degrade from sustained high utilization (the usual claim is thermal wear, e.g. the thermal interface material breaking down). The GPUs ostensibly fail. I have heard anecdotal reports of GPUs used for cryptocurrency mining having similar wear patterns.

I have not seen hard data, so this could be an oft-repeated but false fact.

denimnerd42|1 month ago

1-3 years is too short, but they aren't making new A100s anymore. There are 8 in a server, and when one goes bad, what do you do? You won't be able to renew a support contract. If you want to DIY, you eventually have to start consolidating pick-and-pulls. Maybe the vendors will buy them back from people who want to upgrade and resell them. This is the issue we are seeing with A100s, and we are trying to see what our vendor will offer for support.

iancmceachern|1 month ago

They're no longer energy competitive, i.e. the power they draw per unit of compute exceeds what's available now.

It's like if your taxi company bought taxis that were more fuel efficient every year.
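
To make that concrete, here's a rough perf-per-watt comparison using approximate published dense-BF16 specs (ballpark figures; exact numbers vary by SKU, clocks, and workload):

    # Approximate published specs; treat these as illustrative, not exact.
    gpus = {
        "A100 SXM (400 W)": {"tflops_bf16": 312, "watts": 400},
        "H100 SXM (700 W)": {"tflops_bf16": 989, "watts": 700},
    }

    for name, g in gpus.items():
        print(f"{name}: {g['tflops_bf16'] / g['watts']:.2f} TFLOPS/W")

In a power-capped datacenter, that roughly 1.8x perf/W is the whole story: the same electrical budget buys nearly twice the compute, so the older cards get displaced even though they still work.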

mbesto|1 month ago

Not saying you're wrong. A few things to consider:

(1) We simply don't know what the useful life is going to be, because AI-focused GPUs used for training and inference are such a new advancement.

(2) Warranties and service. Most enterprise hardware has service contracts tied to purchases. I haven't seen anything publicly disclosed about what these contracts look like, but the speculation is that they are much more aggressive (3 years or less) than typical enterprise hardware contracts (Dell, HP, etc.). Once hardware ages out of those contracts, extended support can typically get really pricey.

(3) Power efficiency. If new GPUs are more power efficient, the energy savings could be huge enough to justify upgrades.

swalsh|1 month ago

If power is the bottleneck, it may make business sense to rotate to a GPU that better utilizes the same power if the newer generation gives you a significant advantage.

linuxftw|1 month ago

I think the story is less about the GPUs themselves, and more about the interconnects for building massive GPU clusters. Nvidia just announced a massive switch for linking GPUs inside a rack. So the next couple of generations of GPU clusters will be capable of things that were previously impossible or impractical.

This doesn't mean much for inference, but for training, it is going to be huge.

legitster|1 month ago

From an accounting standpoint, it probably makes sense to have the depreciation be 3 years. But yeah, my understanding is that either they have long service lives, or the customers sell them back to the distributor so they can buy the latest and greatest. (The distributor would then sell them as refurbished.)

savorypiano|1 month ago

You aren't trying to support ad-based demand like OpenAI is.

nospice|1 month ago

> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down.

Their stock trajectory started with one boom (cryptocurrencies) and then seamlessly progressed to another (AI). You're basically looking at a decade of "number goes up". So yeah, it will probably come down eventually (or inflation will catch up), but that's a poor argument for betting against them right now.

Meanwhile, the investors who were "wrong" in anticipating a cryptocurrency revolution and who bought NVDA don't have much to complain about today.

mysteria|1 month ago

Personally I wonder whether, even if the LLM hype dies down, we'll get a new boom in AI for robotics and the "digital twin" technology Nvidia has been hyping up to train them. That's going to need GPUs for both the ML component and the 3D visualization. Robots haven't yet had their SD 1.1 or GPT-3 moment; in LLM terms, we're still in the early days of Pythia, GPT-J, AI Dungeon, etc.

munk-a|1 month ago

That's the rub - it's clearly overvalued and will readjust... the question is when. If you can figure out precisely when, then you've won the lottery; for everyone else it's a game of chicken where, for "a while", money you put into it will have a good return. Everyone would love it if that lasted forever, so there is strong momentum preventing that market correction.

ericmcer|1 month ago

Crypto & AI can both be linked to a broader trend, though: we need processors capable of running compute on massive sets of data quickly. I don't think that demand will ever go down, whether some new tech emerges or we just continue shoveling LLMs into everything. Imagine the compute needed to allow every person on earth to run a couple million tokens through a model like Anthropic's Opus every day.

JakeSc|1 month ago

Agree on looking at the company-behind-the-numbers. Though presumably you're aware of the Efficient Market Hypothesis. Shouldn't "slowed down datacenter growth" be baked into the stock price already?

If I'm understanding your prediction correctly, you're asserting that the market thinks datacenter spending will continue at this pace indefinitely, and you yourself uniquely believe that to be untrue. Right? I wonder why the market (including hedge fund analysts _much_ more sophisticated than us) should be so misinformed.

Presumably the market knows that the whole earth can't be covered in datacenters, and thus has baked that into the price, no?

testdelacc1|1 month ago

I saw a $100 bill on the ground. I nearly picked it up before I stopped myself. I realised that if it was a genuine currency note, the Efficient Market would have picked it up already.

matthewdgreen|1 month ago

The EMH does not mean that markets are free of over-investment and asset bubbles, followed by crashes.

TacticalCoder|1 month ago

> This article goes more into the technical analysis of the stock rather than the underlying business fundamentals that would lead to a stock dump. My 30k ft view is that the stock will inevitably slide as AI

Actually "technical analysis" (TA) has a very specific meaning in trading: TA is using past prices, volume of trading and price movements to, hopefully, give probabilities about future price moves.

https://en.wikipedia.org/wiki/Technical_analysis

But TFA doesn't do that at all: it goes in detail into one pricing model formula/method for options pricing. In a typical options pricing model, all you're using is the current price (of the underlying, say NVDA), the strike price (of the option), the expiration date, the current interest rate and the IV (implied volatility: influenced by recent price movements, but independent of any technical analysis).

Be it Black-Scholes-Merton (European-style options), Bjerksund-Stensland (American-style options), binomial as in TFA, or any other open options pricing model: none of these use technical analysis.

Here's an example (for european-style options) where one can see the parameters:

https://www.mystockoptions.com/black-scholes.cfm

You can literally compute entire options chains with these parameters.
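
For instance, here's a minimal sketch of the Cox-Ross-Rubinstein binomial model TFA discusses; note the inputs (made up here for illustration) are exactly the parameters above, with no chart patterns anywhere:

    import math

    def binomial_call(S, K, T, r, sigma, steps=200):
        """European call priced on a Cox-Ross-Rubinstein binomial tree."""
        dt = T / steps
        u = math.exp(sigma * math.sqrt(dt))    # up factor per step
        d = 1 / u                              # down factor per step
        p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
        disc = math.exp(-r * dt)
        # payoffs at expiry; node j = j up-moves out of `steps`
        v = [max(S * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]
        # discounted expectation, stepping back through the tree
        for _ in range(steps):
            v = [disc * (p * v[j + 1] + (1 - p) * v[j]) for j in range(len(v) - 1)]
        return v[0]

    # hypothetical inputs: spot 180, strike 180, 1 year, 4% rate, 40% IV
    print(binomial_call(S=180, K=180, T=1.0, r=0.04, sigma=0.40))

Run that across a range of strikes and you have an options chain.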

Now, it's known for a fact that many professional trading firms have their own options pricing methods and will arb when they think they've found incorrectly priced options. I don't know whether some of them use actual forms of TA that they then mix with an options pricing model or not.

> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down.

Whether you're right or not, I'd argue you're doing what's called fundamental analysis (but I may be wrong).

P.S.: I'm not debating the merits of TA and whether it's reading tea leaves or not. What I'm saying is that options pricing using the binomial method cannot be called "technical analysis", because TA is something else.

AnotherGoodName|1 month ago

I'll also point out there were insane takes a few years ago, before Nvidia's run-up, based on similar technical analysis and very limited-scope fundamental analysis.

Technical analysis fails completely when there's an underlying shift that moves the line. You can't look at the past and say "Nvidia is clearly overvalued at $10 because it was $3 for years earlier" when they suddenly and repeatedly 10x earnings over many quarters.

I couldn't get through to the idiots on reddit.com/r/stocks about this when there was non-stop negativity on Nvidia based on technical analysis and very narrowly scoped fundamental analysis. Nvidia showed a 12x gain in quarterly earnings at the time, but the PE (which looks at past quarters only) was 260x due to this sudden change in earnings, and pretty much all of reddit couldn't get past this.
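
The mechanics are easy to see with made-up numbers (purely illustrative, not Nvidia's actual figures):

    # Trailing PE sums the last four quarters, so right after a step change
    # in earnings it is dominated by the old, smaller quarters.
    old_q, new_q = 0.05, 0.60           # hypothetical EPS: a 12x jump
    price = 40.0                        # hypothetical share price

    trailing_eps = 3 * old_q + new_q    # three stale quarters + one new one
    forward_eps = 4 * new_q             # run-rate if the new level holds

    print(f"trailing PE: {price / trailing_eps:.0f}x")  # ~53x
    print(f"forward PE:  {price / forward_eps:.0f}x")   # ~17x

Same company, same price; the trailing multiple just hasn't caught up to the new earnings level yet.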

I did well on this yet there were endless posts of "Nvidia is the easiest short ever" when it was ~$40 pre-split.

KeplerBoy|1 month ago

Also there's no way Nvidia's market share isn't shrinking. Especially in inference.

gpapilion|1 month ago

The large API/token providers and large consumers are all investing in their own hardware. So NVIDIA is in an interesting position where the market is growing and they are taking the lion's share of enterprise, but they are shrinking on the hyperscaler side (Google is a good example as they shift more and more compute to TPUs). So they have a shrinking market share, but it's not super visible.

dogma1138|1 month ago

Market share can shrink but if the TAM is growing you can still grow.

blackoil|1 month ago

But will the whole pie grow or shrink?

baxtr|1 month ago

I'm no AI fanboy at all. I think there won't be AGI anytime soon.

However, it’s beyond my comprehension how anyone would think that we will see a decline in demand growth for compute.

AI will conquer the world like software or the smartphone did. It'll get implemented everywhere, and more people will use it. We're still super early in the penetration.

Ekaros|1 month ago

At this point computation is in essence a commodity. And commodities have demand cycles. If other economic factors slow down or companies go out of business, they stop using compute or start fewer new products that use compute. Thus it is entirely realistic to me that demand for compute might go down. Or that we are simply over-provisioning compute for the short or medium term right now.

marricks|1 month ago

> I no AI fanboy at all.

While thinking computers will replace human brains soon is rabid fanaticism, this statement...

> AI will conquer the world like software or the smartphone did.

Also displays a healthy amount of fanaticism.

Ronsenshi|1 month ago

What if its penetration ends up being on the same level as modern crypto? The average person doesn't seem to particularly care about meme coins or bitcoin - it is not being actively used in day-to-day settings, and there are no signs of that status improving.

Doesn't mean that crypto is not being used, of course. Plenty of people do use things like USDT, gamble on bitcoin or try to scam people with new meme coins, but this is far from what crypto enthusiasts and NFT moguls promised us in their feverish posts back in the mid-2010s.

So imagine that AI is here to stay, but the absolutely unhinged hype train will slow down and we will settle in some kind of equilibrium of practical use.

amelius|1 month ago

> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down.

This is like saying Apple stock will inevitably slide once everybody owns a smartphone.

ramijames|1 month ago

This seems to take for granted that China and their foundries and engineering teams will never catch up. That seems foolish. I'm working under the assumption that sometime in the next ten years some Chinese company will have a breakthrough and either meet Nvidia's level or leapfrog them. Then the market will flood with great, cheap chips.

m12k|1 month ago

I think the way to think about the AI bubble is that we're somewhere in 97-99 right now, heading toward the dotcom crash. The dotcom crash didn't kill the web, it kept growing in the decades that followed, influencing society more and more. But the era where tons of investments were uncritically thrown at anything to do with the web ended with a bang.

When the AI bubble bursts, it won't stop the development of AI as a technology. Or its impact on society. But it will end the era of uncritically throwing investments at anyone that works "AI" into their pitch deck. And so too will it end the era of Nvidia selling pickaxes to the miners and reaching soaring heights of profitability borne on the wings of pretty much all the investment capital in the world at the moment.

enos_feedler|1 month ago

Bubble or not it’s simply strange to me that people confidently put a timeline on it. To name the phases of the bubble and calling when they will collapse just seems counter intuitive to what a bubble is. Brad Gerstner was the first “influencer” I heard making these claims of a bubble time line. It just seems downright absurd.

cortesoft|1 month ago

> The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years

Isn’t this entirely dependent on the economic value of the AI workloads? It all depends on whether AI work is more valuable than that cost. I can easily see arguments why it won’t be that valuable, but if it is, then that cost will be sustainable.

alfalfasprout|1 month ago

100% this. All of this spending is predicated on a stratospheric ROI on AI investments at the proposed investment levels. If that doesn't pan out, we'll see a lot of people left holding the bag, including chip fabs, designers like Nvidia, and of course anyone that ponied up for that much compute.

richardw|1 month ago

I’m sad about Grok going to them, because the market needs the competition. But ASIC inference seems to require a simpler design than training does, so it’s easier for multiple companies to enter. It seems inevitable that competition emerges. And eg a Chinese company will not be sold to Nvidia.

What’s wrong with this logic? Any insiders willing to weigh in?

bigyabai|1 month ago

I'm not an insider, but ASICs come with their own suite of issues and might be obsolete if a different architecture becomes popular. They'll have a much shorter lifespan than Nvidia hardware in all likelihood, and will probably struggle to find fab capacity that puts them on equal footing in performance. For example, look at the GPU shortage that hit crypto despite hundreds of ASIC designs existing.

The industry badly needs to cooperate on an actual competitor to CUDA, and unfortunately they're more hostile to each other today than they were 10 years ago.

pjmlp|1 month ago

Even though I like CUDA, I think the question is when compute centers reach the point where they can run their workloads on other vendors' hardware, or on custom accelerators.

jwoods19|1 month ago

“In a gold rush, sell shovels”… Well, at some point in the gold rush everyone already has their shovels and pickaxes.

krupan|1 month ago

Or people start to realize that the expected gold isn't really there and so stop buying shovels

gopher_space|1 month ago

The version I heard growing up was "In a gold rush, sell eggs."

WalterBright|1 month ago

> technical analysis of the stock

AKA pictures in clouds

throwaway85825|1 month ago

It's not flat growth that's currently priced in, but continuing high growth, which is impossible.

kqr|1 month ago

Fundamental analysis is great! But I have trouble answering concrete questions of probability with it.

How do you use fundamental analysis to assign a probability to Nvidia closing under $100 this year, and what probability do you assign to that outcome?

I'd love to hear your reasoning around specifics to get better at it.
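
For calibration, the options market itself implies such a probability: under the lognormal model behind Black-Scholes, the risk-neutral probability of finishing below a level K is N(-d2). A sketch with made-up inputs (and the usual caveat that risk-neutral probabilities are not real-world probabilities):

    import math

    def norm_cdf(x):
        # standard normal CDF via the error function
        return 0.5 * (1 + math.erf(x / math.sqrt(2)))

    def prob_below(S, K, T, r, sigma):
        """Risk-neutral P(S_T < K) under the Black-Scholes lognormal model."""
        d = (math.log(K / S) - (r - 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
        return norm_cdf(d)  # this equals N(-d2)

    # hypothetical inputs: spot 180, level 100, 1 year, 4% rate, 40% IV
    print(prob_below(S=180, K=100, T=1.0, r=0.04, sigma=0.40))  # ~0.09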

esafak|1 month ago

Don't you need a model for how people will react to the fundamentals? People set the price.

djeastm|1 month ago

I think the idea of fundamental analysis is that you focus on return on equity and see if the resulting valuation is appreciably more than the current price (as opposed to assigning a probability).

cheschire|1 month ago

Well, not to be too egregiously reductive… but when the M2 money supply spiked in the 2020 to 2022 timespan, a lot of new money entered the middle class. That money was then funneled back into the hands of the rich through “inflation”. That left the rich with a lot of spare capital to invest in finding the next boom. Then AI came along.

Once the money dries up, a new bubble will be invented to capture the middle class income, like NFTs and crypto before that, and commissionless stocks, etc etc

It’s not all pump-and-dump. Again, this is a pretty reductive take on market forces. I’m just saying I don’t think it’s quite as unsustainable as you might think.

stego-tech|1 month ago

Add in the fact companies seriously invested in AI (and like workloads typically reliant on GPUs) are also investing more into bespoke accelerators, and the math for nVidia looks particularly grim. Google’s TPUs set them apart from the competition, as does Apple’s NPU; it’s reasonable to assume firms like Anthropic or OpenAI are also investigating or investing into similar hardware accelerators. After all, it’s easier to lock-in customers if your models cannot run on “standard” kit like GPUs and servers, even if it’s also incredibly wasteful.

The math looks bad regardless of which way the industry goes, too. A successful AI industry has a vested interest in bespoke hardware to build better models, faster. A stalled AI industry would want custom hardware to bring down costs and reduce external reliance on competitors. A failed AI industry needs no GPUs at all, and an inference-focused industry definitely wants custom hardware, not general-purpose GPUs.

So nVidia is capitalizing on a bubble, which you could argue is the right move under such market conditions. The problem is that they're also alienating their core customer base (smaller datacenters, HPC, gaming market) in the present, which will impact future growth.

Their GPUs are scarce and overpriced relative to performance, which itself has remained a near-direct function of increased power input rather than efficiency or meaningful improvements. Their software solutions - DLSS frame-generation, ray reconstruction, etc - are locked to their cards, but competitors can and have made equivalent-performing solutions of their own with varying degrees of success. This means it's no longer necessary to have an nVidia GPU to, say, crunch scientific workloads or render UHD game experiences, which in turn means we can utilize cheaper hardware for similar results. Rubbing salt in the wound, they're making cards even more expensive by unbundling memory and clamping down on AIB designs.

Their competition - Intel and AMD primarily - are happily enjoying the scarcity of nVidia cards and reaping the fiscal rewards, however meager they are compared to AI at present. AMD in particular is sitting pretty, powering four of the five present-gen consoles, the Steam Deck (and copycats), and the Steam Machine, not to mention outfits like Framework; if you need a smol but capable boxen on the (relative) cheap, what used to be nVidia + ARM is now just AMD (and soon, Intel, if they can stick the landing with their new iGPUs).

The business fundamentals paint a picture of cannibalizing one's evergreen customers in favor of repeated fads (crypto and AI), and years of doing so have left those customer markets devastated and bitter at nVidia's antics. Short of a new series of GPUs with immense performance gains at lower price and power points, with availability to meet demand, my personal read is that this is merely Jensen Huang's explosive send-off before handing the bag over to some new sap (and shareholders) once the party inevitably ends, one way or another.

bArray|1 month ago

> My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere but eventually that will come to an end as the supply of compute goes up.

Exactly, it is currently priced as though infinite GPUs are required indefinitely. Eventually most of the data centres and the gamers will have their GPUs, and demand will certainly decrease.

Before that, though, the data centres will likely fail to be built in full. Investors will eventually figure out that LLMs are still not profitable, no matter how many data centres you build. People are only interested in the derived products at a lower price than it costs to run them. The math ain't mathin'.

The longer it takes to get them all built, the more exposed they all are. Even if it turns out to be profitable, taking three years to build a data centre rather than one is significant, as the profit on these high-tech components falls off over time. And how many AI data centres do we really need?

I would go further and say that these long and complex supply chains are quite brittle. In 2019, a 13-minute power cut caused a loss of 10 weeks of memory stock [1]. Normally, the shops and warehouses act as a capacitor and can absorb small supply chain ripples. But now that these components are being piped straight to data centres, they are far more sensitive to blips. What about a small issue in the silicon that means you damage large amounts of your stock trying to run it at full power, through something like electromigration [2]? Or a random war...?

> The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years and we're already seeing this as Google and others are extending their depreciation of GPU's to something like 5-7 years.

Yep. Nothing about this adds up. Existing data centres with proper infrastructure are being forced to extend the use of previously uneconomical hardware because new data centres currently building infrastructure have run the price up so high. If Google really thought this new hardware was going to be so profitable, they would have bought it all up.

[1] https://blocksandfiles.com/2019/06/28/power-cut-flash-chip-p...

[2] https://www.pcworld.com/article/2415697/intels-crashing-13th...

jpadkins|1 month ago

How much did you short the stock?