AMD is probably undervalued, but Nvidia is clearly way overvalued.
I don't think the reckoning will come from AMD stealing Nvidia's market share, it'll come when the hype bubble collapses and businesses start treating neural networks like commodities, running whatever is cheapest instead of the absolute most powerful. AMD is in a great position, because they make both great GPU/NPU hardware and great CPU hardware.
> AMD is probably undervalued, but Nvidia is clearly way overvalued.
AMD trades at a price to earnings ratio of 99, NVDA trades at a PE of ~38. PE isn't everything when looking at companies, but I don't see other reasons to think AMD is undervalued
NVDA is valued more appropriately than AMD. NVDA’s valuation based on forward earnings is high, but there are many more extreme examples including AMD.
Nvidia is imho kind of a resurgence of large scale 90s Unix systems (ala SGI) [1], and not just by pricing and looks but also if you look at the vertical integration. I think this kind of business setup is more vulnerable to competition from below than people like to think, and it really only takes 1-2 product misses for a big shakeup.
[1] Co-developed proprietary software stacks running on highly proprietary and non-standard hardware targeting very specific workloads.
The way I see it is that Nvidia might reach $1 trillion in revenue before AMD reaches $100 billion in revenue. So the upside in revenue growth is higher for Nvidia.
People think that because a company has grown very large very quickly that it can't grow as much anymore. But on the other hand, there is clear evidence that Nvidia continues to dominate AMD's offerings despite the latter having a competitive product now. So the metric for Nvidia isn't Nvidia vs. AMD but the growth aspect of AI market overall.
NVIDIA is one of the most undervalued companies in the SPX looking at fundamentals, even disregarding what they are planning next (subscription services for CUDA, and humanoids, etc).
Take a look at their last quarter's income statement graph: https://i.imgur.com/mQwZ5o4.png - Once in a few years I see a Sankey graph looking like that. And it's only growing over the last 10 years.
Is it possible someone will write a CUDA to AMD/Tensor/whatever transpiler (high-level emulator? I'm not sure of the right term)? I thought there were a remarkably small number of ops that GPUs perform. Seems like a very high premium to pay for not wanting to rewrite in JAX or whatever.
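For simple programs that translation really is close to mechanical: AMD's own hipify tooling is essentially a large table of textual renames from the CUDA runtime API to HIP. A toy sketch of the idea in Python (the mapping table here is a tiny illustrative subset, not the real tool, which handles thousands of symbols plus kernel launch syntax):

```python
# Tiny illustrative subset of the CUDA -> HIP renames that AMD's
# hipify tools apply; the real table has thousands of entries.
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
    "cudaMemcpyHostToDevice": "hipMemcpyHostToDevice",
    "cuda_runtime.h": "hip/hip_runtime.h",
}

def hipify(source: str) -> str:
    """Naive source-to-source translation of CUDA runtime calls to HIP."""
    # Longest names first so cudaMemcpyHostToDevice wins over cudaMemcpy.
    for cuda_name in sorted(CUDA_TO_HIP, key=len, reverse=True):
        source = source.replace(cuda_name, CUDA_TO_HIP[cuda_name])
    return source

cuda_src = "cudaMalloc(&p, n); cudaMemcpy(p, h, n, cudaMemcpyHostToDevice);"
print(hipify(cuda_src))  # hipMalloc(&p, n); hipMemcpy(p, h, n, hipMemcpyHostToDevice);
```

The hard part, as the thread notes, isn't the renaming but matching CUDA's performance and stability afterwards.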
No it is not actually. They are making insane amounts of money and have very strong forward guidance. With the drop in the last month it is actually cheap (low peg ratio). When the market turns, nvidia is likely to soar once again.
Tell me you haven't looked at Nvidia's financials (especially the margins) without telling me. It basically prints money, now and in the foreseeable future, and all of its products are permanently sold out, even at the insane prices Nvidia is charging.
I have trouble distinguishing this post from those on r/wallstreetbets. To be honest, I have seen quite a few posts on wsb that are much more informative than this
The author is George Hotz, who most famously developed some iOS and PS3 exploits and got sued by Sony. I have little interest in the content of this particular article and I think your evaluation of the article is fair.
Yes, but that's just his personality. His mind seems to be racing at 200mph whilst the output device (hands, keyboard etc) can't keep up, so some context gets dropped here and there. I remember I had a hard time watching his streams because he'd type at 160wpm or so, but half of the keypresses were correcting mistakes...
You should see his livestreams. Let's just say, he computers the same way his writing comes across (all while swearing up, down and sideways that he's not on anything /s ;) )
"I’m betting on AMD being undervalued, and that the demand for AI has barely started."
I'd love to see AMD get a multiplatform product so mature that I can pip install PyTorch on Windows or MacOS with an AMD card (https://pytorch.org/get-started/locally/). But I don't think that their market cap will change quickly even if this happens. Many people have bought AMD cards in the past because they were cheaper, and then died waiting for AMD to have a mature CUDA equivalent. Nobody is going to rush to buy AMD cards as soon as the software is good; they will gradually change when they replace NVDA hardware, and not everyone is going to make that choice.
If I were making a bet (and I'm not), I'd bet that NVDA is overvalued right now and their growth will slow to correct this but it won't crash, and I'd bet that AMD will gradually increase in value to reward them for software investments, but it won't spike as soon as their software looks good. Neither of these things would I want to put a lot of money on, since they are long term bets, and if you're going long then you might as well just invest in the broader market. And even if I thought that NVDA was going to crash and AMD was going to spike, I still wouldn't bet because I have no idea whether it would happen in the next 6 months or 6 years.
I saw a video the other day that showed a new AMD laptop processor that is comparable to the Apple processors in performance and battery life. That was very surprising and also a great thing for Windows or Linux laptops. But at the same time, the market for these and the potential for profit isn't really that big. Consumers are willing to pay a premium for Apple but not anyone else.
I would really love it if people on Hacker News could weigh in on how much of a moat they think CUDA really is. As in: How hard is it to use something else? If you started a project today how much would you want to get paid to not use CUDA?
A lot of readers on this site have a good insight into this and it is a key question financial people are asking without the knowledge many people here possess.
"As fast as AMD tries to fill in the CUDA moat, NVIDIA engineers are working overtime to deepen said moat with new features, libraries, and performance updates."
AMD's competitor to CUDA is ROCm. Historically, AMD has been hobbled by the quality of their drivers and because they sold less performant hardware. AMD has traditionally been the budget option for both CPUs and GPUs. Things have changed in the CPU space because of Ryzen, but sadly AMD has not been able to realize an equivalent competitive advantage in the GPU space. Intel has also entered the GPU market, but they are even farther behind than AMD. The same problems I am about to describe apply to them as well, to a higher degree.
Rewriting CUDA programs to run using ROCm is expensive and time consuming. It is difficult to justify this expense when in all likelihood the ROCm version will be less efficient, less performant, and less stable than the original. In the grand scheme of things, AMD hardware is indeed cheaper but it's not that much cheaper. From a business standpoint, it's just not worth it.
Knowing what I know about how management thinks, even if AMD managed to make an objectively superior product at a much better price, institutional momentum alone would keep people on CUDA for a long time.
One aspect that influences this is how close to the bleeding edge one needs to be, and how niche the model/application is.
ROCm lags by some years. And application/model/framework developers test less on it, which can be problematic in niches.
For doing something very established like say image classification, that does not really matter - 3 year old CNNs will generally do the trick.
But if one wants to drop in some model X that was just put on GitHub/HuggingFace in the last year, one would be buying a lot of trouble.
This matters whenever a new AI model gets released to the public. Of the last few I've tried, they were always NVIDIA-only, I assume because that's what the researchers had at their disposal.
RDNA 4 is proving that AMD can be competitive in GPUs. Is it on par with Blackwell? No. Is it a much better improvement over the previous gen than Blackwell? Yes, at least if you consider consumer pricing/marketing.
I like the guy but he is wrong. Besides the CUDA ecosystem, Nvidia hardware has a lot of features and a lot of things yet to optimize. Like the optimizations by DeepSeek (non-CUTLASS custom kernels and DualPipe). I think Nvidia's current hardware has plenty of legs and that's why they are not rushed on releasing next gen chips.
The actual challenger is Cerebras. No need to load (VRAM->SRAM) all the parameters for every batch. But they have yet to prove they can scale and support the custom stack. We'll see.
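The load-everything-per-step cost is easy to put a number on. During autoregressive decoding, every generated token has to stream all of the weights from HBM, so memory bandwidth sets a hard floor on latency. A back-of-envelope sketch (the model size and bandwidth figures are illustrative assumptions, not vendor measurements):

```python
def min_time_per_token_ms(n_params: float, bytes_per_param: float,
                          bandwidth_tb_s: float) -> float:
    """Lower bound on decode latency if every weight is read once per token."""
    weight_bytes = n_params * bytes_per_param
    return weight_bytes / (bandwidth_tb_s * 1e12) * 1e3

# A 70B-parameter model in fp16 against ~3.35 TB/s of HBM (roughly H100-class):
print(round(min_time_per_token_ms(70e9, 2, 3.35), 1))  # ~41.8 ms per token
```

Keeping weights resident in on-chip SRAM, as Cerebras does, is one way around this floor; large batch sizes amortize it instead.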
I heard that Nvidia's graphics cards are best in class in terms of power consumption vs TFLOP ratio. I wonder what the numbers are for AMD vs Nvidia? I would like to see them, because power consumption is going to be a big portion of AI training costs. In comparison, the hardware might not be that expensive in the long run.
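Perf-per-watt is easy to estimate from spec sheets, with the big caveat that published peak TFLOPS and board power say little about real training efficiency, which depends heavily on utilization and the software stack. A rough sketch using dense BF16 peak figures (treat these numbers as ballpark assumptions pulled from vendor datasheets):

```python
def tflops_per_watt(peak_tflops: float, board_power_w: float) -> float:
    """Naive perf-per-watt from vendor peak specs; ignores utilization."""
    return peak_tflops / board_power_w

# Ballpark vendor peak-spec figures, dense BF16:
h100 = tflops_per_watt(989, 700)     # NVIDIA H100 SXM, ~700 W TDP
mi300x = tflops_per_watt(1307, 750)  # AMD MI300X, ~750 W TDP
print(round(h100, 2), round(mi300x, 2))
```

On paper the parts are close; in practice, sustained utilization on real workloads is where the software gap shows up.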
htrp|1 year ago
So basically he got 2 MI300s and is currently trying to pump AMD?
littlestymaar|1 year ago
Not two cards, two boxes (with 8 cards each I'd assume).
wavemode|1 year ago
AMD is not undervalued, rather it is Nvidia that is overvalued.
ChocolateGod|1 year ago
There's movement to implement CUDA libraries that work on non-Nvidia cards, but I guess adoption could be hindered by legal fears.
https://github.com/vosen/ZLUDA
fabiensanglard|1 year ago
It seems AMD is just sending more hardware. As far as I know the drivers are still lacking.