Nvidia invested billions over the course of CUDA and its predecessors. The whole idea of using GPUs not just for graphics but for high-performance, highly parallel computing started before the 2000s. CUDA was announced in 2006, and much of the groundwork traces back to Cg in the early 2000s. Even Intel, already very late to the party, made the decision to invest and start Larrabee in 2005. And there was the PS3's Cell, which started development in 2001.
And yet all this work and success from Nvidia was, if you read 90% of HN comments from the past two years, because of one thing: luck.
They could have given up at any point in the past 20 years and simply stopped doing anything CUDA- or GPGPU-related. Who would blame them, when the vast majority of those investments weren't bringing in much revenue? Intel, for one, decided to cancel Larrabee. Nvidia persevered and hit the jackpot some 10-15 years later. But all of this, apparently, was because of luck.
Yes, luck plays a big part. They could have continued another 10 years and never found the killer app. But to ignore all that investment and work over such a long time and pin it all on luck is about as rude and disrespectful as it gets, especially on a forum started by a VC in the spirit of entrepreneurship.
No, Nvidia just found themselves in a lucky situation.
They were already building GPUs, mainly for gaming. Crypto then came along and swept up a bunch of GPUs. When the mining craze waned, the AI craze started, as ethical reservations broke down in an uncertain economic environment and corporations began racing to see who would come out on top.
As Seneca is said to have quipped, "Fortuna est quae fit cum praeparatio in occasionem incidit," or "Luck is what happens when preparation meets opportunity."
Nvidia has been doing the hard work in preparing to succeed in this market. CUDA has been meticulously developed and maintained, creating an adhesion to their hardware that would not otherwise exist in the AI market.
It has also been willing and able to create lines of business hardware aimed at maximizing utility for its customers.
They also have hired and maintained a roster of the best engineers in their specialties, including the software part of the equation.
There is no part of their success that they weren't prepared to take advantage of when the opportunity presented itself. They didn't control the size of the opportunity itself, but no greatly successful company does.
IIRC, they were on CUDA 5 or something when ImageNet came out and changed the world.
They might not have imagined LLMs when they decided to invest in making their GPUs programmable, but I guarantee you they extrapolated the future compute potential of vector-programmable machines and decided that enabling it was not a huge risk: it was simply a bet that some important application would come along to tap into it.
I think that's oversimplifying things. The Acquired podcast has covered Nvidia's different growth periods in depth. I highly recommend that anyone interested give them a listen:
https://www.acquired.fm/episodes/nvidia-the-gpu-company-1993...
https://www.acquired.fm/episodes/nvidia-the-machine-learning...
https://www.acquired.fm/episodes/nvidia-the-dawn-of-the-ai-e...
>I'm a great believer in luck. The harder I work, the more of it I seem to have.
Nvidia has been working hard(er than their competition) on the software side for almost two decades to be in the position they find themselves in today. 16 years ago, they released CUDA for general-purpose computing on GPUs, and then 9 years ago they followed that up with cuDNN. They have a consistent pattern of making intentional, long-term bets to diversify their market exposure and unlock new product areas while building a software ecosystem moat.
Yes, they obviously got super lucky with the cryptocurrency frenzy, but there's a reason the miners were mostly buying Nvidia cards instead of AMD cards.
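To see what that CUDA-plus-cuDNN layering looks like in code, here is a minimal, hypothetical sketch (not an official Nvidia sample; error handling omitted): plain CUDA handles device memory, while cuDNN supplies the neural-network primitive, here a ReLU activation.

```cuda
// Hypothetical minimal sketch: cuDNN layered on top of plain CUDA.
// Applies ReLU to 8 floats on the GPU. Build: nvcc relu.cu -lcudnn
#include <cudnn.h>
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    float host[8] = {-2.f, -1.f, -0.5f, 0.f, 0.5f, 1.f, 2.f, 3.f};
    float *dev = nullptr;

    // Plain CUDA: device allocation and host-to-device copy.
    cudaMalloc(&dev, sizeof(host));
    cudaMemcpy(dev, host, sizeof(host), cudaMemcpyHostToDevice);

    // cuDNN: describe the data (a 1x1x1x8 float tensor) and the op (ReLU).
    cudnnHandle_t handle;
    cudnnCreate(&handle);
    cudnnTensorDescriptor_t desc;
    cudnnCreateTensorDescriptor(&desc);
    cudnnSetTensor4dDescriptor(desc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT, 1, 1, 1, 8);
    cudnnActivationDescriptor_t act;
    cudnnCreateActivationDescriptor(&act);
    cudnnSetActivationDescriptor(act, CUDNN_ACTIVATION_RELU, CUDNN_NOT_PROPAGATE_NAN, 0.0);

    // y = ReLU(x), computed in place on the device.
    const float alpha = 1.0f, beta = 0.0f;
    cudnnActivationForward(handle, act, &alpha, desc, dev, &beta, desc, dev);

    // Plain CUDA: copy back and print. Expect: 0 0 0 0 0.5 1 2 3
    cudaMemcpy(host, dev, sizeof(host), cudaMemcpyDeviceToHost);
    for (float v : host) printf("%g ", v);
    printf("\n");

    cudnnDestroyActivationDescriptor(act);
    cudnnDestroyTensorDescriptor(desc);
    cudnnDestroy(handle);
    cudaFree(dev);
    return 0;
}
```

The division of labor is the point: CUDA provides the memory and execution model, and cuDNN layers tuned NN primitives on top of it, which is exactly the ecosystem moat described above.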
No, Nvidia decided to change their GPU architectures to be more suitable for neural networks, so they did bet that this shift would pay off. They spoke to many leading AI experts and came to this conclusion. They should be commended for the risk they took. If Nvidia merely ended up being "lucky", then how come AMD didn't take off?
> No, Nvidia just found themselves in a lucky situation.
Well, yes and no. They were certainly lucky to be in the right place at the right time. But they were also consistently investing in CUDA and the AI/ML ecosystem while their competitor(s) ignored it, to such an extent that Nvidia became the only real option (and deservedly so).
This is why they can effectively behave like a monopoly these days and charge almost inconceivably high margins.
Then this luck should have equally found AMD, who even today are struggling to pick up the ball they've been dropping for a decade now. My last PC had a Radeon, and I waited the entire lifetime of that PC assuming AMD support was just around the corner, all the while renting Nvidia cards in the cloud for any serious projects.
I've been in the ML space long enough to remember when people were just speculating about doing ML computation on GPUs. Nvidia made that much easier and has continued to improve support and features for the past decade-plus. Their insane success is certainly part luck, but I wouldn't be so quick to dismiss all of it as mere happenstance.
This timeline seems completely wrong to me. Nvidia's cuDNN has been the only game in town for NN research since I have been in the field, and it predates Ethereum and the crypto bull run by a few years. If anything, they didn't waver and jump too hard onto the crypto bandwagon when the craze was at its height.
No, Nvidia was not lucky. They are a strong engineering culture with deep marketing expertise in 3D, in all its facets, going back decades.
Nvidia wisely recognized that only so much horsepower could be used by a conventional 3d graphics pipeline with a given screen resolution, and they needed to invest in growing future compute-heavy adjacent markets.
They invested in generalizing their GPU into a more flexible vector coprocessor for HPC and then adjacent markets. They convinced fundamental engineers and researchers in this area to come work for them.
There was deep fundamental work done by Ian Buck in 2004 on leveraging GPUs as general vector processors (https://graphics.stanford.edu/papers/brookgpu/), and that leadership and deep thinking went to Nvidia, not to Intel. Intel did not have the passion from the top to care about this. They didn't even care enough to field competitive 3D chips (and associated software), much less extend their thinking to generalize beyond them skillfully. Nvidia did.
Anyone who spent every day thinking about how to grow the vector coprocessor market would have pursued crypto and AI when they came along, but Nvidia's strong engineering and the profits from a leading 3D position gave them competitive advantages which they are, for now, reaping.
Not really. They did push CUDA and GPGPU on their own hardware while AMD and Intel offered a barely functional OpenCL. Of course they had no idea about how big AI and crypto would become, but they were there offering their cards to whoever wanted to run calculations on them.
Don't forget "Artificial Intelligence and Japan's Fifth Generation [Project]" [1], launched in 1982. Bad timing ;-). Wikipedia includes a specific page [2]. Finally, we cannot forget Transputers [3].
[1] https://www.jstor.org/stable/26861060
[2] https://en.wikipedia.org/wiki/Fifth_Generation_Computer_Syst...
[3] https://en.wikipedia.org/wiki/Transputer
No, Nvidia has spent considerable resources supporting ML for, what, a decade now? They certainly got lucky with the crypto craze, and of course the timing of AI taking off on the heels of crypto's failings is extremely lucky, but you are wrong about the rest of the argument.
If it was pure luck, why did they build CUDA over a decade ago?
I think you're just looking at this from the angle of a gamer, not someone who's been paying attention to GPGPU compute for longer than the past 6 months.
I'm watching Nvidia researchers doing a TON of AI work across a wide spectrum; calling it a "lucky situation" seriously understates what Nvidia is doing in research and development.
You are right. Even more interesting is the timing itself:
The crypto craze waned just as LLMs were picking up. Had LLMs been delayed any longer, Nvidia might have been overextended, with no demand to absorb their inventory. There might, in fact, be no Nvidia in that future, depending on the size of the bet.
Or take it one step further: if COVID hadn't happened, the crypto craze would never have occurred, and Nvidia would have been only a bit player in the LLM craze we are now in.
I still remember Nvidia ads in PC game magazines. That and 3dfx. Who knew?
Chalking it up to dumb luck is kind of ridiculous. They invested massively in developing an ecosystem for GPU-driven computing, which was a well-reasoned gamble, and it paid off.
You're missing the software part, which is essential to this story. Nvidia was the only GPU maker betting the farm on GPGPU/HPC positioning all the way back in ~2007. What has been happening over the last couple of years is just the payoff for the massive R&D and software maintenance costs they've been fronting. They bet big and won big.
I clearly remember AI becoming a big tech subject in the late 2000s/early 2010s, following the release of CUDA on consumer hardware, around the same time as Bitcoin and well before GPU mining was a thing.
The article mentions this choice being made in 2018. I don't know the ML industry, but as a gamedev I had been mystified about Nvidia's strategy since about 2018.
It felt like they weren't leaning into crypto, which surprised me. Instead, it looked like they were trying to maintain gamer goodwill by not increasing consumer card costs during the boom. Of course scarcity raised secondary-market prices, but Nvidia kept MSRP lower than the boom dictated.
It seemed like they were betting against crypto during the craze. And sticking to their strategy on the consumer side. So maybe that's how they stuck to a ML strategy too.
So they had all this CUDA stuff, which they must have invested in heavily because AMD showed what happens when you don't. That led to a software ecosystem for ML.
Maybe it was all luck, but a strategic choice explains some of this in hindsight.
There's always a divide in these threads between people who assign pure genius as the sole reason for success and those who live in reality and accept that you also need a lot of luck and cash along the way.
Still seems like they were in the right place at the right time. When they first developed CUDA, it was a hammer in search of a nail. Then they had the huge windfall from crypto mining. That probably led to a lot of discussions about looking for GPGPU opportunities. Then AI came along, and CUDA was just sitting there.
Then they put two and two together and started investing heavily as they saw momentum build.
>Then they put two and two together and started investing heavily as they saw momentum build.
There's been a decade or more of deep learning models breaking records in almost every single research field, powered (indirectly) through CUDA, cuDNN and other NVIDIA software.
AI didn't "come along" when OpenAI released ChatGPT. DNNs that have been 99% NVIDIA-focused have been beating the state-of-the-art for years and years.
Also, for the record, the Ada architecture (a very dominant AI accelerator) was released when the stock price was around $100 (compared to $500 now).
A lot of things were happening before you started paying attention. GPGPU definitely wasn't just sitting there uselessly before crypto, and AI didn't just "come along" after crypto.
CUDA has been heavily utilized for AI for many, many years now. The whole reason Nvidia is so entrenched is that they were the only ones taking GPGPU seriously; OpenCL (1) rose and was abandoned before we even get to your interpretation of the timeline.
(1): Easy to forget now, but AMD and Apple had a common-standard competitor to CUDA and completely fumbled it.
As the article implies, they were lucky and good. I was surprised how quickly they were able to implement Tensor cores and head off alternate architectures.
When they first developed CUDA, they did not yet have that huge windfall from crypto mining; it was first released in June 2007. I would say it was more the result of seeing efforts to ham-fist general-purpose computation onto graphics-specific APIs like OpenGL.
They'll benefit from the boom on the graphics side as well.
Once gen AI gets good enough to generate high quality video games, virtual worlds, etc. in real time, that will redefine gaming and entertainment.
Why wait for the next Mission Impossible movie to come out when you can experience it...as Tom Cruise...with your own storyline with all of your friends. And get a new one every day.
I have to admit I regret not buying their stock. It should've been obvious this would be a boon for them, but I got distracted by the crypto hype and didn't even think about the impact AI would have on them.
True story: decades ago in high school, we did the whole stock-market competition. As a big gamer, I somehow convinced my team to dump everything into Nvidia.
We were 2nd in the state of Illinois, and I wanted to cement our place and possibly win by selling everything to lock in gains in the final week. The person executing the trade on our team accidentally shorted the stock instead, and it went up a considerable amount in the last week, knocking us down quite a bit.
At that point I seriously considered dumping my savings into Nvidia. I'd be retired right now if I had done so.
I wonder what will happen if someone figures out that fused floating-point multiply-add is no longer needed (e.g., just count spikes and add/subtract permanence). This could be a big problem for the guys with all their eggs in one basket (like Nvidia).
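To make the commenter's hypothetical concrete, here is a toy, purely illustrative host-side sketch (the update rule and all names are invented for illustration, not a real spiking-hardware design) contrasting the fused multiply-add arithmetic GPUs are optimized for with a count-spikes-and-nudge-permanence style update:

```cuda
// Toy illustration only (hypothetical rule, names mine): contrast the FMA-style
// arithmetic accelerators are built for with a spike-count/permanence update.
#include <cmath>
#include <cstdio>

// Dense, FMA-style: one fused multiply-add per weight (a*b + c, one rounding).
float fma_neuron(const float *w, const float *x, int n) {
    float acc = 0.0f;
    for (int i = 0; i < n; ++i) acc = fmaf(w[i], x[i], acc);
    return acc;
}

// Spiking-style: inputs are 0/1 events, so there are no multiplies at all;
// the "weight" is an integer permanence nudged up or down.
int spike_neuron(int *permanence, const unsigned char *spikes, int n) {
    int count = 0;
    for (int i = 0; i < n; ++i) {
        if (spikes[i]) {
            ++count;
            ++permanence[i];       // strengthen active synapse
        } else if (permanence[i] > 0) {
            --permanence[i];       // decay inactive synapse
        }
    }
    return count;  // the neuron "fires" if count crosses some threshold
}

int main() {
    float w[4] = {0.5f, -0.25f, 1.0f, 0.0f}, x[4] = {1, 1, 0, 1};
    int perm[4] = {3, 1, 2, 0};
    unsigned char s[4] = {1, 1, 0, 1};
    printf("fma: %g, spikes: %d\n", fma_neuron(w, x, 4), spike_neuron(perm, s, 4));
    return 0;
}
```

If the second style ever won out, hardware optimized for dense floating-point multiply-accumulate would indeed be the wrong tool, which is the risk the comment is pointing at.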
Actually, a GPU is not the best kind of hardware for AI (or rather, for neural network simulation). It's very well developed, and more useful than a CPU, but one could design and build more NN-sim-oriented hardware. Nvidia has been forced to do this somewhat, cannibalizing some of their more general-purpose compute capabilities in favor of matrix-multiply-add functionality (i.e., "tensor cores"). That's not exactly the "GPU way" of doing things. And one could go even further, perhaps all the way to some analog computations instead of digital low-precision ones.
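For what that looks like at the programming level, here is a sketch (illustrative only; requires an sm_70+ GPU) of the WMMA API, which exposes tensor cores as warp-wide 16x16x16 matrix multiply-accumulate operations rather than the classic one-thread-one-element GPU model:

```cuda
// Sketch (not production code): one warp computing a 16x16x16 matrix
// multiply-accumulate on tensor cores via the WMMA API (sm_70 or newer).
#include <mma.h>
#include <cuda_fp16.h>
#include <cstdio>
using namespace nvcuda;

__global__ void mma16(const half *a, const half *b, float *c) {
    // Fragments are warp-wide: all 32 threads cooperate on one 16x16 tile.
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> af;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> bf;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> cf;
    wmma::fill_fragment(cf, 0.0f);
    wmma::load_matrix_sync(af, a, 16);          // leading dimension 16
    wmma::load_matrix_sync(bf, b, 16);
    wmma::mma_sync(cf, af, bf, cf);             // the tensor-core operation
    wmma::store_matrix_sync(c, cf, 16, wmma::mem_row_major);
}

int main() {
    half *a, *b; float *c;
    cudaMallocManaged(&a, 256 * sizeof(half));
    cudaMallocManaged(&b, 256 * sizeof(half));
    cudaMallocManaged(&c, 256 * sizeof(float));
    for (int i = 0; i < 256; ++i) { a[i] = __float2half(1.0f); b[i] = __float2half(1.0f); }

    mma16<<<1, 32>>>(a, b, c);                  // exactly one warp
    cudaDeviceSynchronize();
    printf("c[0] = %g (expect 16)\n", c[0]);    // row of ones dot column of ones

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Note how unlike ordinary shader-style code this is: a whole warp issues one matrix operation, which is exactly the "not the GPU way" shift the comment describes.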
Once they changed the licensing on consumer gear in data centers, it started to become clear they were executing on a strategy. I wouldn't be surprised if phasing out SLI was part of that strategy as well.
I think AI workloads match what GPUs do almost too well. There's some luck in it, for sure, but credit them for building CUDA nice and early and for making bigger bets on AI than anyone else.
To say that the AI craze started after the crypto wave died down is disingenuous or misinformed. The future need for GPUs was apparent as soon as machine learning became relevant a decade ago. The effectiveness of transformers, RNNs, Q-learning, etc., for language and other applications was not news when GPT-3 launched. Nvidia invested heavily for 10 years, including funding research in ML and AI, and steered the direction of the technology we have today.
I could definitely see them having the foresight to predict that the future would pay good money for bulk mathematical operations of a certain type. I have trouble imagining them predicting AI specifically, when crypto and gaming were paying the bills so well. If I went back to 2018, I wonder if he'd say they had bet the company on Bitcoin back in 2012.
The catch is that they're betting multiple farms: one farm for crypto, another for AI, and I'm sure they'll have another for whatever hyped technology comes next.
https://www.youtube.com/watch?v=WLq9zv3k5n0
The race was on, but nobody else was running.
https://www.youtube.com/watch?v=Yhg3IEpl60M
> When the mining craze waned
It didn't wane; it was decimated when ETH switched to PoS and moved off GPU mining entirely.
All of the other GPU-mined coins dropped in value as miners moved to them and dumped their rewards, making mining those coins unprofitable as well.
It was ETH that was propping up the entire GPU mining ecosystem.
Copy-pasting a comment from a discussion a little while ago[1]: CUDA was first released in 2007:
* https://en.wikipedia.org/wiki/CUDA
* https://developer.download.nvidia.com/compute/cuda/1.0/NVIDI...
Two years before the Bitcoin paper (2009):
* https://en.wikipedia.org/wiki/Bitcoin
They had a presentation called "The Era of the Personal Supercomputing" at SIGGRAPH 2007:
* https://dl.acm.org/doi/10.1145/1281500.1281647
* https://www.nvidia.com/content/events/siggraph_2007/supercom...
Ian Buck (co-?)creator of CUDA speaking in 2008:
> Ian Buck talks about his background developing Brook for GPUs at Stanford university and what paths were taken for developing a C platform for GPUs.
* https://www.youtube.com/watch?v=Cmh1EHXjJsk
> In 2003, a team of researchers led by Ian Buck unveiled Brook, the first widely adopted programming model to extend C with data-parallel constructs. Ian Buck later joined NVIDIA and led the launch of CUDA in 2006, the world's first solution for general-computing on GPUs.
* https://developer.nvidia.com/cuda-zone
* http://graphics.stanford.edu/~ianbuck/
Nvidia purposefully went after parallel computing. Specific applications (cryptocurrency, ML/AI) appeared later.
[1] https://news.ycombinator.com/item?id=38446957#unv_38447944
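To make concrete what "extend C with data-parallel constructs" means in practice, here is a minimal sketch (mine, not from the comment's sources) of the canonical CUDA SAXPY kernel: the body is written once, per element, and the kernel launch replaces the loop over the array, which is essentially the Brook idea productized.

```cuda
// Minimal sketch: the data-parallel style Brook pioneered, as it looks in CUDA.
// One thread per element; the loop over the array is implicit in the launch.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];  // the whole "program" is per-element
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Unified memory keeps the example short; real code often uses cudaMalloc.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %g (expect 5)\n", y[0]);
    cudaFree(x); cudaFree(y);
    return 0;
}
```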
It is true that they got lucky several times in a big way. But CUDA was an expensive R&D effort for many years without any clear payout.
Woah, only one more year and this reference will be old enough to vote.
Yes, and it takes a lot of effort to be in the right place.
Nvidia CEO: We bet the farm on AI and no one knew it https://news.ycombinator.com/item?id=37055375 (August 8, 2023 — 10 points, 4 comments)
Is this acknowledging the game market is dry?
Ray tracing was the next frontier of visual quality, and compute had reached a level where something like it was possible.
DLSS was not a "huge gambit". It was an application of AI models (many of which already train and run on Nvidia chips) to the gaming space.
Making it sound like a "genius move" is great for promoting Nvidia as an innovator.