Are they buying them to try and slow down open source models and protect the massive amounts of money they make from OpenAI, Anthropic, Meta, etc.?
It's quite obvious that open source models are catching up to closed source models very fast; they're about 3-4 months behind right now. And yeah, they're trained on Nvidia chips, but as open source models become more usable and get closer to closed source models, they will eat into Nvidia's profit, because these companies aren't spending tens of billions of dollars on chips to train and run inference. These are smaller models trained on fewer GPUs, and they perform as well as the previous OpenAI and Anthropic models.

So obviously open source models are a direct threat to Nvidia, and the only thing open source models struggle at is scaling inference. This is where Groq and Cerebras come into the picture: they provide the fastest inference for open source models, which makes them even more usable than SOTA models.

Maybe I'm way off on this.
Shy of an algo breakthrough, open source isn't going to catch up with SOTA; their main trick for model improvement is distilling the SOTA models. That's why they have perpetually been "right behind".
>Are they buying them to try and slow down open source models
The opposite, I think.
Why do you think that local models are a direct threat to Nvidia?
Why would Nvidia let a few of their large customers have more leverage by not diversifying to consumers? Openai decided to eat into Nvidia's manufacturing supply by buying DRAM; that's concretely threatening behavior from one of Nvidia's larger customers.
If Groq sells technology that allows local models to be used better, why would that /not/ be a profit source for Nvidia to incorporate? Nvidia owes a lot of their success to the consumer market. This is a pattern in the history of computer tech development. Intel forgot this. AMD knows this. See where everyone is now.
Besides, there are going to be more Groqs in the future. Is it worth spending ~20B for each of them to continue to choke-hold the consumer market? Nvidia can afford to look further.
It'd be a lot harder to assume good faith if Openai ended up buying Groq. Maybe Nvidia knows this.
Yes, you are way off, because Groq doesn't make open source models. Groq makes innovative AI accelerator chips that are significantly faster than Nvidia's.
You still need hardware to run open source models. It might eat into OpenAI's profit, but I doubt it will eat into NVIDIA's.
If anything, the more companies there are in the model-making business, the higher NVIDIA chip demand will be, at least until we get some proper competition. We badly need an open CUDA equivalent so that moving to the competition isn't a problem.
Nvidia just released their Nemotron models, and in my testing, they are the best performing models on low-end consumer hardware in both terms of speed and accuracy.
I'd say that it's probably not a play against open source, but more trying to remove/change the bottlenecks in the current chip production cycle. Nvidia likely doesn't care who wins, they just want to sell their chips. They literally can't make enough to meet current demand. If they split off the inference business (and now own one of the only purchasable alternatives) they can spin up more production.
That said, it's completely anti-competitive. Nvidia could design an inference chip themselves, but instead they are locking down one of the only real independents. But... nobody was saying Groq was making any real money. This might just be a rescue mission.
They need to vertically integrate the entire stack or they die. All of the big players are already making plans for their own chips/hardware. They see everyone else competing for the exact same vendor’s chips and need to diversify.
With RAM/memory price this high, open source is not going to catch up with closed source.
The open source economy relies on the wisdom of crowds, but that implies equal access to experimentation platforms. The democratization of PCs and consumer hardware brought about the previous open source era that we all love. I am afraid the tech moguls have identified the chokehold of the LLM ecosystem and found ways to successfully monopolize it.
NVIDIA makes money whether the model is open weights or not. I don't think open is a concern for them, and I think they'd very much like to be servicing China and their batch of open models. What's more likely concerning them:
A. The inevitable breakdown of their massive head start with CUDA and data center hardware. A serious competitor at real scale.
B. Anything that'll cool off the massive data center buildouts that are fueling them.
Seems clear that locking up a major potential competitor especially the minds behind it solves for A. And their ongoing machinations with circular funding of companies funding data centers is all about B - keeping the momentum before it fizzles.
More like they’re trying to snuff out potential competitors. Why work as hard to push your own products if NVIDIA gave you money to retire for the rest of your life?
The constant threat of open source (and other competitors) is what keeps the big fish from getting complacent. It’s why they’re spending trillions on new data centers, and that benefits Nvidia. When there’s an arms-race on it’s good to be an arms dealer.
Idk, cheaper inference seems to be a huge industry secret, and providing the best inference tech that only works with Nvidia seems like a good plan. Making Nvidia the absolute king of compute against AWS/AMD/Intel seems like a no-brainer.
You're way off; this reads more like anti-capitalist political rhetoric than real reasoning.
Look at Nvidia's Nemotron series. They have become a leading open source training lab themselves, and they're releasing the best training data, training tooling, and models at this point.
It’ll go through. It’s not an acquisition, it’s an exclusive licensing deal. Same end result, but it lets them sidestep the usual regulatory approvals for acquisitions.
The price is 40x their target revenue. That's twice the price-to-revenue multiple applied to Anthropic in their most recent funding round, and really, really hard to portray as a good deal.
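For rough scale, the multiples above can be sanity-checked with a back-of-the-envelope calculation. Note the ~$20B deal price is an assumption taken from a figure floated elsewhere in this thread; the 40x and "twice Anthropic's multiple" numbers come from the comment itself.

```python
# Back-of-the-envelope check on the multiples discussed above.
deal_price = 20e9              # USD; assumed deal size, per another comment
price_to_revenue = 40          # price-to-revenue multiple claimed above

# Revenue target implied by a 40x multiple.
implied_target_revenue = deal_price / price_to_revenue
print(f"Implied target revenue: ${implied_target_revenue / 1e9:.2f}B")  # $0.50B

# If 40x is twice Anthropic's most recent multiple, Anthropic's was:
anthropic_multiple = price_to_revenue / 2
print(f"Implied Anthropic multiple: {anthropic_multiple:.0f}x")  # 20x
```

So under these assumptions Groq would need to be targeting roughly half a billion dollars in revenue for the 40x figure to hold.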
I don't think it really helps Nvidia's competitive position. The serious competition to Nvidia is coming from Google's TPU, Amazon's Trainium, AMD's Instinct, and to a much lesser extent Intel's ARC.
Groq's recent investors got back a 3x multiple and may now invest in one of Nvidia's other competitors instead.
The only thing I can think of here is that OpenAI’s DRAM land grab is going to stack on a non-NV target, and NV needs to hedge with an SRAM design that’s on the market NOW. Otherwise, I can’t see how NV couldn’t eat Groq’s lunch in one development cycle; it’s not like NV can’t attach a TPU to some SRAM and an interconnect. Either that, or Groq closed a deep enough book to scare them, but 40x is a lot of scared.
> Today, Groq announced that it has entered into a non-exclusive licensing agreement with Nvidia for Groq’s inference technology. The agreement reflects a shared focus on expanding access to high-performance, low cost inference.
> As part of this agreement, Jonathan Ross, Groq’s Founder, Sunny Madra, Groq’s President, and other members of the Groq team will join Nvidia to help advance and scale the licensed technology.
> Groq will continue to operate as an independent company with Simon Edwards stepping into the role of Chief Executive Officer.
> GroqCloud will continue to operate without interruption.
> As part of this agreement, Jonathan Ross, Groq’s Founder, Sunny Madra, Groq’s President, and other members of the Groq team will join Nvidia to help advance and scale the licensed technology.
A really strange agreement where top executives of a company "join" another company for the benefit of the other company.

If it quacks like a duck...
This seems a lot like something where the acquirer avoids paying for equity. With key leaders gone, what do the employees of Groq get? Their company isn't really being acquired, so they just stay illiquid?
Legit feels like Nvidia just buying out competition to maintain their position and power in the industry. I sincerely hope they fall flat on their face.
I just stopped my Groq API. Sad to see competition being eaten up by shitty Nvidia. I like their products but Jensen is an absolute mfer with deceitful marketing.
Not good. This shouldn't be allowed. What would be better is if Groq and Cerebras combined, and maybe other companies invested in them to help them scale. Why would the major cloud providers not lobby against this?
Usually antitrust is for consumers, but here I think companies like Microsoft and AWS would be the biggest beneficiaries of having more AI chip competition.
I do not understand this move by Nvidia; are they afraid of being outcompeted by this startup in their core competence of building chips for AI? They may be eliminating a competitor for now, but this move will immediately prompt many more AI chip startups to get founded.
Is there less regulatory oversight when purchasing assets instead of the company, or do Nvidia really believe the FTC/DOJ are that blind? (Or doesn’t it matter in the current climate?)
The near exclusive global provider of AI chips taking key employees from and “licensing” the technology of the only serious competitor while quite specifically describing it as “not acquiring Groq as a company” seems quite obviously anti-competitive, and quite clearly an attempt to frame it as not.
Someone said to me that Nvidia is not buying Groq; the deal is a non-exclusive licensing agreement for its inference technology, with some team members joining Nvidia.
This is smart as hell. I’ve long wondered how they’d combat ASICs without diluting their own benefits. This gives them a bit more time to figure out the moats, which is useful because Groq was going places. This juices Groq’s distribution, production, and ability to access a wider range of skills where necessary.
I expect China to want to compete with this. Simpler than full-blown Nvidia chips. Cue much cheaper and faster inference for all.
I think it’s pretty obvious at this point that Nvidia’s architecture has reached scaling limits - the power demands of their latest chips have Microsoft investing in nuclear fusion. Similar to Intel in both the pre-Core days and their more recent chips, they need an actual new architecture to move forward. As it sits, there’s no path to profitability for the buyers of these chips given the cost and capabilities of current LLM architectures, and this is obvious enough that even Nvidia has to realize it’s existential for them.
If Groq’s architecture can actually change the economics of inference and training enough to bring the costs in line with the actual, not speculative, benefits of LLMs, this may not be a buy-and-kill for Nvidia but something closer to Apple’s acquisition of P.A. Semi, which made the A- and M-class chips possible.
(Mind you, in Intel’s case they had to have their clocks cleaned by AMD a couple times to get them to see, but I think we’re further past the point of diminishing returns with Nvidia - I think they’re far enough past when the economics turned against them that Reality is their competition now.)
Will be interesting technically to see what develops from this. NVLink? Full CUDA feels doubtful, but who knows. Nvidia CUDA Tile feels more like a maybe; it's their new, much more explicit way of expressing workloads.
This does feel a bit sad for sure, and it's worth worrying whether this might hold Groq and innovation back. Reciprocally, it's perhaps kind of cool to see Groq get a massive funding boost and help from a very experienced chip-making peer. It feels like an enviable position, even with the long-term consequences being so hazy. From the outside, yes, it looks like Nvidia solidifying their iron grasp over a market with very limited competitive suppliers, but this could help Groq, and maybe it's not on the terms we think we want right now, but it could be very cool to see.
I really hope some of the rest of the market can see what's happening, broadly, with Nvidia forming partnerships all over the place. NVLink with Intel, NVLink with Amazon's Trainium... there's much more to the ecosystem, but just connecting the chips smartly is a huge task and core to interoperation. And for all we've heard of CXL, Ultra Accelerator Link (UALink), and Ultra Ethernet (UET), it feels like very few major players are taking it seriously enough to just integrate these new interconnects and make them awesome. They remain incredibly expensive, not commonly used, lacking broad industry adoption, and reserved for very expensive systems: there's a huge existential risk here that (lack of) interconnect will destroy competitors' ability to get their good chips well integrated and used. The rest of the market needs more clear alarm bells going off, and needs to be making sure good interconnect is available on way more chips - get it into everyone's hands ASAP, not just big customers, so that adopters and Linux-nerd types can start building stacks that open up the future. The market risks getting left behind if NVLink is built in everywhere and the various other fabrics never become commonplace.
Almost all open source models are trained and mostly run on NVIDIA hardware.
Open source is great for NVIDIA. They want more open source, not less.
Commoditize your complement is business 101.
> Maybe I'm way off on this.
If by open source you mean downloadable from Hugging Face, and by SOTA you mean Opus 4.5, yes, you are way off.
The more competition, the more shovels they sell.
It's like saying that Intel would've benefited if only Dell and a few others sold servers, because they brought in multiple billions per year.
It should be noted that Don Jr. is one of the investors who will benefit greatly if/when this goes through.
I wonder if equity-holding employees get anything from the deal, or indeed if any of the investors will be seeing a return from this?
NVIDIA isn't buying Groq.
It's a non-exclusive deal for inference tech. Or am I reading it incorrectly?
The world needs much stronger antitrust laws.
Can someone with better understanding dumb this down for me please?
and he gave me this link:
https://groq.com/newsroom/groq-and-nvidia-enter-non-exclusiv...
https://www.cnbc.com/2025/12/24/nvidia-buying-ai-chip-startu...
Still think they should spin that out though!