I'm half joking, but if this AI boom continues we're going to see Nvidia exit the consumer GPU business. But Jensen Huang would never do that to us... (I hope)
Why would anyone sell a handful of GPUs to nobodies like us when they could sell a million GPUs for thousands apiece to a handful of big companies? We're speedrunning the absolute worst corpo cyberpunk timeline.
Might almost be a good thing, if it means abandoning overhyped/underperforming high-end game rendering tech, and taking things in a different direction.
The push for 4K with raytracing hasn't been a good thing, as it's pushed hardware costs way up and led to the attempts to fake it with AI upscaling and 'fake frames'. And even before that, the increased reliance on temporal antialiasing was becoming problematic.
The last decade or so of hardware/tech advances haven't really improved the games.
DLSS Transformer models are pretty good. Framegen can be useful but has niche applications due to the latency increase and artifacts. Global illumination can be amazing but is also pretty niche, as it's very expensive and comes with artifacts.
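To make the latency point concrete, here's a back-of-envelope sketch (my own arithmetic, not any vendor's actual pipeline): interpolation-based frame generation must hold back the latest rendered frame until the next one exists, so input latency grows by roughly one rendered-frame time even though the displayed frame rate doubles.

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

def interpolation_latency_penalty_ms(render_fps: float) -> float:
    # Interpolating between frames N and N+1 means frame N can't be
    # shown until N+1 has rendered: roughly one extra render-frame
    # of delay (ignoring the generation cost itself).
    return frame_time_ms(render_fps)

render_fps = 60.0
print(f"native frame time:    {frame_time_ms(render_fps):.1f} ms")       # ~16.7 ms
print(f"displayed frame time: {frame_time_ms(2 * render_fps):.1f} ms")   # ~8.3 ms with 2x framegen
print(f"added input latency:  {interpolation_latency_penalty_ms(render_fps):.1f} ms")
```

So at a 60 fps render rate, the screen updates every ~8 ms but your inputs arrive roughly a full rendered frame later than native, which is why framegen suits slow, cinematic games better than twitchy ones.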
Biggest flop is UE5 and its Lumen/Nanite. Really, everything would be fine if not for that crap.
And yeah, our hardware is not capable of proper raytracing at the moment.
The push for ray tracing comes from the fact that they've reached the practical limits of scaling more conventional rendering. RT performance is where we are seeing the most gen-on-gen performance improvement, across GPU vendors.
Poor RT performance is more a developer skill issue than a problem with the tech. We've had games like Doom: The Dark Ages that flat out require RT, yet the RT lighting pass only accounts for ~13% of frame time while producing much better results than any raster GI solution could manage with the same budget.
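The rough arithmetic behind that ~13% figure (the percentage is the commenter's claim, not an official profile) is easy to check: at a 60 fps target the whole frame budget is ~16.7 ms, so an RT lighting pass taking 13% of it costs only ~2.2 ms.

```python
# Frame-budget sketch, assuming a 60 fps target and the ~13% RT share
# quoted above.
FRAME_BUDGET_MS = 1000.0 / 60.0   # ~16.7 ms total per frame at 60 fps
RT_SHARE = 0.13                   # fraction of frame time spent on RT lighting

rt_cost_ms = FRAME_BUDGET_MS * RT_SHARE
raster_ms = FRAME_BUDGET_MS - rt_cost_ms
print(f"RT lighting pass: {rt_cost_ms:.2f} ms of {FRAME_BUDGET_MS:.2f} ms")
print(f"left for everything else: {raster_ms:.2f} ms")
```

~2.2 ms is a modest slice of the budget, which is the point: a well-engineered RT pass doesn't have to dominate frame time.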
They are already making moves that might suggest that future. They are going to stop packaging VRAM with their GPUs shipped to third-party graphics card makers, who will have to source their own, probably at higher cost.
They will constrain supply before exiting. Exiting outright just isn't smart: they can stop developing consumer parts and let supply trickle down, which also works as insurance in case AI flops.
Honestly, I'd prefer it. It might get AMD and Intel off their ass on GPU development. I stopped buying Nvidia GPUs ages ago, back before they saw value in the Linux/Unix market, and I'm tired of them sucking up all the air in the room.
They did get burned when crypto mining switched to dedicated hardware and Nvidia was left with huge (for them) surpluses of 10xx-series hardware. But what they're selling to AI companies now is far more distinct from their consumer gear than those cards were.
tomasphan|2 months ago
1. Gaming cards are their R&D pipeline for data center cards. A lot of innovation has come from gaming cards.
2. It's a market defense: keep other players down and stop them from growing their way into data centers.
3. It's profitable (probably the main reason, but boring).
4. It's a hedge against data center volatility (10 key customers vs. millions).
5. It's an antitrust defense (which they used when they tried to buy ARM).
hyperbovine|2 months ago
No way that is true any more. Five years ago, maybe.
https://www.reddit.com/r/pcmasterrace/comments/1izlt9w/nvidi...
kelnos|2 months ago
And they're not selling a handful of GPUs to nobodies like us; they're selling millions of GPUs to millions of nobodies.
CTDOCodebases|2 months ago
Instead we will be streaming games from our locked down tablets and paying a monthly subscription for the pleasure.