>I might add that while Nvidia is often reviled for its GPU drivers being proprietary, they are still excellent drivers nonetheless.
Couldn't disagree more. For the last 4 to 5 months or so, Nvidia's drivers have been horrendous. I've experienced numerous crashes with my RTX 2080 Super card. About once every two days I get a kernel panic and the only solution is to reboot. There are multiple threads on this issue on Nvidia's official Linux forum, and Nvidia's driver developers have stated they don't know the cause and cannot reproduce it. [1]
HDMI also doesn't work well after the computer wakes up from sleep. Sometimes the monitor works, sometimes it doesn't. It's also a known issue.
I'll be looking at AMD and their graphics cards a lot more closely when I build my next computer later this year. I just can't deal with Nvidia's driver issues. It's extremely annoying and it has wasted a ton of my time.
I've been using Nvidia's HW with GNU/Linux for about a decade and have never had any major issues and have always been happy with the performance and reliability. However, this is not the case anymore.
I'm pretty sure all this mess started when kernel devs started blocking Nvidia's "GPL condom" effort [2].
[1] https://forums.developer.nvidia.com/c/gpu-unix-graphics/linu...
[2] https://www.phoronix.com/scan.php?page=news_item&px=Linux-Ke...
Once nVidia got what it wanted from Linux by dominating the compute space with CUDA, they just left the graphics side of their drivers to rot.
Moreover, they stay silent in the forums now. They don't listen to customer feedback, or at least give any signs of hope or assurance. Their company image has become dominant and arrogant (and I think it runs more than skin-deep).
Some of nVidia's huge market share comes from the momentum they have; Linux users generally don't refresh their hardware very often.
I'm running a GTX 680 on my primary Linux box, but it won't be nVidia next time. I'll be returning to the red team, and I'll be a happy camper.
When I was younger, I thought there was a choice between picking free software for ideological reasons and picking whatever software did the job for pragmatic reasons. But what I have mostly concluded over time is that this is a false choice; over a majority of use cases and time scales, open source is just the pragmatic choice anyway.
I have the opposite experience. AMD is effectively unusable for AI. As much as I'd like to switch, I need those NVIDIA drivers because CUDA is the only thing that works reliably. So it's AMD CPU + NVIDIA GPU + Ubuntu, sigh.
I never had crashes but, having switched from a 1070 (proprietary) to a 6800 (open) this week, the experience is far better. Things that were previously nigh-on impossible to get right, like fixing screen tearing, or freesync without bizarre performance penalties, just work now.
Even in windows, nvidia drivers have been bad. One particularly nasty example is the SteamVR frame drop bug they introduced last... May? Still isn't fixed, though they finally posted they found the cause and the next patch will supposedly have a fix.
Even on Windows, the driver batch of 2017 tanked my card's performance to the point that even old games I used to play on max settings were stuttering. I held on to the 36x driver series as long as games let me, but come late 2018 new releases started complaining about, and then refusing to start on, the old driver.
So I jumped to AMD and missed the RTX train, but my card is still running strong in 2021 and I'm not in a hurry to switch; I'll wait until they release a midrange ray-tracing-ready card.
For games, yes, but not for Blender's GPU-accelerated Cycles rendering (as opposed to CPU Cycles rendering), which relies on OpenCL, and that isn't shipping in the kernel.
On some of the later HWE kernels (5.8 for me), it's not so good just yet: getting OpenCL working on AMD's Ryzen/Vega integrated graphics (Ryzen 5 3550H) and the Radeon RX 560X discrete mobile GPU on the same laptop.
Blender currently cannot detect any OpenCL-compatible GPU for Cycles rendering on my laptop, so it defaults to Cycles rendering on the CPU cores instead (way slower, and it ties the CPU cores/threads up at 100% until the render is finished).
The Mesa OpenGL and Vulkan drivers are in good shape, but OpenCL compute acceleration on AMD's Ryzen APUs with kernel 5.8 and later is still wait-and-see. Blender, GIMP, Darktable, and other open-source graphics applications that need OpenCL will have to wait for that, for me.
So compelling for gaming, maybe yes, but not so compelling for GPGPU compute workloads that rely on OpenCL on the newer Ryzen APU hardware, and AMD's instructions are not as polished as some of the Intel/Linux documentation.
Really, most laptop OEMs still aren't tuning their laptops' firmware for Linux, aside from System76 and the other Linux laptop OEMs. That ASUS TUF FX505DY laptop was on sale pre-pandemic for $499, so not a bad deal, but it came with loads of special issues to surmount just to get a Linux Mint 20 live USB to even boot so I could install Mint 20 in a dual-boot configuration alongside Windows 10 Home 1909.
Then there are the blobs and out-of-tree Wi-Fi drivers. That laptop is getting a Samsung M.2 NVMe SSD to replace the WD Black that shipped with it, and since the Realtek Wi-Fi card sits right under the M.2 slot, it's also getting a Wi-Fi card that works with the Linux kernel out of the box and has better features.
That's huge and a big reason why my next GPU will be from AMD. If I can ever get one, that is; between scarcity, demand, bots, and scalpers, it is terrible out there. You can be on a stock-watcher Discord and click instantly, only to see that it is already sold out (and at a huge markup, too).
Yes - but for proper hardware support you often need to compile the latest mesa-git. That's not super user-friendly yet, but I expect it will get better down the road.
I don't understand how we're multiple years into a GPU shortage from both vendors. $400 retail price cards sell for $800, presumably from scalpers.
I read breathless reviews of how the latest card performs so well even though it's just $x MSRP and the reality makes that seem ridiculous. It shouldn't take 3 or 4 years to ramp up production that much. Yes, demand has increased, but again they've had years to react to that trend. I'd like to be able to buy a mediocre card for less than $500.
I think you are mixing up a few cycles into one. There hasn't been a GPU "shortage" for multiple years. But there are certainly shortages at different times in the past few years.
Nvidia had to massage their financial numbers to cover up excessive stock sitting in the channel after the Bitcoin crash. It took them three quarters (and more) to clear that inventory out. AMD had a similar situation, but the impact was much smaller because a smaller percentage of their cards were used for mining.
This taught GPU vendors an important lesson. (Edit: imagine the worst-case scenario: AMD has a new product launch with ample stock while you have three quarters' worth of old products not moving.)
Since then they have been much more conservative with forecasting and planning. It is far better to have demand waiting for GPUs than to have GPUs sitting in the channel, quite literally begging distributors to sell them.
This conservatism has a knock-on effect on TSMC's planning. At the start of the pandemic everyone thought the economy could tank, which made forecasts and estimates even lower. The reality is that PC and GPU sales suddenly hit new heights thanks to staying at home and gaming, and Bitcoin pushed to new highs, so the casual miners are back.
You are stacking two conservative supply decisions (the previous oversupply and the pandemic) against two demand realities (PC/gaming and Bitcoin). That is four levels of difference. Catching up takes time, and it doesn't help that the whole industry is in demand: not just GPUs, but everything from Wi-Fi chips, 5G modems, SoCs, and CPUs to server GPUs and NPUs. Both Intel and AMD grew 10%, despite Intel having a lower ASP (i.e., higher unit sales), and both were selling as many parts as they could. Silicon demand is far greater than even the most optimistic analysts predicted.
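The ASP/unit-sales point above can be sanity-checked with a little arithmetic. This is a minimal sketch with made-up illustrative numbers, not Intel's actual figures: if revenue grows 10% while the average selling price falls 5%, unit sales must have grown by more than 10%.

```python
# Revenue = ASP * units, so unit growth follows from the other two.
asp_old, units_old = 100.0, 1000.0   # hypothetical baseline
revenue_old = asp_old * units_old

asp_new = asp_old * 0.95             # ASP down 5%
revenue_new = revenue_old * 1.10     # revenue up 10%
units_new = revenue_new / asp_new    # units implied by the two above

unit_growth = units_new / units_old - 1
print(f"unit sales grew {unit_growth:.1%}")  # ~15.8%, i.e. well above 10%
```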
> I don't understand how we're multiple years into a GPU shortage from both vendors.
Why multiple years? It's unlikely the current shortage will take that long, and it only started at the end of last year. In summer 2020, buying a GPU was not a problem. Then the RTX 3000 series launched with basically no cards actually being available, and all following releases from both vendors were like that.
But both the RTX 2000 series and the Radeon RX 5000 cards were easily available back then (okay, IIRC the USA had some supply issues, but that was domestic Covid and not production related), so there was no reason for them to have ramped up production over the last 3 or 4 years.
Lots and lots of people built their first PC. Because being on lockdown and getting a stimulus check does that.
Demand has gone up; there are shortages even of PSUs. There's hardly any air shipping, because that is usually done in the cargo bays of passenger planes (air shipping prices are up 5x), so most of it has to happen by sea, and all the ships are full. There is also Chinese New Year, when factories close for a while.
I do hope that this is a revival of the desktop PC!
When the crypto bubble pops, there will be loads of used GPUs on the market and a massive reduction in demand. I think GPU manufacturers are afraid to scale because of that uncertainty. Maybe we need GPU futures.
I'm with you. This is a really strange situation. Unless these chips are really hard to manufacture for some reason, or there's some kind of component shortage, it's baffling: mobile manufacturers pump out billions of devices annually (meaning so does the supply chain producing the CPU/GPU SoCs). Maybe a node issue, but one lasting multiple node sizes? If it's crypto and it has persisted this long, wouldn't they have ramped up their pipelines? Is it that sales aren't as good as they need them to be, and artificial scarcity lets them drive MSRP higher so that their revenues get a boost? Maybe the flagships are way too complicated and expensive to manufacture, with too few sales, so that they amount to nothing more than an expensive PR campaign? Maybe the real focus is primarily on cloud/mobile offerings? I really don't know the details of the GPU business model, so perhaps someone more knowledgeable can provide some insight. I'm really curious.
You should thank Bitcoin for the spike in prices. Everyone and their dog are buying a GPU to crunch some coins. Even mid-range cards are 70% up from where they were a month ago. Fucking lunacy.
I've been rocking Intel CPU and AMD GPU for close to 15 years. Built a i7 6700k with an RX 480 back in mid 2016. Just upgraded over the holidays and flipped.
AMD CPU for obvious reasons (a 4950x while I wait for the opportunity to get a 5950X) and an RTX 3070 on the way (haven't been able to get a 3080).
I have been mostly satisfied with AMD drivers, though I have not owned some of the problem cards (like the Radeon VII), and I do like the latest Catalyst and Ryzen Master software. But the new crop of Nvidia cards crush the AMD cards on ray-tracing performance, and DLSS 2.0 is voodoo I want for 4K gaming and VR. I would really, really have liked to go with an AMD 6800 XT to support the underdog, but those two features are clutch when shelling out nearly a grand for a video card.
I have a good feeling about AMD in general and about their next GPU architecture iteration. They are supposedly working on an answer to DLSS (details are thin, and Nvidia are ML giants, though), and this was their first generation with hardware ray-tracing support, so lessons learned should bear fruit.
For those of us doing any ML that requires CUDA/cuDNN, it feels like AMD (specifically for GPUs) still isn't a viable option at this moment in time. I sincerely hope that this changes.
While I agree that this is the current status quo, I don't think the vendor lock-in here is too hard to break. Most ML people code against the PyTorch or TensorFlow/Keras APIs, so as long as AMD is a viable backend for these frameworks, people won't really care. But whether and when this will happen is another story.
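A sketch of why the frameworks insulate users from the vendor: ROCm builds of PyTorch expose the same `torch.cuda` API as CUDA builds, so device-agnostic code like the following runs unchanged on either vendor's GPU (or falls back to CPU). The tensor shapes here are arbitrary illustrations.

```python
import torch

def pick_device() -> torch.device:
    # On ROCm builds of PyTorch, is_available() also reports True for
    # supported AMD GPUs, so this one check covers both vendors.
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(4, 3).to(device)
w = torch.randn(3, 2).to(device)
y = x @ w  # runs on whichever backend was selected
print(y.shape)
```

Code written this way never mentions the vendor at all, which is exactly why a solid AMD backend would make the switch painless.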
I still find this surprising given how crappy my experience has been with recent AMD cards. I have a 12-core Ryzen which I love. But every time I try to go AMD for the video, it's been a disaster. I have an old RX 480 which seems to work pretty well on everything except Wayland, and then an Nvidia card that I pass through to Windows via VFIO.
I upgrade my gaming machine every 3-5 years and it's typically a full rebuild. What I noticed in the last ones is that switching CPU manufacturer is easy, since you switch cpu+motherboard and never expect to be able to upgrade (3 years later you want a new motherboard anyway even if you could theoretically upgrade).
But what creates inertia is my big, expensive G-Sync screen, which ties me to Nvidia. My monitor is the only expensive piece of hardware that survives between machines. And that's the genius of Nvidia's proprietary sync solution: so long as the cards are anywhere close, I'd of course choose the one that doesn't force me to switch screens. So far it has luckily not been an issue, since AMD haven't upset Nvidia in anywhere near the way they upset Intel. So now I'm running a Ryzen and an RTX in the latest build.
NVDA also enabled FreeSync on their graphics cards, so realistically you can swap between the two without issues. I really don't think NVDA will become complacent in the way INTC did. Nevertheless, the last generation of GPUs seemed to close the gap if we ignore DLSS and ray tracing. In the former NVDA has a non-trivial advantage: their cards have been the backbone of DL research. It'd be interesting to see how the NVDA+Arm partnership continues.
If you own an Intel CPU, you are far, far more likely to have a Nvidia GPU as well.
I remember seeing that correlation many years ago, also on the Windows side, and likewise most people with an AMD CPU also had an ATI GPU. The acquisition of ATI by AMD seemed quite natural after that.
(I personally have the Intel+Nvidia combination, although it's very old, and I don't use Linux as my main OS.)
Despite the insane difficulty of actually buying AMD parts for the last couple of months, it's sort of crazy that they are still gaining market share like that.
I for one would upgrade my 5+ year old desktop, but I simply can't buy the parts (and I refuse to pay 2x MSRP to scalpers).
I went with dual AMD for a lot of the reasons discussed (the ability to drop it into my Proxmox server later), but also for a more salient one to me: ray tracing.
Personally, I did not think it would change that much gameplay-wise when I picked up my RX 5700 XT, since there were so few games in the pipeline, so I decided it was not worth the premium. I can still run most any game at 1440p and 100+ fps without the headache. Why should I pay a premium for GPU features I will never use?
Intel/Nvidia here, starting with Windows and then exclusively Linux. Both CPU and GPU are from early 2015 (ready to upgrade my GPU first, but impossible to get anything these days, as I noted in my other comment). Those parts have been great, but ready for AMD (open source drivers, better VR support, competitive on performance/price).
I run Ubuntu with an AMD Ryzen CPU and Nvidia RTX GPU. I'm all for open/nonproprietary drivers, but RTX giving me +20% perf/$ in Blender rendering over Radeon is a dealbreaker.
I swapped to AMD for the processor, but for the GPU I still want NVIDIA, simply because AMD's drivers are often extremely messed up. Nvidia has a lot of hacks, but driver stability is still a huge thing.
I bought the Radeon RX 5700 XT about a month after it came out and it wasn't supported by the latest Linux kernel at the time, so I returned it and got the 2070 Super, which worked out of the box just fine.
People say AMD cards are better if you're on Linux, but I disagree, because it doesn't look like AMD is coordinating their product releases with Linux kernel releases at all.
I feel like Intel is being dismissed much too quickly. I purchased two laptops in the last few months at about the same price, one with the latest-gen-available Ryzen and one with the latest-gen Intel. Here is a benchmark comparison of the chips:
https://cpu.userbenchmark.com/Compare/Intel-Core-i5-1135G7-v...
The Zen 3 and M1 chips are unfairly being compared, even in recent articles, to 10th-gen Intel chips when 11th gen is widely available. Considering there is an actual node change at the 11th gen, it's clear from the benchmarks that Intel has not actually fallen behind AMD; they are neck-and-neck.
Considering the lack of disparity in benchmarks, though I am a long-time AMD fan (see my tongue in cheek username), I still think Intel is the better choice at this time for laptops. The reason is that the ACPI and general driver situation for Zen 2 and 3 is very bad. Power management and hibernation are totally unstable on Linux and Windows. I like my Zen 2 laptop for the better multithread capacity (slightly faster compile times!) but the Intel laptop is likely to last much better given that it will not destroy its own battery. The AMD machine discharges in a day even when hibernating, and in a matter of hours when sleeping.
Intel still has a chance to catch up. It all hinges on how fast they can move to 7nm. And given the “chip shortage” (i.e. limited capacity and likely yield/production issues at TSMC), Intel is well positioned to continue to dominate the market for quite a while by dint of having its own supply chain.
> On Linux, most AMD enthusiasts have probably started by adopting CPUs for quite a while since they fared better for parallel workloads – and now AMD is even catching up on single-thread performance with their latest processors.
I did the opposite and started with AMD GPUs, as those provided the best bang for the buck in ~2008. The driver situation on Linux was quite dire, but it improved tremendously around 2013 (when Steam appeared on Linux).
Even though I was still advising AMD in the Bulldozer era, Intel was winning performance-wise. On the GPU side, though, the drivers have been quite hassle-free since 2015, provided your GPU wasn't the latest and greatest. They have recently gotten better at supporting their latest GPUs, too.
Intel and Nvidia always had despicable market segmentation and anti-consumer practices, which is the main reason I always went ATI/AMD.
Intel: no ECC on consumer CPUs, a reduced number of PCIe lanes, a restricted frequency multiplier.
Nvidia: signed, non-redistributable GPU firmware. Driver terms limiting the use of their GPUs in data centers. Proprietary tech everywhere: G-Sync, 3D Vision, PhysX. Artificially restricting the speed of Nvidia libraries on non-Nvidia hardware (Intel did that too, to a lesser extent). Crippled FP16 and FP64 on "consumer" hardware.
I recently got myself an HTC Vive. It works fine, if not perfectly, on my R9 Fury under Linux. Go team red!
Having the AMD GPU drivers in the mainline kernel makes me seriously consider selling my 2070 Super and getting an AMD card. I'm tired of playing driver roulette with every kernel update, crossing my fingers and hoping my setup won't break.
I've had to test out several combinations before I got a relatively stable config.
Now I'm afraid to apply an automatic update since I don't want to lose several hours to debugging BS driver issues.
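One hedge against exactly this, assuming a Debian/Ubuntu-style system, is to pin the driver packages so automatic updates can't touch them. A minimal sketch of such a pin file follows; the wildcard package names are examples only, so check `dpkg -l` for the packages your setup actually uses:

```
# /etc/apt/preferences.d/hold-nvidia  (sketch; adjust package names)
Package: nvidia-driver-*
Pin: version *
Pin-Priority: -1
```

With a negative priority, apt will never install a new version of the matched packages, so you upgrade the driver only when you deliberately remove the pin.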
Same with my trusty 1070. The WHQL driver from 2019 didn't crash, but it is now too old for current games.
GTFO with that BS.
Imagine any other core component of modern systems requiring such extensive, closed source binary blobs?
The APU combos are particularly compelling for desktop Linux.
Closed source blob, OpenGL 4.1 with video hardware acceleration.
Open source drivers, OpenGL 3.3 with a kind of working video hardware acceleration.
Windows drivers, still OpenGL 4.1 and DirectX 11.
So much for the benefit of open source drivers.
Isn't there actually a worrying amount that do? Wifi for instance.
I'm definitely very excited for RDNA 3. I think AMD's GPU momentum is just starting, similar to the Ryzen 1000 series.
I agree that NVIDIA QUADRO GPUs have an edge in stability.
The AMD FirePros are very nice cards and AMD actively works on improving the drivers. The RX 580 is also a really nice card.
NVIDIA dominates in the vGPU and compute side of things in the data center.
* https://www.gamingonlinux.com/index.php?module=statistics&vi...
* https://www.gamingonlinux.com/index.php?module=statistics&vi...
AMD usage is rising on Linux and for obvious reasons.