
AMD vs. Intel 2020: Who Makes the Best CPUs?

87 points | ItsTotallyOn | 6 years ago | tomshardware.com | reply

106 comments

[+] cptskippy|6 years ago|reply
I really hate these articles; they're click bait intended to sow discord among fanbois and drive traffic to the site. People who have recently built systems read them and either feel validation or regret. That just spawns comments from people either trying to justify their losing choice or flaunt their lucky choice of the winner. The loyalists crawl out of the woodwork to make comments like "I still prefer X because Y". These articles draw broad conclusions, and even when they mention performance tiers, they influence the across-the-board perception that brand X is always superior to brand Y.

I've been building systems for almost 30 years and in that time Intel has mostly been the performance champion AND has always screwed you over on price. Sometimes they lead by a lot, sometimes by very little or not at all but their platform is always more expensive.

Both Intel and AMD introduce a brand (e.g. Pentium, Athlon) as their new flagship and then over time erode the brand each year with lower-spec products. Eventually they stop introducing higher-spec products altogether and the brand sinks to the bottom of their product lineup as a new brand takes the top ranks (e.g. i9, Ryzen).

Depending on when you get into building PCs you might have brand loyalty because maybe your first system was a Celeron you unlocked and ran at PIII speeds. Or maybe you picked up a first-gen Athlon that was the performance king for a few quarters, or an Athlon 64 when Intel was floundering with NetBurst. Maybe you started out with Core and in your view Intel has always dominated AMD. Regardless, your perceptions are skewed.

Identify your workload and your budget, buy the best thing that meets your needs regardless of brand. Intel is historically more stable and less problematic. AMD usually offers significantly more performance per $ on the low end. Then again that all takes too much effort, just buy whatever makes you feel good about yourself on the internet according to these click bait articles.

[+] neogodless|6 years ago|reply
Certainly, if you think about it in us-vs.-them, black-and-white terms, I don't see the value.

If you're making purchasing decisions, you should be looking mostly at your specific needs and the bigger picture of how to get good value for your money. If you can afford it, vote your conscience as well, and for a healthy future.

I've owned Cyrix, Pentium, Athlon, Core Duo, Phenom, Core i# and Ryzen. I'm partial to AMD as an underdog, and having a personality that always rooted for them. I'm unimpressed with Intel lately, but up until this year I still wouldn't buy an AMD laptop (though I've recommended a few for friends and family if the need and budget fit).

This article does try to paint "one answer for all questions" and of course that's silly. For most consumers I think they'll be better off this year with choosing Ryzen. But the Ryzen laptop chips that are impressing everyone seem to be in short supply. So I can't cheer them on just yet. I don't purchase for datacenters or professional rendering so I wouldn't dare weigh in there.

To sum up, focus on your needs, and individual options, not brand names!

[+] partingshots|6 years ago|reply
You’re a bit out of date here.

AMD has largely taken the performance crown thanks to their Zen CPU platform and move to using TSMC for their process as Intel has lagged behind, with major problems deriving from their inability to get 10nm working.

If it helps to give you some perspective on how such a drastic change could occur: Intel's 10nm process was originally meant to be ready for release in 2014, and yet they still haven't managed to scale out production after six years of debugging and failure.

It’s a combination of AMD simultaneously doing extremely well and Intel severely stagnating that has allowed this to happen.

[+] scurvy|6 years ago|reply
The AMD Ryzen platform holds so much promise. There's just one problem: all of their consumer chipset motherboards are terrible. Really, really flakey and bad. If you're used to slapping in name-brand memory to a board and booting it up, LOL good luck with AMD! Your options for current gen Ryzens are:

x570 -> Requires active cooling (fans) on the chipset because it throws off too much heat. The proprietary fan will die and your motherboard needs replacing. Congrats! Also, the PCIe lane configurations are weird (x8 x8 x8).

x470 -> Don't even look at your memory the wrong way if you want it to boot. Also, you need an older gen Ryzen/Athlon to update the BIOS to something that works with Ryzen 3000. Sure, there are stickers all over the box touting 'Ryzen 3000 compatibility!' but that's only after you update the BIOS. The BIOS the board ships with won't work at all.

b450 -> Dicier to run new stuff on such an old platform. The power delivery implementations are supposedly too weak for higher core count 3900/3950 CPUs.

AMD really dropped the ball on these chipsets. They're the things that are leading to terrible user experiences and why folks end up going back to Intel after countless dollars wasted on shipping, RMAs, and swapping gear. No motherboard manufacturer actually designs anything. They just bundle chips together and change logos in the BIOS. Some might have a "flashback" button that works on half the boards, but there's little to differentiate one from the other.

If you want something that just works and is fast, stick with Intel because their chipsets are better. I'm writing this on a 3900x, too...

EDIT: Since many are doubting what I went through, let me list the permutations of boards, DIMMs, and CPUs.

CPUs: 3900X and 220GE
DIMMs: Corsair Ballistix, G.Skill FlareX, Samsung ECC (unbuffered)
Boards: X470 Taichi and Hero VII

I finally got my third Taichi to boot with a 220GE to the point that I could flash a BIOS to something "stable" for the 3900x.

I've literally built hundreds of servers and PCs in my lifetime (including AMD Romes). I know what I'm doing. This ecosystem just isn't mature yet!

[+] Exmoor|6 years ago|reply
This is all pretty much FUD.

> x570 -> Requires active cooling (fans) on the chipset because it throws off too much heat. The proprietary fan will die and your motherboard needs replacing.

AMD does require fans on the chipsets of X570 boards, but different board makers have treated this differently. My understanding is that some boards don't spin the fan up unless you're pushing serious PCIe 4.0 bandwidth through it. Either way, "the fan will die and needs replacing" may be true someday (as it would be for any fan), but it is not a guarantee.

>x470 -> Don't even look at your memory the wrong way if you want it to boot. Also, you need an older gen Ryzen/Athlon to update the BIOS to something that works with Ryzen 3000. Sure, there are stickers all over the box touting 'Ryzen 3000 compatibility!' but that's only after you update the BIOS. The BIOS the board ships with won't work at all.

First-gen Ryzen (1000 series chips) had some memory compatibility issues, but 2000 and 3000 chips haven't had many. It's true that you needed a BIOS upgrade on X470/B450 boards when the 3000 series chips were released, but those stickers indicate that the board maker installed a newer BIOS and the board should work out of the box. Unless you manage to find an X470 board that's been sitting on the shelf since before last July, any X470 board you buy today is extremely likely to be compatible with all currently released Ryzen chips out of the box.

> b450 -> Dicier to run new stuff on such an older platform. The implementations are supposedly power weak for higher core count 3900/3950 CPUs.

3000 series chips use a smaller process and are more power efficient per-core than 2000 and 1000 series chips, so you should have no issue running an 8 core chip in a B450 board. 12-core and 16-core chips might have some issues, but who buys a $79 motherboard and sticks a $500/$900 CPU in it?

[+] neogodless|6 years ago|reply
Is this a thing? I know it's anecdotal but three friends and I all built systems with X470 chipsets (some 2700X and one with a 3900X) and none of us had any issues whatsoever. This is the first I've heard of anything beyond the need for cooling for the X570.
[+] zrm|6 years ago|reply
> Since many are doubting what I went through

It's not that people are doubting your experience, it's that the Intel boards aren't any better. Many of the Intel boards also need a BIOS update to support later CPUs than they were originally released with, when they support them at all. DDR4 memory is finicky as heck. People end up having to RMA Intel boards all the time.

It's largely caused by the motherboard makers losing the ability to differentiate themselves. Half of what used to be on the motherboard is now integrated into modern CPUs and the other half is in the chipset which is now also made by the CPU manufacturer. Two motherboards with the same chipset are largely fungible so they have to compete on price and the quality suffers. It's a general problem and buying any consumer motherboard these days is kind of a crapshoot.

Here's a motherboard with 5 eggs and 50 reviews:

https://www.newegg.com/p/N82E16813121180

It's a Pentium 4 motherboard from ~15 years ago. See if you can find a current generation Intel motherboard on there with 5 eggs from half that many reviews. I don't see any.

[+] na85|6 years ago|reply
Conversely, when I built a gaming rig last year I just slapped some Kingston memory, a Vega 64, and a Ryzen onto my x470 and everything Just Worked.

But then I'm not an overclocker and the only time I go into the BIOS is when something breaks and/or I need to change the boot order.

Maybe the quality control is what's at issue?

[+] jchw|6 years ago|reply
For anyone wondering about anecdotes to the contrary, I slapped random DDR4 DIMMs into my x470 no problem whatsoever. I heard of issues with original Ryzen, but not 3000, and you’re probably best ignoring the FUD and just going for it.

(FWIW: I’ve built 2 Ryzen 2000 and 2 Ryzen 3000 boxes, all on x470 and B450. No RAM issues.)

[+] jsgo|6 years ago|reply
Can't speak for anyone else, but I have an X570 board (MSI Phantom Gaming 4, I think) with a 2700X CPU. I have 4 sticks of 16GB ECC RAM (can't remember the brand either; I looked at the PG4's page where it lists compatible RAM) and they worked fine.

I don't know if the fact that I was looking at said recommended list is what negates the issue you're talking about, but I had minimal issues in my build (mine came from also including a PCIe-to-8-port-SATA card; I moved the main drive off of it, installed the drivers, and it started working fine). Initially I tried to do a Windows VM with a GeForce 1080 but scrapped it after driver issues with the device in the VM (apparently Nvidia doesn't want non-Quadro GPUs in VMs).

[+] nitsky|6 years ago|reply
I have had an X570 with a 3900X for a few months. Everything has worked perfectly so far.
[+] bitL|6 years ago|reply
Add in x399, so many problems I had with ASUS Zenith Extreme I didn't see since original Athlon days.
[+] LargoLasskhyfv|6 years ago|reply
I'm with you in parts. But for someone who has built so many systems, the fan thing should be a non-issue. Just get a mechanically fitting replacement from somewhere, be it AliExpress or some industrial supplier that serves end users. The proprietary fans aren't magical pixie dust. They should be $5 to $15 apiece, depending on the quality. Just don't source them from the usual PC places. THOSE are ghettos, caring and catering about nothing else but shiny LED bling bling.

(The same applies to laptops; so far I've never come across one that didn't have a label or engraving telling me everything I needed to know to choose the right replacement.)

[+] rasz|6 years ago|reply
Chipsets aren't even electrically connected to memory in modern computers, so umm?
[+] 2OEH8eoCRo0|6 years ago|reply
What are you on?

>If you're used to slapping in name-brand memory to a board and booting it up[...]

Yeah, I am and that's exactly what I did with a new Ryzen CPU. How is this shit the top comment?

[+] mindfulplay|6 years ago|reply
I am a little out of date on this topic, but what effect do Spectre/Meltdown and other mitigations have on newer processors? Is it fundamentally just a software fix going forward? It seems like we were willing to sacrifice on the order of 10% performance, at least in previous generations...
[+] pstrateman|6 years ago|reply
For many workloads the effect was significantly more than 10%.

I had to disable mitigations to complete a filesystem checksum, it was on the order of 1000x slower.
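For anyone who wants to check this on their own machine: on Linux the kernel reports per-vulnerability mitigation status under sysfs, and the optional mitigations can be disabled globally with a boot parameter. This is a sketch of the standard interface on reasonably recent kernels, and disabling mitigations is of course a security trade-off:

```shell
# List each known CPU vulnerability and the kernel's mitigation status
grep -r . /sys/devices/system/cpu/vulnerabilities/

# To benchmark with mitigations disabled, boot with this kernel parameter
# (e.g. appended to GRUB_CMDLINE_LINUX in /etc/default/grub, then regenerate
# the GRUB config). Security trade-off: use only for testing.
#   mitigations=off
```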

[+] codecamper|6 years ago|reply
Given that amd has the lead in power consumption per performance, is it now the cpu of choice in the datacenter?
[+] madengr|6 years ago|reply
Some engineering software I use relies on Intel's MPI libraries, and I suspect the Math Kernel Library (MKL) as well. The software runs 30% faster on an i7 than on a Ryzen 7, both with equivalent clock rates and core counts.

Given the forthcoming i9 will hit 5.3 GHz, I’m choosing Intel for my next build. Until AMD comes out with libraries and compilers targeting scientific computing, Intel may still have the lead.
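One caveat on those numbers: MKL historically selected slower code paths when it didn't detect a GenuineIntel CPU, so part of that 30% gap may be library dispatch rather than silicon. Older MKL builds (before the 2020.1 release) honored an undocumented environment variable that forced the AVX2 path on AMD; this is a widely reported workaround, not an official interface, and Intel removed it in later releases:

```shell
# Undocumented workaround for pre-2020.1 MKL builds: force the AVX2 code
# path regardless of CPU vendor string. Has no effect on later MKL releases.
export MKL_DEBUG_CPU_TYPE=5
```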

[+] ItsTotallyOn|6 years ago|reply
It certainly is in terms of power, performance, and cost, but Intel's reputation for rock-solid reliability and a decade of optimization for its architectures, not to mention the established software/support ecosystem, means AMD has a very long haul ahead of it.
[+] StreamBright|6 years ago|reply
It depends on the use case. Some hyperscalers won't buy hardware just because the CPU has lower power consumption. It would also not be very wise to bet on only one vendor.
[+] willtim|6 years ago|reply
Is AMD now a better choice for Linux, especially given the ongoing stability problems with Intel's Linux graphics drivers?
[+] m-p-3|6 years ago|reply
The drivers are quite good on newer kernel versions, it's been mostly pain-free for me
[+] Glyptodon|6 years ago|reply
When everything lines up AMD integrated graphics are very very pleasant on Linux. But you will also find random Kernel versions with obnoxious regressions and bugs you have to hunt down and fix or work around.
[+] celeritascelery|6 years ago|reply
For me the single biggest factor is still single threaded performance. There are still many apps that are not multithreaded so all those extra cores are useless. Single thread performance will benefit any application. Intel still holds the lead in this category.
[+] hddherman|6 years ago|reply
I don't like this argument for three reasons:

1. The single thread performance gap is very small (5-10% depending on application) or even non-existent in some cases.

2. Your computer runs more than one process at any given moment. The more cores, the more processes you can run without an impact on performance.

3. That single core performance comes at the cost of efficiency and general power usage. Intel has had to clock their 14nm process very high to achieve that small lead in single thread performance.

[+] dannyw|6 years ago|reply
Intel holds a minuscule lead here, not enough to be worth thinking about; especially against the Zen 2 architecture (Ryzen 3000 series chips).

Wanting an Intel over a Zen 1 for single threaded performance? Completely understandable.

[+] djd20|6 years ago|reply
Marginally, and not really when perf/$ is taken into account...
[+] GhostVII|6 years ago|reply
Multi core performance is still useful if you are running many single-threaded applications at once.
[+] bob1029|6 years ago|reply
I initially felt similar regrets after purchasing my 2950X. Obviously, the advantage with this CPU is being able to go wide on all cores, not necessarily maxing out any particular one. And it does work in a lot of cases. I regularly see 100% CPU when running rebuilds of large .NET Core applications. But, I also recognize that on a single-thread basis, I don't really have any advantage over anyone else (potentially even a small deficit).

But, this is a narrow-minded perspective to have. The future cannot possibly be that we magically invent an x86 core that can clock to 8GHz w/ practical power & cooling. One has to push forward in some incremental way. The answer is adding more parallel execution units. Sure, lots of software and techniques used today are not feasible to directly port over, but that doesn't mean it has to stay that way.

As a software developer, I feel it is my duty to help push the state of the art forward wherever possible. I have already begun working on projects that are designed around the concept of extremely high core counts becoming the norm. Having 32 logical threads sitting in your local workstation makes it possible to practically experiment with the lower bounds of what would be required in a larger production deployment.

There are types of software that we can consider building today that would have sounded absolutely insane just a few years ago. Cloud gaming is now a thing falling within practical reach. How many more x86 cores per instance would we need before we could start thinking about rendering 100% of client views server-side (i.e. eliminate most HTML/CSS/JS) or deprecating the usefulness of discrete GPUs altogether? Pulling everything into a single memory domain could do wonders for breaking down the ages-old GPU/CPU functionality silos. One way to do it all with one instruction set and a single cache-coherent memory architecture...
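Much of that "go wide" design boils down to structuring work as an embarrassingly parallel map over all available cores. A minimal Python sketch, where `simulate` is a hypothetical placeholder for a real CPU-bound unit of work:

```python
from concurrent.futures import ProcessPoolExecutor

def simulate(seed):
    # Hypothetical CPU-bound unit of work; stands in for a render tile,
    # a physics step, a compile unit, etc.
    total = 0
    for i in range(100_000):
        total = (total + seed * i) % 1_000_003
    return total

def run_wide(seeds, workers=None):
    # Fan the independent work units out across processes; with
    # workers=None the pool sizes itself to the machine's core count.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate, seeds))
```

Because the units are independent, throughput scales roughly with core count until memory bandwidth becomes the bottleneck, which is exactly the regime a 32-thread workstation lets you explore locally before a larger deployment.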

[+] s9w|6 years ago|reply
Barely, if at all. Not all benchmarks have been done with all exploit mitigations we have today. Also AMD is cheaper per performance.
[+] daxfohl|6 years ago|reply
You could still say AMD has an advantage. Given the cost difference, and the fact that Intel usually changes socket types each release, you could purchase a new AMD SKU every year and always have the latest. For the same price you'd probably have to be working with three or four year old Intel technology at some point. By that time the newer AMD SKU would be putting the old Intel processor to shame.

Of course if price is no object then the above doesn't apply.

[+] grecy|6 years ago|reply
> There are still many apps that are not multithreaded

Can you give some examples of CPU intensive apps that are not multi threaded in 2020?

[+] rasz|6 years ago|reply
Stop using Python for time critical applications ;-)
[+] sedatk|6 years ago|reply
Docker is a thing now, though.