top | item 28199067

Intel’s Arc GPUs will compete with GeForce and Radeon in early 2022

261 points | TangerineDream | 4 years ago | arstechnica.com

208 comments

[+] Exmoor|4 years ago|reply
As TFA rightly points out, unless something drastically changes in the next ~6 months, Intel is going to launch into the most favorable market situation we've seen in our lifetimes. Previously, the expectation was that they needed to introduce something competitive with the top-end cards from Nvidia and AMD. With basically all GPUs out of stock currently, they just need to introduce something competitive with almost anything on the market to be able to sell as much as they can ship.
[+] 015a|4 years ago|reply
Yup; three other points I'd add:

1) I hate to say "year of the Linux desktop" like every year, but with the Steam Deck releasing later this year, and Valve's commitment to keep investing in and collaborating on Proton to ensure wide-ranging game support, Linux gaming is going to grow substantially throughout 2022, if only due to the new devices Steam Decks add.

Intel has always had fantastic Linux video driver support. If Arc is competitive with the lowest end current-gen Nvidia/AMD cards (3060?), Linux gamers will love it. And, when thinking about Steam Deck 2 in 2022-2023, Intel becomes an option.

2) The current-gen Nvidia/AMD cards are insane. They're unbelievably powerful. But here's the kicker: the Steam Deck is 720p. Go out and buy a brand-new Razer/Alienware/whatever gaming laptop, and the most common resolution even on the high-end models is 1080p (with a high refresh rate). The Steam Hardware Survey puts 1080p as the most common resolution, and IT'S NOT EVEN REMOTELY CLOSE to #2 [1] (720p 8%, 1080p 67%, 1440p 8%, 4K 2%). (Did you know more people use Steam on macOS than on a 4K monitor? lol)

These Nvidia/AMD cards are unprecedented overkill for most gamers. People are begging for cards that can run games at 1080p; Nvidia went straight to 4K, even showing off 8K gaming on the 3090, and now they can't even deliver cards that run 720p/1080p. Today, we've got AMD releasing the 6600 XT, advertising it as a beast for 1080p gaming [2]. This is what people actually want: affordable and accessible cards to play games on (whether they can keep the 6600 XT in stock remains to be seen, of course). Nvidia went straight Icarus with Ampere; they shot for the sun and couldn't deliver.

3) More broadly, geopolitical pressure in East Asia, and specifically Taiwan, should be concerning investors in any company that relies heavily on TSMC (AMD and Apple being the two big ones). Intel may start by fabbing Arc there, but they uniquely have the capacity to bring that production to the West.

I am very, very long INTC.

[1] https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...

[2] https://www.pcmag.com/news/amd-unveils-the-radeon-rx-6600-xt...

[+] ayngg|4 years ago|reply
I thought they were using TSMC for their GPU, which means they'll be part of the same bottleneck that's affecting everyone else.
[+] voidfunc|4 years ago|reply
Intel has the manufacturing capability to really beat up Nvidia. Even if the cards don’t perform like top-tier cards they could still win bigly here.

Very exciting!

[+] pier25|4 years ago|reply
Exactly. There are plenty of people that just want to upgrade an old GPU and anything modern would be a massive improvement.

I'm still rocking a 1070 for 1080p/60 gaming and would love to jump to 4K/60 gaming but just can't convince myself to buy a new GPU at current prices.

[+] chaosharmonic|4 years ago|reply
Given that timeline and their years of existing production history with Thunderbolt, Intel could also feasibly beat both of them to shipping USB4 on a graphics card.
[+] ineedasername|4 years ago|reply
Yep, basically anything capable of playing new games at lower settings at 720p, and games more than ~3 years old at better settings, should be highly competitive in the low-end gaming market. Especially in laptops, which might be a secondary machine for gamers with a high-end desktop.
[+] moss2|4 years ago|reply
I thought the problem was a chip shortage. How is Intel going to solve this? Won't they just also run out of stock instantly?
[+] dheera|4 years ago|reply
> will compete with GeForce

> which performs a lot like the GDDR5 version of Nvidia's aging, low-end GeForce GT 1030

Intel is trying to emulate what NVIDIA did a decade ago. Nobody in the NVIDIA world speaks of GeForce and GTX anymore; RTX is where it's at.

[+] ineedasername|4 years ago|reply
Yep, basically anything capable of playing new games at low settings, and games more than ~3 years old at better settings, should be highly competitive in the low-end gaming market.
[+] hughrr|4 years ago|reply
GPU stock is rising and prices falling. It’s too late now.
[+] mhh__|4 years ago|reply
If they come out swinging here they could have the most deserved smugness in the industry for a good while. People have been rightly criticising them but wrongly writing them off.
[+] NonContro|4 years ago|reply
How long will that situation last though, with Ethereum 2.0 around the corner and the next difficulty bomb scheduled for December?

https://www.reddit.com/r/ethereum/comments/olla5w/eip_3554_o...

Intel could be launching their cards into a GPU surplus...

That's discrete GPUs though, presumably the major volumes are in laptop GPUs? Will Intel have a CPU+GPU combo product for laptops?

[+] jeswin|4 years ago|reply
If Intel provides as much Linux driver support as they do for their current integrated graphics lineup, we might have a new favourite among Linux users.
[+] r-bar|4 years ago|reply
They also seem to be the most willing to open up their GPU sharing API, GVT-g, based on their work with their existing Xe GPUs. The performance of their first-generation implementation was a bit underwhelming, but it seems like the intention is there.

If Intel is able to put out something reasonably competitive that supports GPU sharing, it could be a game changer. It could change the direction of the ecosystem and force Nvidia and AMD to bring GPU sharing to their consumer-tier cards. I am stoked to see where this new release takes us.

Level1Linux has a (reasonably) up-to-date overview of the state of the GPU ecosystem that does a much better job outlining the potential of this tech.

https://www.youtube.com/watch?v=IXUS1W7Ifys

[+] stormbrew|4 years ago|reply
This is the main reason I'm excited about this. I really hope they continue the very open approach they've used so far. But even if they go binary-blob for some of it, like Nvidia and (now to a lesser extent) AMD have, they're at least likely to properly implement KMS and other things, because that's what they've been doing already.
[+] jogu|4 years ago|reply
Came here to say this. This will be especially interesting if there's better support for GPU virtualization to allow a Windows VM to leverage the card without passing the entire card through.
[+] kop316|4 years ago|reply
This was my thought too. If their linux driver support for this is as good as their integrated ones, I will be switching to Intel GPUs.
[+] heavyset_go|4 years ago|reply
Yep, their WiFi chips have good open source drivers on Linux, as well. It would be nice to have a GPU option that isn't AMD for open driver support on Linux.
[+] dcdc123|4 years ago|reply
A long time Linux graphics driver dev friend of mine was just hired by Intel.
[+] Nexxxeh|4 years ago|reply
With the Steam Deck coming out, running Linux in a high-profile gaming device with AMD graphics, hopefully it'll turn into a mild Linux GPU driver arms race. AMD and Valve are both working on improving Linux support for AMD hardware, GPU and CPU.
[+] holoduke|4 years ago|reply
Well, practically every single AAA game gets special-cased in GPU drivers. I bet they need to work on Windows drivers first; they'll surely need to write tons of custom driver tweaks for hundreds of games.
[+] byefruit|4 years ago|reply
I really hope this breaks Nvidia's stranglehold on deep learning. Some competition would hopefully bring down prices at the compute high-end.

AMD don't seem to even be trying on the software side at the moment. ROCm is a mess.

[+] at_a_remove|4 years ago|reply
I find myself needing, for the first time ever, a high-end video card for some heavy video encoding, and when I look, they're all gone, apparently in a tug of war between gamers and crypto miners.

At the exact same time, I am throwing out a box of old video cards from the mid-nineties (Trident, Diamond Stealth) and from the looks of it you can list them on eBay but they don't even sell.

Now Intel is about to leap into the fray and I am imagining trying to explain all of this to the me of twenty-five years back.

[+] cwizou|4 years ago|reply
They still aren't saying which part of that lineup they want to compete with, which is a good thing.

I still remember Pat Gelsinger telling us over and over, back in the day, that Larrabee would compete with the high end of the GeForce/Radeon offering, including when it was painfully obvious to everyone that it definitely would not.

https://en.wikipedia.org/wiki/Larrabee_(microarchitecture)

[+] tmccrary55|4 years ago|reply
I'm down if it comes with open drivers or specs.
[+] the8472|4 years ago|reply
If they support virtualization like they do on their iGPUs that would be great and possibly drive adoption by power users. But I suspect they'll use that feature for market segmentation just like AMD and Nvidia do.
[+] dragontamer|4 years ago|reply
https://software.intel.com/content/dam/develop/external/us/e...

The above is Intel's Gen11 architecture whitepaper, describing how Gen11 iGPUs work. I'd assume that their next-generation discrete GPUs will have a similar architecture (but no longer attached to CPU L3 cache).

I haven't really looked into Intel's iGPU architecture before. I see that the whitepaper has some oddities compared to AMD/Nvidia GPUs. It's definitely "more different".

The SIMD units are apparently only 4 x 32-bit wide (compared to 32-wide Nvidia/RDNA or 64-wide CDNA). But they can be reconfigured to be 8 x 16-bit wide instead (a feature not really available on Nvidia; AMD can do SIMD-within-SIMD and split up its registers once again, but it's a fundamentally different mechanism).

--------

Branch divergence is likely to be less of an issue with narrower SIMD than its competitors. Well, in theory anyway.
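A rough way to see the "in theory" part: with random per-lane branch outcomes, a narrower SIMD group is simply more likely to stay uniform, so it serializes both branch paths less often. A quick Monte Carlo sketch of my own (the `divergent_fraction` helper is made up for illustration, not from Intel's whitepaper):

```python
import random

def divergent_fraction(simd_width, p_taken, n_groups=100_000, seed=0):
    """Estimate the fraction of SIMD groups whose lanes disagree on a
    branch outcome, forcing the hardware to execute both paths."""
    rng = random.Random(seed)
    divergent = 0
    for _ in range(n_groups):
        lanes = [rng.random() < p_taken for _ in range(simd_width)]
        if any(lanes) and not all(lanes):  # mixed outcomes -> divergence
            divergent += 1
    return divergent / n_groups

# For a 50/50 branch: a 4-wide group stays uniform 2 * 0.5^4 = 12.5% of
# the time (~0.875 divergent), while a 32-wide warp essentially always
# diverges (1 - 2 * 0.5^32 is ~1.0).
print(divergent_fraction(4, 0.5))
print(divergent_fraction(32, 0.5))
```

Of course this ignores that real branches are correlated across neighboring lanes, which is exactly why the gap is smaller in practice than this toy model suggests.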

[+] arcanus|4 years ago|reply
Always seems to be two years away, like the Aurora supercomputer at Argonne.
[+] stormbrew|4 years ago|reply
I know March 2020 has been a very very long month but I'm pretty sure we're gonna skip a bunch of calendar dates when we get out of it.
[+] re-actor|4 years ago|reply
Early 2022 is just 4 months away actually
[+] dubcanada|4 years ago|reply
Early 2022 is only like 4-8 months away?
[+] RicoElectrico|4 years ago|reply
Meanwhile they're overloading the name of an unrelated CPU architecture (ARC), incidentally used in older Intel Management Engines.
[+] fefe23|4 years ago|reply
Given that the selling point most elaborated on in the press is the AI upscaling, I'm worried the rest of their architecture may not be up to snuff.
[+] jscipione|4 years ago|reply
I've been hearing Intel play this tune for years, time to show us something or change the record!
[+] andrewmcwatters|4 years ago|reply
Mostly unrelated, but I'm still amazed that if you bought Intel at the height of the Dot-com bubble and held on, you still wouldn't have broken even, even ignoring inflation.
[+] dleslie|4 years ago|reply
The sub-heading is false, I had a dedicated Intel GPU in 1998 by way of the i740.
[+] acdha|4 years ago|reply
Was that billed as a serious gaming GPU? I don't remember the i740 as anything other than a low-budget option.
[+] pjmlp|4 years ago|reply
I keep seeing articles like this going back to Larrabee; better to wait and see if this time it is actually any better.
[+] bifrost|4 years ago|reply
I'd be excited to see if you can run ARC on Intel ARC!

GPU Accelerated HN would be very interesting :)

[+] desktopninja|4 years ago|reply
3DFX is joining the party soon ... Matrox, you're up next
[+] jeffbee|4 years ago|reply
Interesting, but the add-in-card GPU market for graphics purposes is so small, it's hard to get worked up about it. The overwhelming majority of GPU units sold are IGPs. Intel owns virtually 100% of the computer (excluding mobile) IGP market and 70% of the total GPU market. You can get almost the performance of Intel's discrete GPUs from the latest IGPs in their "Tiger Lake" generation parts. Intel can afford to nibble at the edges of the discrete GPU market because it costs them almost nothing to put a product out there, and to a large extent they've won the war already.
[+] selfhoster11|4 years ago|reply
You must be missing the gamer market that's positively starving for affordable dedicated GPUs.
[+] astockwell|4 years ago|reply
More promises tied not to something in hand but to some amazing future thing. Intel has not learned one bit.
[+] tyingq|4 years ago|reply
"The earliest Arc products will be released in "the first quarter of 2022"

That implies they do have running prototypes in-hand.

[+] dkhenkin|4 years ago|reply
But what kind of hash rates will they get?! /s