As TFA rightly points out, unless something drastically changes in the next ~6 months, Intel is going to launch into the most favorable market situation we've seen in our lifetimes. Previously, the expectation was that they needed to introduce something competitive with the top-end cards from Nvidia and AMD. With basically all GPUs out of stock at the moment, they really just need to introduce something competitive with almost anything on the market to be able to sell as much as they can ship.
1) I hate to say "year of desktop Linux" like every year, but with the Steam Deck release later this year, and Valve's commitment to keep investing in and collaborating on Proton to ensure wide-ranging game support, Linux gaming is going to grow substantially throughout 2022, if only due to the new devices added by Steam Decks.
Intel has always had fantastic Linux video driver support. If Arc is competitive with the lowest end current-gen Nvidia/AMD cards (3060?), Linux gamers will love it. And, when thinking about Steam Deck 2 in 2022-2023, Intel becomes an option.
2) The current-gen Nvidia/AMD cards are insane. They're unbelievably powerful. But here's the kicker: the Steam Deck is 720p. Go out and buy a brand-new Razer/Alienware/whatever gaming laptop, and the most common resolution even on the high-end models is 1080p (with a high refresh rate). The Steam Hardware Survey puts 1080p as the most common resolution, and IT'S NOT EVEN REMOTELY CLOSE to #2 [1] (720p 8%, 1080p 67%, 1440p 8%, 4K 2%). (Did you know more people use Steam on macOS than on a 4K monitor? lol)
These Nvidia/AMD cards are unprecedented overkill for most gamers. People are begging for cards that can run games at 1080p; Nvidia went straight to 4K, even showing off 8K gaming on the 3090, and now they can't even deliver cards that run 720p/1080p. Today we've got AMD releasing the 6600 XT, advertising it as a beast for 1080p gaming [2]. This is what people actually want: affordable and accessible cards to play games on (whether they can keep the 6600 XT in stock remains to be seen, of course). Nvidia went straight Icarus with Ampere; they shot for the sun, and couldn't deliver.
3) More broadly, geopolitical pressure in East Asia, and specifically Taiwan, should concern investors in any company that relies heavily on TSMC (AMD and Apple being the two big ones). Intel may start by fabbing Arc there, but they uniquely have the capacity to bring that production to the West.
I am very, very long INTC.
[1] https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...
[2] https://www.pcmag.com/news/amd-unveils-the-radeon-rx-6600-xt...
Given that timeline and their years of existing production history with Thunderbolt, Intel could also feasibly beat both of them to shipping USB4 on a graphics card.
Yep, basically anything capable of playing new games at lower settings at 720p, and >3-year-old games at better settings, should be highly competitive in the low-end gaming market. Especially in laptops, where it might be a secondary machine for gamers with a high-end desktop.
If they come out swinging here they could have the most deserved smugness in the industry for a good while. People have been rightly criticising them but wrongly writing them off.
If Intel provides as much Linux driver support as they do for their current integrated graphics lineup, we might have a new favourite among Linux users.
They also seem to be the most willing to open up their GPU sharding API, GVT-g, based on their work with their existing Xe GPUs. The performance of their first-generation implementation was a bit underwhelming, but it seems like the intention is there.
If Intel is able to put out something reasonably competitive that supports GPU sharding, it could be a game changer. It could change the direction of the ecosystem and force Nvidia and AMD to bring sharding to their consumer-tier cards. I am stoked to see where this new release takes us.
Level1Linux has a (reasonably) up-to-date state-of-the-GPU-ecosystem video that does a much better job outlining the potential of this tech:
https://www.youtube.com/watch?v=IXUS1W7Ifys
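For anyone who hasn't played with GVT-g: on supported iGPUs it works through the kernel's mediated-device (mdev) framework. The i915 driver exposes vGPU profiles under sysfs, you carve out a shard by writing a UUID into a "create" node, and then hand that device to a VM via vfio-mdev. A minimal sketch of that flow (needs root; the PCI address and profile names are typical examples and vary by machine, and whether Arc will expose the same interface is anyone's guess):

    import os
    import uuid

    # Typical sysfs path for an Intel iGPU; adjust for your machine.
    igpu = "/sys/bus/pci/devices/0000:00:02.0"
    types_dir = os.path.join(igpu, "mdev_supported_types")

    # vGPU profiles the driver offers, e.g. "i915-GVTg_V5_8"
    profiles = sorted(os.listdir(types_dir))
    print("available vGPU types:", profiles)

    # Writing a UUID to "create" carves out one mediated device,
    # which QEMU/libvirt can then attach to a guest via vfio-mdev.
    vgpu = str(uuid.uuid4())
    with open(os.path.join(types_dir, profiles[0], "create"), "w") as f:
        f.write(vgpu)
    print("created vGPU", vgpu)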
This is the main reason I'm excited about this. I really hope they continue the very open approach they've used so far, but even if they start going binary-blob for some of it, like Nvidia and (now to a lesser extent) AMD have, at least they're likely to properly implement KMS and other such things, because that's what they've been doing already.
Came here to say this. This will be especially interesting if there's better support for GPU virtualization to allow a Windows VM to leverage the card without passing the entire card through.
Yep, their WiFi chips have good open source drivers on Linux, as well. It would be nice to have a GPU option that isn't AMD for open driver support on Linux.
With the Steam Deck coming out, running Linux in a high-profile gaming device with AMD graphics, hopefully it'll turn into a mild Linux GPU driver arms race. AMD and Valve are both working on improving Linux support for AMD hardware, GPU and CPU.
Well, every single AAA game gets special handling in GPU drivers. I bet they need to work on Windows drivers first; they'll surely need to write tons of custom driver tweaks for hundreds of games.
I find myself needing, for the first time ever, a high-end video card for some heavy video encoding, and when I look, they're all gone, apparently in a tug of war between gamers and crypto miners.
At the exact same time, I am throwing out a box of old video cards from the mid-nineties (Trident, Diamond Stealth) and from the looks of it you can list them on eBay but they don't even sell.
Now Intel is about to leap into the fray and I am imagining trying to explain all of this to the me of twenty-five years back.
They still aren't saying which part of that lineup they want to compete with, which is a good thing.
I still remember Pat Gelsinger telling us over and over that Larrabee would compete with the high end of the GeForce/Radeon offering back in the day, including when it was painfully obvious to everyone that it definitely would not.
https://en.wikipedia.org/wiki/Larrabee_(microarchitecture)
If they support virtualization like they do on their iGPUs that would be great and possibly drive adoption by power users. But I suspect they'll use that feature for market segmentation just like AMD and Nvidia do.
Intel's Gen11 architecture whitepaper describes how its Gen11 iGPUs work. I'd assume that their next-generation discrete GPUs will have a similar architecture (but no longer attached to the CPU's L3 cache).
I haven't really looked into Intel's iGPU architecture before. I see that the whitepaper has some oddities compared to AMD/Nvidia GPUs. It's definitely "more different".
The SIMD units are apparently only 4 x 32-bit wide (compared to 32-wide Nvidia/RDNA or 64-wide CDNA), but they can be reconfigured to be 8 x 16-bit wide instead (a feature not really available on Nvidia; AMD can do SIMD-inside-of-SIMD and split up its registers once again, but it's a fundamentally different mechanism).
--------
Branch divergence is likely to be less of an issue with narrower SIMD units than on its competitors' hardware. Well, in theory anyway.
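To put a toy number on that (my own back-of-the-envelope, not from the whitepaper): if each lane independently takes a branch with probability 0.1, a predicated SIMD unit gets to skip one side of the branch only when every lane agrees, and that happens far more often at width 4 than at width 32 or 64:

    # Toy divergence model: a SIMD unit executes BOTH sides of a branch
    # unless all lanes happen to agree; narrow vectors agree more often.
    def p_all_lanes_agree(width: int, p: float) -> float:
        return p ** width + (1 - p) ** width

    for width in (4, 32, 64):  # Gen11 SIMD4 vs RDNA wave32 vs CDNA wave64
        diverged = 1 - p_all_lanes_agree(width, 0.1)
        print(f"width {width}: both sides executed {diverged:.1%} of the time")

That works out to roughly 34% at width 4, versus ~97% at width 32 and ~99.9% at width 64.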
Given that the selling point most elaborated on in the press is the AI upscaling, I'm worried the rest of their architecture may not be up to snuff.
Mostly unrelated, but I'm still amazed that if you bought Intel at the height of the Dot-com bubble and held on, you still wouldn't have broken even, even ignoring inflation.
Interesting, but the add-in-card GPU market for graphics purposes is so small, it's hard to get worked up about it. The overwhelming majority of GPU units sold are IGPs. Intel owns virtually 100% of the computer (excluding mobile) IGP market and 70% of the total GPU market. You can get almost the performance of Intel's discrete GPUs with their latest IGPs in "Tiger Lake" generation parts. Intel can afford to nibble at the edges of the discrete GPU market because it costs them almost nothing to put a product out there and to a large extent they won the war already.
Very exciting!
I'm still rocking a 1070 for 1080p/60 gaming and would love to jump to 4K/60 gaming but just can't convince myself to buy a new GPU at current prices.
https://www.youtube.com/watch?v=HSseaknEv9Q We Got an Intel GPU: Intel Iris Xe DG1 Video Card Review, Benchmarks, & Architecture
https://www.youtube.com/watch?v=uW4U6n-r3_0 Intel GPU A Real Threat: Adobe Premiere, Handbrake, & Production Benchmarks on DG1 Iris Xe
It's below a GT 1030, with a lot of issues.
> which performs a lot like the GDDR5 version of Nvidia's aging, low-end GeForce GTX 1030
Intel is trying to emulate what Nvidia did a decade ago. Nobody in the Nvidia world speaks of GeForce and GTX anymore; RTX is where it's at.
https://www.reddit.com/r/ethereum/comments/olla5w/eip_3554_o...
Intel could be launching their cards into a GPU surplus...
That's discrete GPUs, though; presumably the major volume is in laptop GPUs? Will Intel have a CPU+GPU combo product for laptops?
AMD don't seem to even be trying on the software side at the moment. ROCm is a mess.
GPU Accelerated HN would be very interesting :)
That implies they do have running prototypes in-hand.