>Overall the Intel Arc Pro B50 was at 1.47x the performance of the NVIDIA RTX A1000 with that mix of OpenGL, Vulkan, and OpenCL/Vulkan compute workloads both synthetic and real-world tests. That is just under Intel's own reported Windows figures of the Arc Pro B50 delivering 1.6x the performance of the RTX A1000 for graphics and 1.7x the performance of the A1000 for AI inference. This is all the more impressive when considering the Arc Pro B50 price of $349+ compared to the NVIDIA RTX A1000 at $420+.
Comparing price to performance in this space might not make as much sense as it would seem. One of the (very few) interesting qualities of the A1000 is that it's a single-slot, low-profile workstation GPU. Intel kept the "powered by the PCIe slot" aspect, but made it dual slot and full height. Needing a "workstation" GPU for a tiny form factor (i.e. a machine not meant to slot and power full-sized GPUs) was something you could charge a premium for, but here the only selling point is the price.
> 1.7x the performance of the A1000 for AI inference
That's a bold claim when their acceleration software (IPEX) is barely maintained and incompatible with most inference stacks, and their Vulkan driver is far behind it in performance.
Really confused why Intel and AMD both continue to struggle and yet still refuse to offer what Nvidia won't, i.e. high-RAM consumer GPUs. I'd much prefer paying 3x cost for 3x VRAM (48GB/$1047), 6x cost for 6x VRAM (96GB/$2094), 12x cost for 12x VRAM (192GB/$4188), etc.
They'd sell like hotcakes and software support would quickly improve.
At 16GB I'd still prefer to pay a premium for NVidia GPUs given its superior ecosystem, I really want to get off NVidia but Intel/AMD isn't giving me any reason to.
Because the market of people who want huge RAM GPUs for home AI tinkering is basically about 3 Hacker News posters. Who probably won’t buy one because it doesn’t support CUDA.
PS5 has something like 16GB unified RAM, and no game is going to really push much beyond that in VRAM use, we don’t really get Crysis style system crushers anymore.
This card does have double the VRAM of the more expensive Nvidia competitor (the A1000, which has 8 GB), but I take your point that it doesn't feel like quite enough to justify giving up the Nvidia ecosystem. The memory bandwidth is also... not great.
They also announced a 24 GB B60 and a double-GPU version of the same (saves you physical slots), but it seems like they don't have a release date yet (?).
I am not sure there is a significant enough market for those, that is, enough consumer units sold to cover all the design and other costs. From a gamer's perspective, 16GB is now a reasonable point. 32GB is the most one would really want, and even then at no more than, say, $100 above that price point.
This to me is the gamer perspective. This segment really does not need even 32GB, let alone 64GB or more.
I doubt you'd get linear scaling of price/capacity - the larger capacity modules are more expensive per GB than smaller ones, and in some cases are supply constrained.
The number of chips on the bus is usually pretty low (1 or 2 of them on most GPUs), so GPUs tend to have to scale out their memory bus widths to get to higher capacity. That's expensive and takes up die space, and for the conventional case (games) isn't generally needed on low end cards.
What really needs to happen is for someone to make a "system seller" game that is incredibly popular and requires something like 48GB of memory on the GPU, to build demand. But then you have a chicken/egg problem.
Example: https://wccftech.com/nvidia-geforce-rtx-5090-128-gb-memory-g...
Why not just buy 3 cards then? These cards don't require active cooling anyway, and you can fit 3 in a decent-sized case. You will get 3x the VRAM bandwidth and 3x the compute. And if your use case is LLM inference, it will be a lot faster than 1 card with 3x the VRAM.
I think it's a bit of planned obsolescence as well. The 1080 Ti has been a monster with its 11GB of VRAM up until this generation. A lot of enthusiasts basically call out that Nvidia won't make that mistake again, since it led to longer upgrade cycles.
The new CEO of Intel has said that Intel is giving up competing with Nvidia.
Why would you bother with any Intel product with an attitude like that? It gives zero confidence in the company. What business is Intel in, if not competing with Nvidia and AMD? Is it giving up competing with AMD too?
> The new CEO of Intel has said that Intel is giving up competing with Nvidia.
No, he said they're giving up competing against Nvidia in training. Instead, he said Intel will focus on inference.
That's the correct call in my opinion. Training is far more complex and will soon span multiple data centers. Intel is too far behind. Inference is much simpler and likely a bigger market going forward.
AMD has also often said that they can't compete with Nvidia at the high end, and as the other commenter said: market segments exist. Not everyone needs a 5090. If anything, people are starved for options in the budget/mid-range market, which is where Intel could pick up a solid chunk of market share.
>What business is Intel in, if not competing with Nvidia and AMD.
Foundry business. Per the latest report on discrete graphics market share, Nvidia has 94%, AMD 6%, and Intel 0%.
I may still have another 12 months to go. But in 2016 I made a bet against Intel engineers, on Twitter and offline, suggesting GPUs were not a business they wanted to be in, or at least that they were too late. They said at the time they would get at least 20% market share by 2021. I said I would be happy if they hit even 20% by 2026.
Intel is also losing money; they need cash flow to compete in the foundry business. I have long argued they should have cut the GPU segment when Pat Gelsinger arrived, but it turns out Intel bound itself to GPUs through all the government contracts and supercomputers it promised to deliver. Now that they have delivered all or most of that, they will need to think about whether to continue or not.
Unfortunately, unless the US points guns at TSMC, I just don't see how Intel will be able to compete, as Intel needs to be in a leading-edge position in order to command the margins required for it to function. Right now, in terms of density, Intel 18A is closer to TSMC N3 than to N2.
I thought he said they gave up on competing with Nvidia at training, not in general. He left the door open to compete on inference. Did he say otherwise more recently?
A feature I haven't seen anyone comment on yet is Project Battlematrix [1][2] with these cards, which allows for multi-GPU AI orchestration. It's a feature Nvidia offers for enterprise AI workloads (Run:ai), but Intel is bringing it to consumers.
1. https://youtu.be/iM58i3prTIU?si=JnErLQSHpxU-DlPP&t=225
2. https://www.intel.com/content/www/us/en/developer/articles/t...
Huh, I didn't realize these were just released. I came across it while looking for a GPU with AV1 hardware encoding, as I've been putting a shopping cart together for a mini-ITX Xeon server for all my ffmpeg shenanigans.
I like to buy American when I can, but it's hard to find out which fabs various CPUs and GPUs are made in. I read Kingston does some RAM here and Crucial some SSDs. Maybe the silicon is fabbed here, but everything I found is "assembled in Taiwan", which made me feel like I should get my dream machine sooner rather than later.
I have the answer for you: Intel's GPU chips are on TSMC's process. They are not made in Intel-owned fabs.
There really is no such thing as "buying American" in the computer hardware industry unless you are talking about the designs rather than the assembly. There are also critical parts of the lithography process that depend on US technology, which is why the US is able to enforce certain sanctions (and due to some alliances with other countries that own the other parts of the process).
Personally I think people get way too worked up about being protectionist when it comes to global trade. We all want to buy our own country's products over others but we definitely wouldn't like it if other countries stopped buying our exported products.
When Apple sells an iPhone in China (and they sure buy a lot of them), Apple is making most of the money in that transaction by a large margin, and in turn so are you since your 401k is probably full of Apple stock, and so are the 60+% of Americans who invest in the stock market. A typical iPhone user will give Apple more money in profit from services than the profit from the sale of the actual device. The value is really not in the hardware assembly.
In the case of electronics products like this, almost the entire value add is in the design of the chip and the software that is running on it, which represents all the high-wage work, and a whole lot of that labor is in the US.
US citizens really shouldn't envy a job where people sit at an electronics bench doing repetitive assembly work for 12 hours a day in a factory, or wish we had more of those jobs in our country. We should instead focus on making higher education more available and affordable so that the US stays at the top of the economic food chain, with most or all of its citizens doing high-value work, rather than keeping education expensive and begging foreign manufacturers to open satellite factories to employ our uneducated masses.
I think the current wave of populist protectionist ideology is essentially blaming the wrong causes of declining affordability and increasing inequality for the working class. Essentially, people think that bringing the manufacturing jobs back and reversing globalism will right the ship on income inequality, but the reality is that the reason equality was so good for Americans in the mid-century was that the wealthy were taxed heavily, European manufacturing was decimated in WW2, and labor was in high demand.
The above of course is all my opinion on the situation, and a rather long tangent.
You may want to check whether your Xeon already supports hardware encoding of AV1 in the iGPU. I saved a bundle building a media server when I realized the iGPU was more than sufficient (and more efficient than chucking a GPU in the case).
I have a service that runs continuously and reencodes any videos I have into h265 and the iGPU barely even notices it.
Kinda bummed that it’s $50 more than originally said. But if it works well, a single slot card that can be powered by the PCIe slot is super valuable. Hoping there will be some affordable prebuilds so I can run some MoE LLM models.
GPU prices really surprise me. Most PC part prices have remained the same over the decades, with storage and RAM actually getting cheaper. GPUs, however, have gotten extremely expensive. $350 used to get you a really good GPU about 20 years ago, and I think top of the line was around $450-500; now it only gets you entry level. Top of the line is now $1500+!
Datacenter GPU margins are 80%+. Consumer margins are like 25%. Any company with a datacenter product that sells out is just going to put all their fab allocation toward that and ignore the consumer segment. Plus, these companies are really worried about their consumer products being used in datacenters and cannibalizing their money maker, so they kneecap the consumer VRAM to make sure that doesn't happen.
I really think Intel is on the right track to dethrone both AMD and NVIDIA, while also competing with ARM SoCs. It's fascinating to watch.
Both their integrated and dedicated GPUs have been steadily improving each generation. The Arc line is cheaper than, and comparable in performance to, more premium NVIDIA cards. The 140T/140V iGPUs do the same to AMD APUs. Their upcoming Panther Lake and Nova Lake architectures seem promising, and will likely push this further. Meanwhile, they're also more power efficient and cooler, to the point where Apple's lead with its ARM SoCs is not far out of reach. Sure, the software ecosystem is not up to par with the competition yet, but that's a much easier problem to solve, and they've been working on that front as well.
I'm holding off on buying a new laptop for a while just to see how this plays out. But I really like how Intel is shaking things up, and not allowing the established players to rest on their laurels.
It’s interesting that it uses 4 DisplayPort outputs and not a single HDMI port.
Is HDMI seen as a “gaming” feature, or is DP seen as a “workstation” interface? Ultimately HDMI is a brand that commands higher royalties than DP, so I suspect this decision was largely made to minimize costs. I wonder what percentage of the target audience has HDMI-only displays.
DP is perfectly fine for gaming (it's better than HDMI). The only reason HDMI is lingering around is the cartel which profits from patents on it, and manufacturers of TVs which stuff them with HDMI and don't provide DP or USB-C ports.
Otherwise HDMI would have been dead a long time ago.
Because you can actually fit 4 of them without impeding airflow from the heatsink. Mini HDMI is mechanically ass and I've never seen it anywhere but junky Android tablets.
DP also isn't proprietary.
There’s also weirdness with the drivers and HDMI, I think mainly around encryption. But if you only have DP and include an adapter, it’s suddenly “not my problem” from Intel’s perspective.
HDMI is shit. If you've never had problems with random machine hdmi port -> hdmi cable -> hdmi port on monitor you just haven't had enough monitors.
> Is HDMI seen as a “gaming” feature
It's a TV content protection feature. Sometimes it degrades the signal so you feel like you're watching TV. I've had a monitor/machine combination that identified my monitor as a TV over HDMI and switched to YCbCr just because it wanted to, with assorted color bleed on red text.
I am confused, as a lot of comments here seem to argue about gaming, but isn't this supposed to be a workstation card, hence not intended to be used for games? The Phoronix review also seems to focus only on compute usage, not gaming.
It's not competing with AMD/Nvidia cards at twice the price in terms of performance, but it's also too expensive for a cheap gaming rig. And then there are people who are happy with integrated graphics.
Maybe I'm just lacking imagination here, I don't do anything fancy on my work and couch laptops and I have a proper gaming PC.
Last time I had anything to do with the low-mid range pro GPU world, the use case was 3D CAD and certain animation tasks. That was ~10 years ago, though.
CAD and medical were always the use cases for high-end workstations and professional GPUs. Companies designing jets and cars need more than an iGPU, but they prefer slim desktops and something distanced from games.
An obvious use case is high-end NVRs. Low power, ample GPU for object detection/tracking, ample encoders for streaming. Should make a good surveillance platform.
With SR-IOV* there is a low-cost path to GPUs in virtual machines. Until now this has (mostly) been a feature exclusive to costly "enterprise" GPUs. Combine that with the good encoders and some VDI software and you have VM-hosted, GPU-accelerated 3D graphics on remote displays. There are many business use cases for this, and no small number of "home lab" use cases as well.
Linux is a first class citizen with Intel's display products, and B50/60 is no different, so it's a nice choice when you want a GPU accelerated Linux desktop with minimum BS. Given the low cost and power, it could find its way into Steam consoles as well.
Finally, Intel is the scrappy competitor in this space: they are being very liberal with third parties and their designs, unlike the incumbents. We're already seeing this with Maxsun and others.
* Intel has promised this for B50/60 in Q4
Another advantage of Intel GPUs is vGPU SR-IOV, which the consumer video cards of NVIDIA and AMD don't support. Even the integrated GPUs of the N100 and N97 support it [1].
Therefore I can install Proxmox VE and run multiple VMs, assigning a vGPU to each of them for video transcoding (IPCam NVR), AI, and other applications.
[1] https://github.com/Upinel/PVE-Intel-vGPU
I really hope Intel continues with GPUs, or the GPU market is doomed until China catches up. Nvidia produces good products with great software, the best in the industry really, with long support lifetimes, but that doesn't excuse them from monopolistic practices. The fact that AMD refuses to compete really makes it look like this entire thing is organized from the top (US government).
This reminds me a lot of the LLM craze and how they wanted to charge so much for simple usage at the start, until China released DeepSeek. Ideally we shouldn't rely on China, but do we have a choice? The entire US economy has become reliant on monopolies to keep their insanely high stock prices and profit margins.
If you buy Intel Arc cards for their competitive video encoding/decoding capabilities, it appears that all of them are still capped at 8 parallel streams. The "B" series have more headroom at high resolutions and bitrates, on the other hand some "A" series cards need only a single PCIe slot so you can stick more of them into a single server.
I'm glad Intel is continuing to make GPUs, really. But ultimately it seems like an uphill battle against a very entrenched monopoly with a software and community moat that was built up over nearly 20 years at this point. I wonder what it will take to break through.
It's half-height (fits in "slim" desktops, those media center PCs, and in a 2U server without having to turn it sideways/use a riser), and barely longer than the PCIe socket. Phoronix has a picture with a full-height bracket which maybe gives a better point of comparison: https://www.phoronix.com/review/intel-arc-pro-b50-linux
(A half-height single-slot card would be even smaller, but those are vanishingly rare these days. This is pretty much as small as GPUs get unless you're looking more for a "video adapter" than a GPU.)
Agreed. I have an A40 GPU in an EPYC system right now specifically because it's a single-slot card. I did not pay for gobs of PCIe expansion in this system just to block slots with double-wide GPUs. Sure, it can't do the heavy lifting of some beefier cards, but there is still a need for single-slot cards.
I think the answer to that right now is highly workload dependent. From what I have seen, it is improving rapidly, but still very early days for the software stack compared to Nvidia
It clocks in at 1503.4 samples per second, behind the NVidia RTX 2060 (1590.93 samples / sec, released Jan 2019), AMD Radeon RX 6750 XT (1539, May 2022), and Apple M3 Pro GPU 14 cores (1651.85, Oct 2023).
Note that this perf comparison is just ray-tracing rendering, useful for games, but might give some clarity on performance comparisons with its competition.
It wouldn't surprise me if there was 10-20% perf improvement in drivers/software for this. Intel's architecture is pretty new and nothing is optimized for it yet.
Intel is doing poorly, but I believe Apple was in much, much worse shape than this in the early 2000s. AMD was also in much, much worse shape than this.
Intel has many, many solid customers at the government, enterprise and consumer levels.
They will be around.
A $350 “workstation” GPU with 16 GB of VRAM? I... guess, but is that really enough for the kinds of things that would have you looking for workstation-level GPUs in the first place?
I guess it's a boon for Intel that NVidia repeatedly shoots their own workstation GPUs in the foot...
With 16GB everybody will just call it another in the long list of Intel failures.
I want hardware that I can afford and own, not AI/datacenter crap that is useless to me.
Also, do these support SR-IOV, as in handing slices of the GPU to virtual machines?
Converting from DisplayPort to HDMI is trivial with a cheap adapter if necessary.
HDMI is mostly used on TVs and older monitors now.
https://www.theregister.com/2024/03/02/hdmi_blocks_amd_foss/
All current Intel Flex cards seem to be based on the previous gen "Xe".
I would happily buy 96 GB for $3490, but this makes very little sense.
I have this cool and quiet fetish so 70 W is making me extremely interested. IF it also works as a gaming GPU.