
Intel Arc Pro B50 GPU Launched at $349 for Compact Workstations

206 points | qwytw | 5 months ago | guru3d.com

267 comments


tester756|5 months ago

https://www.phoronix.com/review/intel-arc-pro-b50-linux

>Overall the Intel Arc Pro B50 was at 1.47x the performance of the NVIDIA RTX A1000 with that mix of OpenGL, Vulkan, and OpenCL/Vulkan compute workloads both synthetic and real-world tests. That is just under Intel's own reported Windows figures of the Arc Pro B50 delivering 1.6x the performance of the RTX A1000 for graphics and 1.7x the performance of the A1000 for AI inference. This is all the more impressive when considering the Arc Pro B50 price of $349+ compared to the NVIDIA RTX A1000 at $420+.

swiftcoder|5 months ago

IIRC, the RTX A1000 is an RTX 3050 8GB with ~10% of the shader cores disabled, retailing for double the price of a 3050?

I guess it's a boon for Intel that NVidia repeatedly shoots their own workstation GPUs in the foot...

zamadatix|5 months ago

Comparing price to performance in this space might not make as much sense as it would seem. One of the (very few) interesting qualities of the A1000 is that it's a single-slot, low-profile workstation GPU. Intel kept the "powered by the PCIe slot" aspect, but made it dual slot and full height. Needing a "workstation" GPU in a tiny form factor (i.e. a machine not meant to take and power full-sized GPUs) was something one could charge a premium for, but the only selling point of this is the price.

bsder|5 months ago

Put 32GB on that card and everybody would ignore performance issues.

With 16GB everybody will just call it another in the long list of Intel failures.

colechristensen|5 months ago

The #3 player just released something that compares well on price/performance with the #1 player's release from a year and a half ago... yep

moffkalast|5 months ago

> 1.7x the performance of the A1000 for AI inference

That's a bold claim when their acceleration software (IPEX) is barely maintained and incompatible with most inference stacks, and their Vulkan driver is far behind it in performance.

mythz|5 months ago

Really confused why Intel and AMD both continue to struggle and yet still refuse to offer what Nvidia won't, i.e. high-RAM consumer GPUs. I'd much prefer paying 3x the cost for 3x the VRAM (48GB/$1047), 6x for 6x (96GB/$2094), 12x for 12x (192GB/$4188), etc. They'd sell like hotcakes and software support would quickly improve.

At 16GB I'd still prefer to pay a premium for NVidia GPUs given their superior ecosystem. I really want to get off NVidia, but Intel/AMD aren't giving me any reason to.
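
The tiers proposed above are just linear extrapolation from the B50's launch price; a quick sketch (the tier function and its name are illustrative, not any vendor's pricing):

```python
# Hypothetical linear price-per-VRAM tiers extrapolated from the
# Arc Pro B50's $349 / 16 GB base configuration.
BASE_PRICE = 349   # USD, B50 MSRP
BASE_VRAM = 16     # GB

def tier(multiple: int) -> tuple[int, int]:
    """Return (vram_gb, price_usd) for a given multiple of the base card."""
    return BASE_VRAM * multiple, BASE_PRICE * multiple

for m in (3, 6, 12):
    vram, price = tier(m)
    print(f"{vram} GB -> ${price}")   # 48 GB -> $1047, 96 GB -> $2094, 192 GB -> $4188
```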

fredoralive|5 months ago

Because the market of people who want huge RAM GPUs for home AI tinkering is basically about 3 Hacker News posters. Who probably won’t buy one because it doesn’t support CUDA.

PS5 has something like 16GB unified RAM, and no game is going to really push much beyond that in VRAM use, we don’t really get Crysis style system crushers anymore.

daemonologist|5 months ago

This card does have double the VRAM of the more expensive Nvidia competitor (the A1000, which has 8 GB), but I take your point that it doesn't feel like quite enough to justify giving up the Nvidia ecosystem. The memory bandwidth is also... not great.

They also announced a 24 GB B60 and a double-GPU version of the same (saves you physical slots), but it seems like they don't have a release date yet (?).

cmxch|5 months ago

Maxsun does offer a high VRAM (48GB) dual Arc Pro B60, but the only US availability has it on par with a 5090 at ~$3000.

Ekaros|5 months ago

I am not sure there is a significant enough market for those, that is, selling enough consumer units to cover all design and other costs. From a gamer's perspective 16GB is now a reasonable point. 32GB is the most one would really want, and even that at no more than say $100 extra.

This to me is the gamer perspective. This segment really does not need even 32GB, let alone 64GB or more.

zdw|5 months ago

I doubt you'd get linear scaling of price/capacity - the larger capacity modules are more expensive per GB than smaller ones, and in some cases are supply constrained.

The number of chips on the bus is usually pretty low (1 or 2 of them on most GPUs), so GPUs tend to have to scale out their memory bus widths to get to higher capacity. That's expensive and takes up die space, and for the conventional case (games) isn't generally needed on low end cards.

What really needs to happen is someone needs to make some "system seller" game that is incredibly popular and requires like 48GB of memory on the GPU to build demand. But then you have a chicken/egg problem.

Example: https://wccftech.com/nvidia-geforce-rtx-5090-128-gb-memory-g...
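
The bus-width point can be made concrete: peak GDDR bandwidth is bus width in bytes times per-pin data rate. Assuming 14 Gbps GDDR6 (an assumption chosen because it reproduces the B50's quoted 224 GB/s on its 128-bit bus), doubling capacity via a wider bus doubles the expensive pad area too:

```python
def gddr_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 128-bit bus at an assumed 14 Gbps per pin matches the B50's quoted figure:
print(gddr_bandwidth_gbs(128, 14))   # 224.0 GB/s
# A 256-bit bus doubles bandwidth (and capacity headroom) but costs die area:
print(gddr_bandwidth_gbs(256, 14))   # 448.0 GB/s
```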

YetAnotherNick|5 months ago

> I'd much prefer paying 3x cost for 3x VRAM

Why not just buy 3 cards then? These cards don't require active cooling anyway and you can fit 3 in a decent-sized case. You will get 3x the VRAM bandwidth and 3x the compute. And if your use case is LLM inference, it will be a lot faster than one card with 3x the VRAM.
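
Rough intuition for why the multi-card setup wins at decode time: memory-bound LLM inference streams the whole model through memory once per generated token, so aggregate bandwidth sets the ceiling. A sketch with made-up model size, ignoring interconnect and scheduling overhead:

```python
def tokens_per_sec(model_gb: float, total_bw_gbs: float) -> float:
    """Rough decode-rate ceiling for memory-bound inference:
    each token requires streaming all model weights once."""
    return total_bw_gbs / model_gb

MODEL_GB = 40    # hypothetical model that roughly fills 48 GB with KV cache
CARD_BW = 224    # GB/s, one B50

one_big_card = tokens_per_sec(MODEL_GB, CARD_BW)        # one 48 GB card at B50 bandwidth
three_cards = tokens_per_sec(MODEL_GB, CARD_BW * 3)     # layers split across 3 cards
print(one_big_card, three_cards)   # 5.6 vs 16.8 tokens/s ceiling
```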

0x500x79|5 months ago

I think it's a bit of planned obsolescence as well. The 1080 Ti was a monster with its 11GB of VRAM up until this generation. A lot of enthusiasts point out that Nvidia won't make that mistake again, since it led to longer upgrade cycles.

akvadrako|5 months ago

AMD Strix Halo has about 100GB of VRAM for around $1500 if that's all you care about.

kristopolous|5 months ago

You want an M3 ultra Mac studio

doctorpangloss|5 months ago

they don't manufacture RAM, so none of the margin goes to them

wewewedxfgdf|5 months ago

The new CEO of Intel has said that Intel is giving up competing with Nvidia.

Why would you bother with any Intel product with an attitude like that? It gives zero confidence in the company. What business is Intel in, if not competing with Nvidia and AMD? Is it giving up competing with AMD too?

jlei523|5 months ago

> The new CEO of Intel has said that Intel is giving up competing with Nvidia.

No, he said they're giving up competing against Nvidia in training. Instead, he said Intel will focus on inference.

That's the correct call in my opinion. Training is far more complex and will soon span multiple data centers. Intel is too far behind. Inference is much simpler and likely a bigger market going forward.

SadTrombone|5 months ago

AMD has also often said that they can't compete with Nvidia at the high end, and as the other commenter said: market segments exist. Not everyone needs a 5090. If anything, people are starved for options in the budget/mid-range market, which is where Intel could pick up a solid chunk of market share.

ksec|5 months ago

>What business is Intel in, if not competing with Nvidia and AMD.

The foundry business. The latest report on discrete graphics market share has Nvidia at 94%, AMD at 6% and Intel at 0%.

I may still have another 12 months to go, but in 2016 I made a bet against Intel engineers on Twitter and offline, suggesting GPUs were not a business they wanted to be in, or at least that they were too late. They said at the time they would get 20% market share minimum by 2021. I said I would be happy if they managed even 20% by 2026.

Intel is also losing money, and they need cash flow to compete in the foundry business. I have long argued they should have cut the GPU segment when Pat Gelsinger arrived; it turns out Intel bound itself to GPUs through all the government contracts and supercomputers it promised to deliver. Now that they have delivered all or most of them, they will need to think about whether to continue or not.

Unfortunately, unless the US points guns at TSMC, I just don't see how Intel will be able to compete, as Intel needs a leading-edge position in order to command the margins required for Intel to function. Right now, in terms of density, Intel 18A is closer to TSMC N3 than N2.

grg0|5 months ago

Zero confidence why? Market segments exist.

I want hardware that I can afford and own, not AI/datacenter crap that is useless to me.

ryao|5 months ago

I thought that he said that they gave up at competing with Nvidia at training, not in general. He left the door open to compete on inference. Did he say otherwise more recently?

mathnode|5 months ago

Because we don't need data centre hardware to run domestic software.

MangoToupe|5 months ago

I don't really want an nvidia gpu; it's too expensive and I won't use most of it. This actually looks attractive.

ocdtrekkie|5 months ago

NVIDIA cards are unironically over $3,500 at the store in some cases...

jasonfrost|5 months ago

Isn't Intel the only largely domestic fab?

jazzyjackson|5 months ago

Huh, I didn't realize these were just released. I came across it looking for a GPU with AV1 hardware encoding while putting a shopping cart together for a mini-ITX Xeon server for all my ffmpeg shenanigans.

I like to Buy American when I can, but it's hard to find out which fabs various CPUs and GPUs are made in. I read Kingston does some RAM here and Crucial some SSDs. Maybe the silicon is fabbed here, but everything I found is "assembled in Taiwan", which made me feel like I should get my dream machine sooner rather than later.

dangus|5 months ago

I have the answer for you, Intel's GPU chips are on TSMC's process. They are not made in Intel-owned fabs.

There really is no such thing as "buying American" in the computer hardware industry unless you are talking about the designs rather than the assembly. There are also critical parts of the lithography process that depend on US technology, which is why the US is able to enforce certain sanctions (and due to some alliances with other countries that own the other parts of the process).

Personally I think people get way too worked up about being protectionist when it comes to global trade. We all want to buy our own country's products over others but we definitely wouldn't like it if other countries stopped buying our exported products.

When Apple sells an iPhone in China (and they sure buy a lot of them), Apple is making most of the money in that transaction by a large margin, and in turn so are you since your 401k is probably full of Apple stock, and so are the 60+% of Americans who invest in the stock market. A typical iPhone user will give Apple more money in profit from services than the profit from the sale of the actual device. The value is really not in the hardware assembly.

In the case of electronics products like this, almost the entire value add is in the design of the chip and the software that is running on it, which represents all the high-wage work, and a whole lot of that labor in the US.

US citizens really shouldn't envy a job where people are sitting at an electronics bench doing repetitive assembly work for 12 hours a day in a factory wishing we had more of those jobs in our country. They should instead be focused on making high level education more available/affordable so that they stay on top of the economic food chain, where most/all of its citizens are doing high-value work rather than causing education to be expensive and beg foreign manufacturers to open satellite factories to employ our uneducated masses.

I think the current wave of populist protectionist ideology essentially blames the wrong causes for declining affordability and increasing inequality for the working class. People think that bringing the manufacturing jobs back and reversing globalism will right the ship on income inequality, but the reality is that the reason equality was so good for Americans in the mid-century was that the wealthy were taxed heavily, European manufacturing was decimated in WW2, and labor was in high demand.

The above of course is all my opinion on the situation, and a rather long tangent.

bane|5 months ago

You may want to check whether your Xeon already supports hardware encoding of AV1 in the iGPU. I saved a bundle building a media server when I realized the iGPU was more than sufficient (and more efficient) compared to chucking a GPU in the case.

I have a service that runs continuously and re-encodes any videos I have into H.265, and the iGPU barely even notices it.
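
A service like that typically boils down to one ffmpeg invocation per file using the Quick Sync (QSV) encoders. A minimal sketch of building such a command; the file paths and quality value are illustrative:

```python
def qsv_hevc_cmd(src: str, dst: str, quality: int = 25) -> list[str]:
    """Build an ffmpeg command that decodes and encodes on an Intel iGPU
    via Quick Sync. Paths and quality are placeholder values."""
    return [
        "ffmpeg",
        "-hwaccel", "qsv",            # hardware decode where supported
        "-i", src,
        "-c:v", "hevc_qsv",           # Quick Sync HEVC encoder
        "-global_quality", str(quality),
        "-c:a", "copy",               # leave audio untouched
        dst,
    ]

print(" ".join(qsv_hevc_cmd("in.mkv", "out.mkv")))
```

A re-encode daemon would just walk the media directory and run this via `subprocess` for each file that isn't already H.265.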

Havoc|5 months ago

If you don't need it for AI shenanigans then you're better off with the smaller Arcs for under $100... they can do AV1 too.

dale_glass|5 months ago

What about the B60, with the 24GB VRAM?

Also, do these support SR-IOV, as in handing slices of the GPU to virtual machines?

wqaatwt|5 months ago

SR-IOV is allegedly coming in the future (just like the B60).

cmxch|5 months ago

It’s sort of out there but being scalped by AIBs.

Havoc|5 months ago

Good pricing for 16GB of VRAM. I can see that finding a use in some home servers.

syntaxing|5 months ago

Kinda bummed that it’s $50 more than originally said. But if it works well, a single slot card that can be powered by the PCIe slot is super valuable. Hoping there will be some affordable prebuilds so I can run some MoE LLM models.

jeffbee|5 months ago

Competing workstation cards like the RTX A2000 also do not need power connectors.

mrheosuper|5 months ago

Am I missing anything? Because it looks like a double-slot GPU.

srmatto|5 months ago

GPU prices really surprise me. Most PC part prices have remained the same over the decades, with storage and RAM actually getting cheaper. GPUs, however, have gotten extremely expensive. $350 used to get you a really good GPU about 20 years ago; I think top of the line was around $450-500. Now it only gets you entry level, and top of the line is $1500+!

HDThoreaun|5 months ago

Datacenter GPU margins are 80%+. Consumer margins are more like 25%. Any company with a datacenter product that sells out is just going to put all their fab allocation toward that and ignore the consumer segment. Plus, these companies are really worried about their consumer products being used in datacenters and cannibalizing their money maker, so they kneecap the consumer VRAM to make sure that doesn't happen.
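
The fab-allocation incentive is easy to see with the margins cited above and some made-up prices (both dollar figures below are illustrative assumptions, not real SKU prices):

```python
# Gross profit per sale at the margins quoted in the comment.
def profit(price_usd: float, margin: float) -> float:
    return price_usd * margin

datacenter = profit(25_000, 0.80)  # hypothetical datacenter accelerator price
consumer = profit(600, 0.25)       # hypothetical consumer card price
print(datacenter / consumer)       # each datacenter sale ~= 133 consumer sales
```

With fab capacity fixed, every wafer diverted to the consumer line forgoes roughly two orders of magnitude of profit under these assumptions.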

jeroenhd|5 months ago

Cryptocurrency, NFTs, and AI ruined the GPU market.

guluarte|5 months ago

all thanks to crypto and AI

imiric|5 months ago

I really think Intel is on the right track to dethrone both AMD and NVIDIA, while also competing with ARM SoCs. It's fascinating to watch.

Both their integrated and dedicated GPUs have been steadily improving each generation. The Arc line is cheaper than, and comparable in performance to, more premium NVIDIA cards. The 140T/140V iGPUs do the same to AMD APUs. Their upcoming Panther Lake and Nova Lake architectures seem promising, and will likely push this further. Meanwhile, they're also more power efficient and run cooler, to the point where Apple's lead with their ARM SoCs is not far off. Sure, the software ecosystem is not up to par with the competition yet, but that's a much easier problem to solve, and they've been working on that front as well.

I'm holding off on buying a new laptop for a while just to see how this plays out. But I really like how Intel is shaking things up, and not allowing the established players to rest on their laurels.

bitmasher9|5 months ago

It’s interesting that it uses 4 Display Ports and not a single HDMI.

Is HDMI seen as a "gaming" feature, or is DP seen as a "workstation" interface? Ultimately HDMI is a brand that commands higher royalties than DP, so I suspect this decision was largely made to minimize costs. I wonder what percentage of the target audience has HDMI-only displays.

Aurornis|5 months ago

DisplayPort is the superior option for monitors. High end gaming monitors will have DisplayPort inputs.

Converting from DisplayPort to HDMI is trivial with a cheap adapter if necessary.

HDMI is mostly used on TVs and older monitors now.

cjbconnor|5 months ago

4x Mini DP is common for low profile workstation cards, see the Quadro P1000, T1000, Radeon Pro WX 4100, etc.

shmerl|5 months ago

DP is perfectly fine for gaming (it's better than HDMI). The only reason HDMI is lingering around is the cartel which profits from patents on it, and manufacturers of TVs which stuff them with HDMI and don't provide DP or USB-C ports.

Otherwise HDMI would have been dead a long time ago.

amiga-workbench|5 months ago

Because you can actually fit 4 of them without impinging airflow from the heatsink. Mini HDMI is mechanically ass and I've never seen it anywhere but junky Android tablets. DP also isn't proprietary.

dale_glass|5 months ago

HDMI requires paying license fees. DP is an open standard.

KetoManx64|5 months ago

There are inexpensive ($10ish) converters that do DP > HDMI, but the inverse is much more expensive ($50-100)

hamdingers|5 months ago

Can't fit 4 of anything else in a half height bracket.

glitchc|5 months ago

The latest DP standard has higher bandwidth and can support higher framerates at the same resolution.

mrinterweb|5 months ago

DisplayPort is the interface most gaming monitors use. If you don't need ARC (audio return channel) or CEC, which are mostly used for home theater builds, DisplayPort is preferable.

StrangeDoctor|5 months ago

There’s also weirdness with the drivers and hdmi, I think around encryption mainly. But if you only have DP and include an adapter, it’s suddenly “not my problem” from the perspective of Intel.

emmelaich|5 months ago

My monitor does 165Hz with DP, only 150 with HDMI.
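
That 165 vs. 150 split is plausibly just link bandwidth. Assuming a 1440p monitor at 8 bits per channel (the resolution is an assumption, not stated above), the raw pixel rate at 165Hz lands just above HDMI 2.0's ~14.4 Gbps effective data rate, while 150Hz fits:

```python
def uncompressed_gbps(w: int, h: int, hz: int, bpc: int = 8) -> float:
    """Raw RGB pixel-data rate in Gbit/s (ignores blanking, which adds a few %)."""
    return w * h * hz * bpc * 3 / 1e9

# Effective post-encoding rates: HDMI 2.0 ~14.4 Gbps, DP 1.4 HBR3 ~25.9 Gbps.
print(uncompressed_gbps(2560, 1440, 165))  # ~14.6 -> just over HDMI 2.0's budget
print(uncompressed_gbps(2560, 1440, 150))  # ~13.3 -> fits HDMI 2.0
```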

nottorp|5 months ago

HDMI is shit. If you've never had problems with random machine hdmi port -> hdmi cable -> hdmi port on monitor you just haven't had enough monitors.

> Is HDMI seen as a “gaming” feature

It's a tv content protection feature. Sometimes it degrades the signal so you feel like you're watching tv. I've had this monitor/machine combination that identified my monitor as a tv over hdmi and switched to ycbcr just because it wanted to, with assorted color bleed on red text.

littlecranky67|5 months ago

I am confused as a lot of comments here seem to argue around gaming, but isn't this supposed to be a workstation card, hence not intended to be used for games? The phoronix review also seems to only focus on computing usage, not gaming.

wink|5 months ago

I really wonder who this is for?

It's not competing with AMD/Nvidia at twice the price in terms of performance, but it's also too expensive for a cheap gaming rig. And then there are people who are happy with integrated graphics.

Maybe I'm just lacking imagination here, I don't do anything fancy on my work and couch laptops and I have a proper gaming PC.

ryukoposting|5 months ago

Last time I had anything to do with the low-mid range pro GPU world, the use case was 3D CAD and certain animation tasks. That was ~10 years ago, though.

numpad0|5 months ago

CAD and medical were always the use cases for high-end workstations and professional GPUs. Companies designing jets and cars need more than an iGPU, but they prefer slim desktops and something distanced from games.

sznio|5 months ago

Accelerated AV1 encoding for a home server.

hackerfoo|5 months ago

I’m interested in putting one of these in a server because of the relatively low power usage and compact size.

topspin|5 months ago

An obvious use case is high-end NVRs. Low power, ample GPU for object detection/tracking, ample encoders for streaming. Should make a good surveillance platform.

With SR-IOV* there is a low cost path for GPU in virtual machines. Until now this has (mostly) been a feature exclusive to costly "enterprise" GPUs. Combine that with the good encoders and some VDI software and you have VM hosted GPU accelerated 3D graphics to remote displays. There are many business use cases for this, and no small number of "home lab" use cases as well.

Linux is a first class citizen with Intel's display products, and B50/60 is no different, so it's a nice choice when you want a GPU accelerated Linux desktop with minimum BS. Given the low cost and power, it could find its way into Steam consoles as well.

Finally, Intel is the scrappy competitor in this space: they are being very liberal with third parties and their designs, unlike the incumbents. We're already seeing this with Maxsun and others.

* Intel has promised this for B50/60 in Q4

ytch|5 months ago

Another advantage of Intel GPUs is vGPU SR-IOV, which the consumer cards of NVIDIA and AMD don't support. Even the integrated GPUs of the N100 and N97 support it[1].

Therefore I can install Proxmox VE and run multiple VMs, assigning a vGPU to each of them for video transcoding (IPCam NVR), AI and other applications.

https://github.com/Upinel/PVE-Intel-vGPU

dev1ycan|5 months ago

I really hope Intel continues with GPUs, or the GPU market is doomed until China catches up. Nvidia produces good products with great software, best in the industry really, with lengthy support, but that doesn't excuse their monopolistic practices. The fact that AMD refuses to compete really makes it look like this entire thing is organized from the top (US government).

This reminds me a lot of the LLM craze and how they wanted to charge so much for simple usage at the start, until China released DeepSeek. Ideally we shouldn't rely on China, but do we have a choice? The entire US economy has become reliant on monopolies to keep their insanely high stock prices and profit margins.

donkeybeer|5 months ago

Do you think family relations between the CEOs are a factor, or is that not relevant?

jaggs|5 months ago

How does it compare to an RTX 5060 ti with 16 gigabytes of VRAM?

Tepix|5 months ago

If you buy Intel Arc cards for their competitive video encoding/decoding capabilities, it appears that all of them are still capped at 8 parallel streams. The "B" series have more headroom at high resolutions and bitrates, on the other hand some "A" series cards need only a single PCIe slot so you can stick more of them into a single server.

kev009|5 months ago

Is there a way to get acceptable performance out of these without resizable BAR now? To retromod older business desktops.

bfrog|5 months ago

I'm glad Intel is continuing to make GPUs, really. But ultimately it seems like an uphill battle against a very entrenched monopoly with a software and community moat that was built up over nearly 20 years at this point. I wonder what it will take to break through.

mixmastamyk|5 months ago

Compact? Looks about a foot long and two slots wide. Not abnormal but not what I’d call compact either.

daemonologist|5 months ago

It's half-height (fits in "slim" desktops, those media center PCs, and in a 2U server without having to turn it sideways/use a riser), and barely longer than the PCIe socket. Phoronix has a picture with a full-height bracket which maybe gives a better point of comparison: https://www.phoronix.com/review/intel-arc-pro-b50-linux

(A half-height single-slot card would be even smaller, but those are vanishingly rare these days. This is pretty much as small as GPUs get unless you're looking more for a "video adapter" than a GPU.)

lanthade|5 months ago

Agreed. I have an A40 GPU in an epyc system right now specifically because it's a single slot card. I did not pay for gobs of PCIE expansion in this system just to block slots with double wide GPUs. Sure it can't do the heavy lift of some beefier cards but there is a need for single space cards still.

Tepix|5 months ago

When will we see Intel Flex datacenter cards that do not have the 8 stream limit based on the Xe2 "battlemage" architecture?

All current Intel Flex cards seem to be based on the previous gen "Xe".

shrubble|5 months ago

With the power being 70W from the connector only, how feasible is it to have 3 per server and have effectively 48GB VRAM for tasks?

adgjlsfhk1|5 months ago

You're probably better off with the incoming B60 which has 24GB vram.

addisonj|5 months ago

I think the answer to that right now is highly workload dependent. From what I have seen, it is improving rapidly, but still very early days for the software stack compared to Nvidia

pshirshov|5 months ago

> 16 GB of GDDR6 VRAM

I would happily buy 96 GB for $3490, but this makes very little sense.

jacquesm|5 months ago

How is the software support side for AI work with this card?

samspenc|5 months ago

Not bad with 16 GB VRAM, a bit disappointing on performance though, looking at the Blender 3D (open source) benchmarks: https://opendata.blender.org/benchmarks/query/?compute_type=...

It clocks in at 1503.4 samples per second, behind the NVidia RTX 2060 (1590.93 samples / sec, released Jan 2019), AMD Radeon RX 6750 XT (1539, May 2022), and Apple M3 Pro GPU 14 cores (1651.85, Oct 2023).

Note that this comparison is just ray-traced rendering, which is most relevant to games, but it might give some sense of how the card's performance stacks up against the competition.

adgjlsfhk1|5 months ago

It wouldn't surprise me if there was 10-20% perf improvement in drivers/software for this. Intel's architecture is pretty new and nothing is optimized for it yet.

bee_rider|5 months ago

Is that a comparison of the raytracing fixed function hardware for the various GPUs, or is it a GPGPU comparison?

Ekaros|5 months ago

I think the note there against the 2060 and 6750 XT is power efficiency: about half or a quarter of the TDP...

sharts|5 months ago

The upside with this is support for SR-IOV

arresin|5 months ago

> 224 GB/s of effective bandwidth

solardev|5 months ago

Will Intel even be around in a few years to support this thing, especially in software? They seem to be in their death spasms...

altcognito|5 months ago

Intel is doing poorly, but I believe Apple was in much, much worse shape than this in the early 2000s. AMD was also in much, much worse shape than this.

Intel has many, many solid customers at the government, enterprise and consumer levels.

They will be around.

nottorp|5 months ago

... but how's video game compatibility with Intel these days?

I have this cool and quiet fetish so 70 W is making me extremely interested. IF it also works as a gaming GPU.

DrNosferatu|5 months ago

Now launch models with 32GB and 64GB VRAM for those fat LLMs.

mananaysiempre|5 months ago

A $350 “workstation” GPU with 16 GB of VRAM? I... guess, but is that really enough for the kinds of things that would have you looking for workstation-level GPUs in the first place?

adgjlsfhk1|5 months ago

The closest comparable on the Nvidia side is the RTX Pro 2000 which is 16GB vram for $650 (and likely more compute).