
Intel plans spinoff of FPGA unit

139 points | ChuckMcM | 2 years ago | networkworld.com

161 comments

[+] ChuckMcM|2 years ago|reply
Apparently they announced this in October. They are kicking Altera, which they acquired for $16.7 billion, back out into the world to live or die on its own. Which feels a bit odd given AMD's apparent relative success with Xilinx.

It also dampens their "US developed" technology pitch (which they had been pushing pretty hard vs. other solutions that were fabbed at TSMC). I wonder if they will also give up on third-party use of their fabs.

Intel was the first "real" job I ever had, and it was during Andy Grove's tenure as CEO; his "paranoid" attitude was really pervasive. I had been working on graphics chips, which were the next big thing until the chip recession, and then they weren't. It followed a long series of Intel sending out tendrils outside its core microcomputer winners (eventually x86/x51 only), only to snap them back at the first sign of challenges.

I wonder if we can get better open source tool support from the new entity. That would be a win.

[+] AceJohnny2|2 years ago|reply
> only to snap them back at the first sign of challenges.

It's really quite impressive how bad Intel's track record has been. Would you say it's a broad cultural problem within Intel? Bad incentives?

(I'd be interested if someone could come up with a list of all of Intel's failed ventures: StrongARM, Optane, Altera... Obviously some of these are par for the course for a company as large as Intel, but Intel still seems to stand out.)

(I know one reason Apple is so secretive about its ventures is that it knows many of them won't pan out, and it knows that due to its size anything it does has huge repercussions, and it wants to avoid that.)

[+] Thimothy|2 years ago|reply
When Intel acquired Altera, they scrapped the old catalog of Altera FPGAs. A few corps I was working with, who had heavy lock-in to Altera due to those FPGAs, got royally screwed (they suddenly had to redesign and re-certify a bunch of boards that they were selling to happy customers). Of course they went with Xilinx for the redesign, because f*ck you, and it didn't hurt that Xilinx's SoC offerings were far superior at the time. I don't think it was a cost-reduction issue; Altera was selling those old FPGAs at a BIG markup.

The amount of market I saw dissolve for Altera in a puff was incredible. To this day it's one of the most shortsighted corp decisions I have ever witnessed.

[+] AceJohnny2|2 years ago|reply
> I wonder if we can get better open source tool support from the new entity. That would be a win.

Tragically, I wouldn't bet on it. In fact, I'd be shocked if that were to happen.

[+] raverbashing|2 years ago|reply
I think Intel is living their Nokia moment, except their lunch will be eaten much more slowly by the competition.
[+] 7speter|2 years ago|reply
I don't know; I don't have the credentials to be an engineer at Intel, but I think they know it's a bad idea to end or spin off their foundry service, especially now that they have a new client that will know the ins and outs of their fabrication process.
[+] burnte|2 years ago|reply
They do this so often: buy a company, sell it a few years later. So weird.
[+] UncleOxidant|2 years ago|reply
> I wonder if we can get better open source tool support from the new entity. That would be a win.

Unfortunately, this seems unlikely. The FPGA vendor tools are a complete shit show and open source tools would be greatly welcomed - they're doing really well in the Lattice/iCE40 space, but those are small FPGAs. Altera and Xilinx don't seem at all inclined to encourage the development of open source alternatives.

[+] wslh|2 years ago|reply
Side question: how do you see the future of Intel? Do you share the doom perspective, or do you think they continue to have opportunities to catch up? Thanks!
[+] jonnycoder|2 years ago|reply
Every year Intel announces a new failure, and every year I feel more ashamed for having Intel as the bulk of my software engineering experience on my resume. I saw the signs when I was still there and attending quarterly updates. In one breath they admitted to missing the mobile boat, and in another they said only gamers need powerful dedicated/discrete GPUs.
[+] 1-6|2 years ago|reply
This was actually a bright spot in Intel's lineup that had a chance to be a moonshot. I don't know why Intel is giving up this early, especially when AMD acquired Xilinx recently. Is Intel trying to emulate Nvidia? I don't think they should try to become another Nvidia. They should be building programmable AI chips with FPGAs. Heck, LLMs are able to code Verilog. There are many, many possibilities.
[+] contrarian1234|2 years ago|reply
They acquired Altera in 2015, which in the tech world is when the dinosaurs roamed the earth. How much more time would they need to see a profitable synergy? Bearing in mind they can probably see what's in the pipeline for the next couple of years.

I'm not an expert... but in my naive opinion it seems entirely reasonable to expect an acquisition to start paying off within ten years?

Would love to hear counterexamples

[+] adrr|2 years ago|reply
Because Intel messes up everything with their bureaucracy. Look at the NUC: they made it extremely hard to get because of their convoluted distribution strategy. They could have easily gone direct to consumer and made it easy to get.

Till they fix their culture problem, they should be unloading these business units and profiting by holding majority stakes in them.

[+] MBCook|2 years ago|reply
They gave up on XScale right before mobile demand shot to the moon.
[+] tester756|2 years ago|reply
Why "giving up"?

It's just a spinoff, and their people are at the executive levels.

[+] brucethemoose2|2 years ago|reply
> They should be building programmable AI chips with FPGA

Eh, people have been saying this for over a decade, and I think that opportunity has passed.

FPGAs are not going to beat GPUs anytime soon due to the software ecosystem (among other things), and ultimately they are not going to outrun ASICs that are now economically viable (especially in the embedded space).

[+] trynumber9|2 years ago|reply
Intel bought Altera in 2015, when it still thought 10nm would be on time and an advanced node. That did not work out. The idea was to get a better FPGA and have more customers to justify fab build-out expenses. Gelsinger more recently said he does not want to force products onto Intel fabrication: use Intel processes where it makes sense. No reason to push Altera FPGAs to Intel 10/7/4. No reason to push NICs to Intel 10/7/4. And so on.
[+] blackguardx|2 years ago|reply
This is pretty funny. Intel bought Altera, which forced AMD to buy Xilinx with all the zero interest rate money floating around. AMD's purchase of Xilinx made less sense because AMD is fabless, but Intel didn't end up doing anything with Altera. It's not clear if Altera even started using Intel fabs for its chips. AMD's Xilinx has been comparatively more successful, but I don't think that had anything to do with AMD.

Maybe we can look forward to all the ZIRP semiconductor consolidations to unwind.

[+] kjs3|2 years ago|reply
> AMD's purchase of Xilinx made less sense because AMD is fabless

Xilinx was fabless before the acquisition. I'm missing how that made less sense.

> with all the zero interest rate money floating around

Hehe...as one of my finance-world pals said: "everyone's doing M&As like drunken sailors".

[+] pclmulqdq|2 years ago|reply
The only "synergy" that has come from AMD-Xilinx is that AMD took a (relatively simple) DSP for machine learning that Xilinx had built and put it into their newer CPU lines. That's still better than Intel-Altera, which basically didn't integrate at all, despite having grandiose plans.
[+] Tuna-Fish|2 years ago|reply
It's funnier than you think.

Intel marketing built the market of x86 server + tightly integrated FPGA. Large customers built products based on their promises. Then they didn't deliver.

Then AMD bought Xilinx and started shipping to the market.

[+] imtringued|2 years ago|reply
The irony is that Xilinx is better at AI than AMD.
[+] 15155|2 years ago|reply
AMD and Xilinx both use newer TSMC nodes?
[+] JoshTriplett|2 years ago|reply
This is sad news; I was hoping one day we'd see chips with substantial on-die FPGA fabrics, ideally in ways that we could program with open tools. This announcement makes that less likely.
[+] rwmj|2 years ago|reply
Red Hat supported a lot of research into this, and there's some really interesting stuff, but nothing that is very compelling for commercial use. What uses do you have in mind?

To my mind the more interesting stuff is the PCIe FPGA boards like https://www.xilinx.com/products/boards-and-kits/alveo/u200.h...

One particularly interesting research project was using the FPGA fabric to remap addresses, allowing database tables to be "virtually" rearranged (eg. making a row-major data source into a column-major source for easier searching). https://disc.bu.edu/papers/edbt23-relational-memory https://arxiv.org/pdf/2109.14349.pdf
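The core trick in that remapping work is just an address-translation layer: software scans what looks like a contiguous column while the hardware gathers the strided row-major locations underneath. A toy sketch of the index math (pure Python, no FPGA involved; the function name and 2x3 example are mine, not from the papers):

```python
# Toy model of the row-major -> column-major remapping idea: a logical
# column-major address is translated to the physical row-major location.
def remap_col_to_row_major(logical_addr, num_rows, num_cols):
    """Map a column-major logical index to a row-major physical index."""
    col, row = divmod(logical_addr, num_rows)  # column-major order walks rows first
    return row * num_cols + col                # physical layout is row-major

# 2x3 table stored row-major: row a = [a0 a1 a2], row b = [b0 b1 b2]
table = ["a0", "a1", "a2", "b0", "b1", "b2"]
column_scan = [table[remap_col_to_row_major(i, 2, 3)] for i in range(6)]
print(column_scan)  # ['a0', 'b0', 'a1', 'b1', 'a2', 'b2']
```

In the research, this translation happens in the FPGA fabric on the memory path, so the database sees a dense column without ever copying the table.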

[+] Laaas|2 years ago|reply
The new Ryzen chips already have this as "XDNA" AFAIU
[+] lawlessone|2 years ago|reply
Ten years from now..

"Intel should have dominated this space but Xilinx etc. got lucky"

[+] vachina|2 years ago|reply
So what're they gonna name it this time? Altera? Hope they kept the original documentation.
[+] shrewm|2 years ago|reply
I'm thinking Ctrlera™.
[+] midtake|2 years ago|reply
I don't know if this is the common perception and I'm definitely a bit biased, but it seems like a bad sign for Intel. Without a paradigm shift I don't foresee anything improving for them. Their sales team must be insane though. I feel a bit "good riddance" about it too and wish ARM or RISC would become the standard for gaming and productivity (read: rendering, compiling, not Microsoft Word) PCs.
[+] sheepybloke|2 years ago|reply
Maybe I'm too stuck in my own sector, but I haven't seen an Altera FPGA in the wild in forever. All the Altera FPGAs I've seen are over 10 years old, and anything newer is a Xilinx FPGA. Granted, my field has moved purely to MPSoCs, but it's crazy to me that I've never seen an Altera FPGA even in the conversation.
[+] jasoneckert|2 years ago|reply
The first thing this reminded me of was when Intel got rid of StrongARM/XScale because they didn't think it would amount to much in the long run. Hopefully they don't regret this particular spinoff in the future.
[+] reachableceo|2 years ago|reply
One would presume Intel will get a decent chunk of the stock in any IPO and capture the upside value.

That does seem to be how these kinds of deals are usually structured. Spinco is 60% owned by the parent, or whatever.

[+] Kon-Peki|2 years ago|reply
I take exception to the usage of the word "spinoff". Intel is selling a portion of Altera. If this were a true spinoff, Intel shareholders would get shares in the new entity.

Intel needs the cash, so this is understandable.

[+] soulbadguy|2 years ago|reply
Large acquisitions rarely seem to pan out well in the tech sector, especially when big companies try to acquire their way into an adjacent market.

Also, some companies seem to be significantly worse at it than others; Microsoft/Dell come to mind. My suspicion is that those types of acquisitions are mainly driven by C-level/executive employees as a way to hide the real struggles of the company.

Is there a report analyzing big tech acquisitions, say for the last 30 years, and their economic impact? That would be an interesting read.

Maybe it's time for a new form of regulation around acquisitions.

[+] somethoughts|2 years ago|reply
It'd be interesting if some of the funds from the sale will be used for AI software development to provide a better coordinated response to CUDA.
[+] mardifoufs|2 years ago|reply
How's the FPGA market at the moment? Has Altera been able to keep up with Xilinx (or vice versa) under Intel ownership?
[+] 15155|2 years ago|reply
Xilinx's toolchain and chip offerings are substantially better.
[+] bfrog|2 years ago|reply
Maybe they will finally make VHDL-2008 part of the Lite edition and stop kneecapping it. Maybe create some updated parts that better fit the mid-range price segment instead of solely focusing on mega Agilex parts.

Their competitors don't seem to do this sort of thing.

[+] Aromasin|2 years ago|reply
Look up Agilex 5 and 3, announced late last year/early this year. Low- to mid-range devices. Early access customers are working on designs for the 5 now.
[+] 1-6|2 years ago|reply
I hope Nvidia comes along and scoops up Altera. That would teach Intel a lesson.
[+] ak217|2 years ago|reply
Yeah, it's interesting to compare Nvidia's strategy to Intel's. I'm sure there are quite a few Nvidia projects that have been cancelled, or even acquisitions liquidated, but they all seem to be small. Every significant part of Nvidia that I can remember is something they are committed to, sometimes over multiple decades, even when the market is not there and sales are near zero. This seems to come from actually having a consistent, stable long-term vision and buy-in all the way up to Jensen Huang serving as a driving force behind acquisitions and projects, unlike Intel, where the driving force seems to be bean counting and market domination.

To give credit to Pat Gelsinger, his stated goal is to shed non-essential units and refocus on the fundamentals. But I'm not sure how well that's going.

[+] dboreham|2 years ago|reply
FPGAs have never made sense. They're way too expensive to use in volume. There's no practical use case for "cool, I can reprogram the chip in the field to implement different functionality". Nobody has figured out how to usefully integrate them with a CPU to make a low-volume SoC. CPUs became so fast that most applications don't need custom hardware. Regular gate arrays are cheaper and faster above minimal volume.

They seem to only have been useful for prototyping and military applications (low volume and infinite budget).

[+] vatys|2 years ago|reply
I see them used in pro/prosumer audio equipment, synthesizers, and effects, which is relatively low volume and medium-to-high budget. FPGAs (and CPLDs, µC+AFE, etc) are great for these applications because they have great capabilities you might otherwise need a pile of discrete components or a custom chip for, but it doesn’t make sense to design fully custom silicon if you’re only ever going to sell about 50-500 of something.

So sure, prototyping and military, but there are other uses as well. But none of them are super high-volume because once you’re selling millions of something you should be designing your own chips.

[+] Bluebirt|2 years ago|reply
Consumer applications and FPGAs are an oxymoron. FPGAs are used in applications requiring special interfaces, special computing units, or other custom requirements. If there is enough demand, SoCs are developed for these applications, but that is only feasible in mid- to high-volume production. Areas like the ones you gave, and many more, make heavy use of FPGAs. I work in medical, for example. We use custom-designed chips for special detection purposes, but when it comes to data processing and interfacing with computers, we use FPGAs.
[+] aleph_minus_one|2 years ago|reply
> CPUs became so fast that most applications don't need customer hardware.

When complicated realtime signal processing is to be done, FPGAs shine - in particular if there exists no DSP that is competitive for the task.

[+] imtringued|2 years ago|reply
You somehow managed to write a post where every single sentence is absolutely wrong. FPGAs clearly make sense for prototyping ASICs. That alone makes FPGAs make sense, even if it's a tiny niche market. After all, the budgets of ASIC companies are big. A few hundred FPGAs for developers are a drop in the bucket compared to the cost of an ASIC.

"Too expensive in volume" only applies to Xilinx and Altera, and with every node shrink the number of designs that fit on an FPGA grows while the non-recurring development costs for ASICs grow. Due to this, the maximum volume at which FPGAs remain cost-competitive keeps growing with every generation.

Smart NICs make extensive use of reconfiguration because things such as protocols are not set in stone. They can change all the time. It is also possible to build designs that make extensive use of partial reconfiguration.

MPSoCs have been a thing for a long time. If you want to get into those, Google "Kria 260" and you will be pleasantly surprised.

When I look at Efinix FPGAs, those are designed specifically for vision applications and massive amounts of I/O. A CPU would struggle with multiple camera streams and consume too much power.

Yeah, OK, but a 100k-LUT FPGA chip from Efinix costs what, 25€? You're going to need a really high volume to get an overall cost saving. 40,000 FPGAs is only a million dollars. The masks for a 22nm ASIC cost 1.5 million dollars, without the rest of the development costs.

And finally, your last sentence contradicts the first. Next time, stick to one story.
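The cost comparison a few lines up boils down to a simple break-even check. A quick sketch using the thread's rough figures (25€ per FPGA, $1.5M mask set; the per-die ASIC cost is a made-up placeholder, and currencies are conflated just as they are in the comment):

```python
# Break-even sketch: total FPGA cost is linear in volume; ASIC cost is a
# large fixed NRE (masks) plus a much smaller per-die cost.
FPGA_UNIT_COST = 25.0       # EUR per 100k-LUT device (figure from the thread)
ASIC_NRE = 1_500_000.0      # 22nm mask-set cost (figure from the thread)
ASIC_UNIT_COST = 5.0        # hypothetical per-die cost, for illustration

def total_fpga(volume):
    return FPGA_UNIT_COST * volume

def total_asic(volume):
    return ASIC_NRE + ASIC_UNIT_COST * volume

# Solve FPGA_UNIT_COST * v = ASIC_NRE + ASIC_UNIT_COST * v for v:
break_even = ASIC_NRE / (FPGA_UNIT_COST - ASIC_UNIT_COST)
print(int(break_even))      # 75000 units before the ASIC masks pay for themselves
```

At 40,000 units the FPGAs total exactly 1M, still well under the mask cost alone, which is the comment's point; shrink the per-unit FPGA price or the volume and the ASIC never catches up.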

[+] crotchfire|2 years ago|reply
The problem is that FPGA companies are really CAD tool companies who see their chips as copy-protection/payment-assurance schemes for their software.

Unfortunately their CAD tools suck, but that's beside the point.