One of the wild things to me is how incredibly elaborate chipmaking has become over the years. Per Wikipedia, the 6502's layout was made with "a very manual process done with color pencils and vellum paper". That's the processor that launched the personal computing revolution, powering the Apple II, Commodore PET and VIC-20, the Acorn, and the BBC Micro. Nearly 50 years later, things have gotten so fiendishly complex that "flip it over and put some stuff on the back" is a major industry change requiring who knows how many billions in R&D.
Does Backside Power Delivery mean the silicon area of the chip will be placed upside down on the motherboard to ensure efficient heat dissipation? (Since the power source will be on the opposite side compared to current day silicon)
Relevant quote from TFA which somewhat answers the above question:
> Of note, because the carrier wafer is on the signal side of the chip, this means it presents another layer of material between the transistors and the cooler. Intel’s techniques to improve heat transfer take this into account, but for PC enthusiasts accustomed to transistors at the top of their chip, this is going to be a significant change.
Backside power delivery should help portables and maybe bring some of these crazy desktop numbers down. If all goes as planned, Intel looks to have a two-year jump on TSMC based on current timetables. But I'm worried about them adding nanosheet transistors around the same time. They got into trouble before by being too ambitious, and their roadmap stalled for a very long time.
Assuming they deliver on the timetable they say they will, which Intel has a well-established track record of not doing. With Intel, over the last 5-10 years, I don't believe anything they say until there are chips in people's hands at a volume that is meaningful.
Reading stuff like this makes some part of me really happy. I'll probably never even get into this part of the industry, but reading about breakthroughs like this always feels really amazing.
I worked at Intel for 3 years and I felt similarly. I was in a group writing software to help the R&D physicists figure out the kinks of the low-level devices (next-gen transistors) way before going to silicon. The day-to-day work was relatively mundane, but talking to these people about their problems and needs made me feel like I was doing my little bit to move humankind forward.
Heh, I remember when this was the norm: Samsung, TSMC, and GloFo waiting for Intel to work out the kinks and tooling before implementing it themselves.
To be fair, most of that research happens pre-competitively (cf. the imec roadmap). That includes working with tool makers to work out the kinks. Of course, there is a big hurdle from there to an actual product-scale deployment.
Could lower-grade or even waste silicon blanks be used for the power component?
Could a coarser process 'scale' be used for the power lines, making their production more assured and fault-tolerant?
If it has to lock-step with the generational burdens of masking and process technology, and can't capitalize on otherwise-unused silicon, it doubles costs in those inputs. That doesn't mean it can't deliver both an improved power budget and better signals (less interference); it might even become a lower-cost option, if the bonding/positioning/lapping processes aren't too expensive and its input costs, plus their consequences for wiring-plane costs, beat the alternatives to boot!
Wafers are cheap. Bonding is not. And grinding is expensive. (Just ask Harris and Burr-Brown, who introduced this step ages ago for their processes requiring dielectric isolation.) So if that's cheaper than a few EUV masks, that says a lot about EUV!
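The tradeoff above - backside power only pencils out if a cheap wafer plus expensive bonding and grinding undercuts the extra EUV masks it replaces - can be sketched as a toy per-wafer comparison. Every dollar figure below is a made-up placeholder for illustration, not real fab cost data:

```python
# Toy per-wafer cost comparison for frontside vs. backside power routing.
# All dollar figures are hypothetical placeholders, NOT real fab cost data.

def incremental_cost(steps):
    """Sum the incremental per-wafer cost (USD) of a list of (name, cost) steps."""
    return sum(cost for _name, cost in steps)

# Option A: keep power on the frontside, paying for extra EUV metal layers.
frontside_extra = [
    ("extra EUV mask/exposure #1", 400.0),  # placeholder
    ("extra EUV mask/exposure #2", 400.0),  # placeholder
]

# Option B: backside power delivery (cheap extra wafer, costly bond and grind).
backside_extra = [
    ("carrier wafer", 50.0),          # "wafers are cheap"
    ("wafer bonding", 300.0),         # "bonding is not"
    ("grinding/thinning", 250.0),     # "grinding is expensive"
    ("backside metallization", 150.0),
]

print("frontside extra:", incremental_cost(frontside_extra))  # 800.0
print("backside extra: ", incremental_cost(backside_extra))   # 750.0
```

With these invented numbers the backside flow barely wins; the real comparison depends entirely on actual bonding, grinding, and EUV mask-set costs, which is exactly the point being made.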
Apparently chip-making uses (or used?) chlorine trifluoride - the stuff that can set asbestos or sand on fire - to clean chemical vapour deposition chambers.
Intel, just like Boeing and, to a lesser extent, many other F500 companies, appears to be fully financialized at this point. Their executives probably wake up each day and think about how much stock to buy back and nothing else.
Do they even remember what it is the company used to do for a living?
What does "start leading" even mean? Every CPU launch, AMD is ahead and then Intel beats them, or Intel is ahead and then AMD beats them. There is no leading here. There are only wins for people who want CPUs that keep getting faster, cooler, require fewer watts to run, and overall improve generation upon generation.
BSI requires avoiding having wiring on both sides, so that there's a side where no wiring will interfere with incoming light. But it does at least share the idea of polishing down the die so the layer of bulk silicon under the transistors is very thin.
It looks like Intel still has a solid lead in single core perf, which is frankly the biggest factor for me for a general purpose desktop CPU. Of course, other uses have other priorities. The charts are missing one important measure, power efficiency.
PowerVia is more of a fab technology, and AMD is fabless, so it's not really possible to compare them directly in this respect. AMD relies on TSMC or Samsung to make its chips.
metadat | 2 years ago:
P.s. in case you never heard of BPD technology, https://semiengineering.com/challenges-in-backside-power-del... is also informative (thanks @rektide for the Anand link!)
TL;DR: No.
marricks | 2 years ago:
This may be to their benefit, as it gives them another 5 years to say "oh, we'll be ahead soon" without actually releasing anything.
wincy | 2 years ago:
I hope Nvidia adds backside power delivery to their new CuLitho chips, just so we can get as many jokes as possible out of this fab cycle.
DavidPiper | 2 years ago:
Second thought was "I wonder if it reduces fan noise."
ftxbro | 2 years ago:
I thought maybe it was only the ones in the article who were saying it, but no, it's also in the Intel advertisement video: https://www.youtube.com/watch?v=Rt-7c9Wgnds&t=73s
grandinj | 2 years ago:
So why not just build a deeper stack of layers, rather than flipping and grinding?
But there's probably a reason.
dale_glass | 2 years ago:
Though I'd be curious to see what the power delivery layers look like after this change.
melling | 2 years ago:
AMD has been doing extremely well while Intel hasn’t been able to regroup.