lwneal | 2 years ago

An H100 uses up to 350 W, while an A16 has a TDP of only 8 W. But the A16 is a much smaller chip (about 108 mm² vs. the H100's 814 mm²), so you can fit more of them on a wafer. Since a wafer is 300 mm in diameter, its area is about 70,685 mm², which would yield 86 H100s or 654 A16s. [1][2]
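A quick Python sketch of that dies-per-wafer arithmetic (it just divides wafer area by die area, ignoring edge loss and scribe lines; the die areas are the figures from [1] and [2]):

    import math

    # Ideal dies per 300 mm wafer: wafer area / die area, no edge or scribe loss
    wafer_area = math.pi * (300 / 2) ** 2   # ~70,686 mm^2

    die_area_mm2 = {"H100": 814, "Apple A16": 108}

    for chip, area in die_area_mm2.items():
        print(chip, int(wafer_area // area), "dies per wafer")
    # H100 86 dies per wafer
    # Apple A16 654 dies per wafer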

However, that ignores the waste at the edges of the circular wafer, as well as the chip yield, both of which will likely be worse for the larger chip [3]. But assuming a generous 70% yield by area [4], one wafer's worth of H100s, all packaged into GPUs and running full blast, will draw maybe 20 kilowatts, while the same wafer of A16s might draw 3.6 kilowatts. In practice, though, the A16s will spend most of their time conserving battery power in your pocket, and even the H100s will spend some of their time idle.
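Continuing the sketch under the same assumptions (a flat 70% yield by area, and every good die drawing its full TDP):

    YIELD = 0.70
    tdp_w = {"H100": 350, "Apple A16": 8}
    dies_per_wafer = {"H100": 86, "Apple A16": 654}

    for chip, tdp in tdp_w.items():
        kw = dies_per_wafer[chip] * YIELD * tdp / 1000
        print(f"{chip}: ~{kw:.1f} kW per wafer")
    # H100: ~21.1 kW per wafer
    # Apple A16: ~3.7 kW per wafer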

TSMC is now producing over 14 million wafers per year [5]. At most 1.2 million of those are on the 3nm node, and not all of that production goes to GPUs. But as an upper bound, if we imagine that all of TSMC's wafers were filled with nothing but H100 chips, and all of those H100s were immediately put to use running AI 24/7, how much additional load would that put on the power grid?

The answer is around 280 gigawatts of continuous draw, or, running 24/7 for a year, about 2,500 terawatt-hours [6]. That's about 10% of current world electricity consumption! So it's not completely implausible to imagine that a huge ramp-up in AI usage might have an effect on the electric grid.
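A minimal version of that upper-bound calculation, assuming ~20 kW of H100s per yielded wafer as above, 14 million wafers per year, and 8,760 hours in a year:

    wafers_per_year = 14e6
    kw_per_wafer = 20

    power_gw = wafers_per_year * kw_per_wafer / 1e6      # kW -> GW
    energy_twh = power_gw * 8760 / 1000                  # GW * h -> TWh
    print(f"~{power_gw:.0f} GW continuous, ~{energy_twh:.0f} TWh per year")
    # ~280 GW continuous, ~2453 TWh per year

Against roughly 25,000 TWh of annual world electricity consumption, that's where the ~10% figure comes from.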

*edit: This assumes we're talking about the Apple A16 (i.e., the difference between phone chips and GPU chips). If we're talking about the Nvidia A16 (i.e., the difference between current GPU chips and the previous node's GPU chips), see pclmulqdq's comment.

[1] https://nanoreview.net/en/soc/apple-a16-bionic

[2] https://www.techpowerup.com/gpu-specs/h100-pcie-80-gb.c3899

[3] https://news.ycombinator.com/item?id=24185108

[4] https://www.extremetech.com/computing/analyst-tsmc-hitting-5...

[5] https://www.tsmc.com/english/dedicatedFoundry/manufacturing/...

[6] https://www.wolframalpha.com/input?i=%2814+million%29+*+%282...*

pclmulqdq | 2 years ago

8 watts for the A16's TDP cannot be correct. Your phone CPU has a higher TDP. I saw 250 W on Nvidia's website as the maximum.

Edit: Oh, you are talking about the Apple A16. Those chips are completely different in function, so sure.

adrian_b | 2 years ago

6 to 8 W is a typical TDP for a mobile phone SoC including the CPU.

A few mobile phone chips had a higher TDP, up to 10 W, but those were notorious for overheating and for low battery life.

blackoil | 2 years ago

> At most 1.2 million of those are on the 3nm node

1.2 million wafers x 30 boards/wafer x $30,000/board ≈ $1 trillion. Time for NVDA calls.