top | item 44497332

shawndrost | 7 months ago

The below DOE link substantiates my quotes about US data centers and power usage, which I'll reproduce here. "[D]ata centers consumed about 4.4% of total U.S. electricity in 2023 [and ~1.5% in 2014]." "[T]otal [US] data center electricity usage climbed from 58 TWh in 2014 to 176 TWh in 2023 and estimates an increase between 325 to 580 TWh by 2028 [which would be circa 20% YOY growth]."
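The "circa 20% YOY" bracketed gloss can be sanity-checked with a quick compound-annual-growth-rate calculation on the DOE figures quoted above (a sketch; the year spans are my reading of the 2014–2023 interval and the 2023–2028 projection window):

```python
def cagr(start_twh: float, end_twh: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end_twh / start_twh) ** (1 / years) - 1

# 2014 -> 2023: 58 TWh -> 176 TWh over 9 years
print(f"2014-2023: {cagr(58, 176, 9):.1%}")   # 13.1% YOY

# 2023 -> 2028 projection: 176 TWh -> 325..580 TWh over 5 years
print(f"2028 low:  {cagr(176, 325, 5):.1%}")  # 13.1% YOY
print(f"2028 high: {cagr(176, 580, 5):.1%}")  # 26.9% YOY
```

So "circa 20% YOY" sits roughly at the midpoint of the 2028 projection range, while the historical 2014–2023 growth comes out closer to 13% per year.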

These quotes are ~compatible with your 4.4 TWh number. If you still think the below DOE link is wrong and "...exceeds 2023 estimates for all data center usage globally," could you share why you believe that?

(Note that my "extremely wrong" is not directed at the literal text "LLM inference has a carbon impact of, like, a couple Google searches" but at the implication that LLMs have negligible carbon impact. If you think DCs were using 4.4% of US power in 2023 and growing at 20% YOY, and are a sizable and fast-growing carbon impact -- but that one LLM call is a small carbon impact -- I'll concede the latter and soften "extremely wrong" to "your original comment carried implications you didn't want".)

https://www.energy.gov/articles/doe-releases-new-report-eval...

tptacek | 7 months ago

The 58->176 TWh interval from 2014 to 2023 clearly wasn't driven by LLMs; ChatGPT wasn't released until 2022. There were of course AI/ML models that preceded it, but nothing used at the scale LLMs are now. If your whole case is that technology writ large is driving data center expansion, that's fine; my argument is simply that it doesn't make sense to single out LLMs.

I think at this point though we understand the contours of our respective arguments! We don't have to keep litigating. Thanks for this!

shawndrost | 7 months ago

My argument is really not about the 58->176 transition (which was slower than 20% YOY) but about the rapid datacenter deployment that started around 2022. That buildout is basically all LLMs (McKinsey says ~75% IIRC).

Anyway, yeah, thanks for the exchange!