item 45127043

MrFoof | 5 months ago

.


rcxdude|5 months ago

>5GW consumption I assume is per day?

GW is already a measure of rate of energy use. You can talk about GWh per day (which is really just another way of saying 0.041 GW), but GW per day is only sensible in the context of a ramp in power consumption.

Assuming the original source of that number didn't incorrectly conflate GWh into GW, that puts your comparison as underestimating the energy usage 24-fold.
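The conversion rcxdude is pointing at can be sanity-checked in a couple of lines (a minimal sketch; the 120 GWh/day figure is taken from the 5 GW capacity mentioned further down the thread):

```python
# Average power (GW) implied by a daily energy total (GWh/day).
HOURS_PER_DAY = 24

def gwh_per_day_to_gw(gwh_per_day: float) -> float:
    """1 GWh consumed per day is only ~0.041 GW of continuous draw."""
    return gwh_per_day / HOURS_PER_DAY

print(gwh_per_day_to_gw(1))    # ~0.0417 GW, the "0.041 GW" above
print(gwh_per_day_to_gw(120))  # 5.0 GW
```

So reading "5 GW" as if it were "5 GWh per day" understates continuous consumption by exactly that factor of 24.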

matt-p|5 months ago

I've heard 5 GW touted as the capacity (so about 120 GWh per day if flat-loaded at 100%). However, that is clearly B.S., as I've also heard 2 GW and other, much more sensible numbers.

Just to give perspective, 5 GW would power about 10 million H100s or about 5 million G200s. Completely farcical; won't happen.
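The GPU counts above can be back-derived from an assumed per-accelerator power budget (the 500 W figure is an assumption for illustration; real wall-to-chip draw varies with cooling and other overhead):

```python
# Rough check of the "10 million H100s" figure at an assumed
# ~500 W average facility draw per GPU (assumed, not measured).
SITE_POWER_W = 5e9       # 5 GW
GPU_AVG_W = 500          # assumed average per-H100 draw incl. overhead

n_gpus = SITE_POWER_W / GPU_AVG_W
print(f"{n_gpus:,.0f} GPUs")  # 10,000,000
```

Halving the GPU count at double the per-unit draw gives the ~5 million figure for the bigger parts.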

jeffbee|5 months ago

Total rewrite of my previous reply:

There's a 7-million-sqft data center near Reno that claims 650MW capacity. Zuck says his will be 4 million sqft. So that doesn't really get us to 5GW unless we're going vertical.
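Scaling the Reno facility's power density to the quoted 4-million-sqft footprint makes the gap concrete (a sketch using only the figures quoted in this comment; density is a crude average over the whole site):

```python
# Power density of the Reno site, applied to a 4M sqft footprint.
reno_mw, reno_sqft = 650, 7e6
meta_sqft = 4e6

density_w_per_sqft = reno_mw * 1e6 / reno_sqft   # ~93 W/sqft
implied_mw = density_w_per_sqft * meta_sqft / 1e6
print(f"~{density_w_per_sqft:.0f} W/sqft -> ~{implied_mw:.0f} MW")
```

At that density, 4 million sqft lands near 370 MW, an order of magnitude short of 5 GW.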

dist-epoch|5 months ago

> Yet increasing the power consumption of an area by 25X over a few years requires a significant amount of transmission and distribution infrastructure upgrades to make that happen

Not if you put your data-center right next to the power station.

ben_w|5 months ago

> The 5GW consumption I assume is per day? If so, that’s on par with 170,000 average homes… being put in a parish with around 7250 households.

Watts are Joules per second, you'd only measure "watts per day" as a rate of change of power supply, not absolute power supply.

Also, 5GW / 170000 homes ≈ 29kW/home, even the USA isn't that heavily supplied with electricity.

(My home in Germany is very efficient: 0.5 kW average, including heating and cooling as well as all devices.)
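The per-home figure in that comment is straightforward division (numbers as quoted above):

```python
# 5 GW spread evenly across 170,000 average homes.
site_w = 5e9
homes = 170_000

kw_per_home = site_w / homes / 1000
print(f"~{kw_per_home:.1f} kW per home")  # ~29.4 kW
```

For comparison, typical average US household draw is closer to 1 to 1.5 kW, which is why the 170,000-homes framing undersells the load.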