top | item 44047442

newtonsmethod | 9 months ago

The energy used by AI is probably just as carbon-intensive, if not more so, but the article never says that. It talks about the energy use of data centers in general.

> The carbon intensity of electricity used by data centers was 48% higher than the US average.

AnthonyMouse | 9 months ago

In case anyone is wondering why that is, it's because they put data centers in the places with the cheapest electricity, which, in the US, means places like Virginia and Ohio, where the grid is heavy on fossil fuels.

If the people always talking about how cheap solar is want to fix this, they should find a way for that cheapness to actually show up in the customer's electric bill.

ipdashc | 9 months ago

I've always wondered why data centers aren't taking off more in places like Iceland (cheap geothermal) or Quebec (cheap hydro). Both of these places are also pretty cold, which one would think benefits cooling.

There are periodically news articles and such about data centers in Iceland, of course, but I get the impression it's mostly a fad, and the real build-outs are still in Northern Virginia as they've always been.

The typical answer I've seen is that Internet access and low latency matter more than cooling and power, but LLMs seem like they wouldn't care about that. I mean, you're literally interacting with them over text, and there's already plenty of latency - a few extra ms shouldn't matter?
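To put rough numbers on the "few extra ms" intuition, here's a back-of-envelope propagation-delay sketch. The distances and the fiber speed (light travels at roughly 2/3 of c in optical fiber) are illustrative assumptions, and real routes add routing and queuing overhead on top:

```python
# Back-of-envelope fiber latency estimate.
# Assumption: signal speed in fiber ~ 200,000 km/s (about 2/3 of c).
FIBER_SPEED_KM_PER_S = 200_000

def one_way_latency_ms(distance_km: float) -> float:
    """Minimum one-way propagation delay over fiber, ignoring routing/queuing."""
    return distance_km / FIBER_SPEED_KM_PER_S * 1000

# Rough great-circle distances (assumed for illustration):
routes = {
    "Reykjavik -> N. Virginia (~4,500 km)": 4_500,
    "Montreal  -> N. Virginia (~  900 km)": 900,
}
for name, km in routes.items():
    rtt = 2 * one_way_latency_ms(km)
    print(f"{name}: ~{rtt:.0f} ms round trip, propagation only")
```

Under these assumptions the Iceland round trip adds on the order of tens of milliseconds, which is negligible next to the seconds an LLM spends generating a response, though it would matter for latency-sensitive workloads like gaming or trading.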

I'd assume construction costs and costs of shipping in equipment also play a role, but Iceland and Canada aren't that far away.