> xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines.
These generators polluted the nearby historically black neighborhoods in Memphis, Tennessee with nitrogen oxides. Residents are afraid to open their windows, with the elderly, children, and those suffering from conditions like COPD particularly affected. Lawsuits alleging environmental racism are pending.
xAI says cleaner generators will be installed, but I think this episode shows that we cannot allow public interests to be compromised by the private sector so easily just because they scream: Jobs! Investment!
https://time.com/7308925/elon-musk-memphis-ai-data-center/
> xAI says cleaner generators will be installed, but I think this episode shows that we cannot allow public interests to be compromised by the private sector so easily just because they scream: Jobs! Investment!
Roughly 80% of people in the US live within 100 miles of their hometown.
It would be wise to see "Jobs! Investment!" as little more than a mafioso-like threat to agrarian, stay-in-one-place, work-to-live types. "Sure is a nice Shire you got there. Better hope it doesn't suffer from a lack of investment in jobs."
Threats of it all imploding are taken seriously by a lot of people.
https://www.mentalfloss.com/culture/generations/millennials-...
So what if it does? That's normal with the passage of time. As long as human biology exists, humans will solve those problems. Beyond that, obligation is just socialized memes, ethno-objects that come and go with the generations.
Everyone alive now who worries about the propagation of our culture sure does not seem concerned that Latin fell out of common use. That they aren't spending their lives keeping old traditions alive should make it obvious old traditions don't mean that much to the living.
Politicians and the rich need us servicing the debt they so graciously took on to invest in jobs, or we would be free to police them.
These "gas turbines" are located next door to the Allen Combined Cycle Plant, a grid-scale natural gas power plant with 1.1 GW of capacity. It's there to power a nearby steel mill. That's the kind of neighborhood xAI has put its cluster in.
I'm incredibly skeptical of any claim that xAI's power use is putting a dent in the local environment, and "environmental racism" just reeks of the usual agenda pushing.
I'm a bit skeptical about this. I know diesel generators produce these kinds of pollutants, but I haven't heard the same about natural gas.
My city has a big NG facility downtown that pipes heated water to a bunch of buildings, and it is surrounded by condos. I've never heard anything about it impacting the air (other than CO2 which is a global and not local issue).
Every building here (except those connected to district heating systems), large and small, has a natural gas boiler or furnace. We also have several NG plants generating electricity within city limits. Again, localized pollution is not what concerns people about these things. Coal plants, on the other hand, tended to be way outside the city when they were still in operation.
TFA said it's all legal and explicit federal policy. You don't have to like it, but some people are going to have to make minor sacrifices if the majority want AI services. Look on the bright side: when these people all have personal robot doctors caring for them well into their 100s, they will be grateful they didn't listen to the NIMBYs.
It’s cute they describe this as a solution to _the_ power problem. It’s a solution to _their_ power problem. We have a grid problem. This massive amount of investment would be an incredible time to do something about it. Instead we’ve got an administration hostile to modern energy solutions and an industry hostile to everyone. Really depressing to see all this money go up in smoke in such a massive, short-sighted rush.
I previously worked directly for some of the power generation manufacturers listed in the article and later on the grid/power transmission side.
My takeaway is that they get it correct enough, but offer no deep insight into the power generation industry.
I was surprised by and learned a few things from the article though. It definitely gives me some ideas about reaching out to old contacts to see if there are any opportunities building models and analytics for the new demand.
Focusing on Bloom is fun because they’re new with startup vibes, but Innio and Cat are really having a resurgence of demand for their generators, and building diesel/natural gas engines is much simpler than gas turbines. I’m sure the heads at GE wish they hadn’t sold that business off now.
On steam/gas turbine blade manufacturing, there most certainly are more big players than four, and many are US-based. You have to remember this is an old industry with existing supply chains and maintenance companies.
As long as demand for new data centers doesn’t lose steam, these onsite options will continue to flourish. Federal grid access builds are currently a 10+ year wait, and they are reworking the system to be “fast”: only 5-6 years for build-outs now. They’re also changing how the bidding process works, which was touched on here. You need skin in the game if you want to be taken seriously now. There are so many requests from companies arbitraging who can give them the best deal/timeline. Now you need to put money up if you even want a call back.
So we had some onsite generation moves from the lower end (residential solar, etc.), and now we have them from the higher end (fossil fuel generation at datacenters). If that creates high-efficiency generators, it may drive "onsite" further into the mid-segment. That may also shift the grid's role, nudging it from hierarchical delivery toward network sharing/rebalancing, and may even lead to separate local grids (like 100+ years ago). It would also give fossil fuels new demand (and create a market for small/compact nuclear). A kind of disintegration wave.
Kinda proving that these are a bad deal for communities - very few jobs and tax revenues, but enjoy the increased asthma and cancer we all get to pay for.
Part of what bothers me about AI energy consumption isn't just how wasteful it might be from an ecological perspective; it's how brutally inefficient it is compared to the biological "state of the art": 2,000 kcal = 8,368 kJ, and 8,368 kJ / 86,400 s = 96.9 W.
So the benchmark is achieving human-like intelligence on a 100W budget. I'd be very curious to see what can be achieved by AI targeting that power budget.
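For what it's worth, the arithmetic in the parent checks out; a minimal sketch:

```python
# Back-of-the-envelope: average power of a human on a 2,000 kcal/day diet
KCAL_TO_KJ = 4.184                           # 1 kcal = 4.184 kJ
daily_kcal = 2000
daily_kj = daily_kcal * KCAL_TO_KJ           # 8368 kJ per day
seconds_per_day = 86_400
watts = daily_kj * 1_000 / seconds_per_day   # kJ -> J, then J/s = W

print(f"{daily_kj:.0f} kJ/day -> {watts:.1f} W")  # ~96.9 W
```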
Is it though? When I ask an LLM research questions, it often answers in 20 seconds what it would take me an entire afternoon to figure out with traditional research.
Similarly, I've had times where it wrote me scientific simulation code that would take me 2 days, in around a minute.
Obviously I'm cherry-picking the best examples, but I would guess that overall, the energy usage my LLM queries have required is vastly less than my own biological energy usage if I did the equivalent work on my own. Plus it's not just the energy to run my body -- it's the energy to house me, heat my home, transport my groceries, and so forth. People have way more energy needs than just the kilocalories that fuel them.
If you're using AI productively, I assume it's already much more energy-efficient than the energy footprint of a human for the same amount of work.
How so? A human needs the entire civilisation to be productive at that level. If you take just the entire US electricity consumption and divide it by the population, you'll get a result that's an order of magnitude higher. And that's just electricity. And that's just domestic consumption, even though Americans consume tons of foreign-made goods.
Ah! And don't get me started on how specific its energy source must be. Pure electricity, no less! Whereas a human brain comes attached to an engine that can power it for days on a mere ham sandwich!
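A rough check of that order-of-magnitude claim, assuming ~4,000 TWh/yr of US electricity consumption and ~335M people (ballpark figures of mine, not from the comment):

```python
# Per-capita US electricity use vs. a ~100 W human metabolic budget
us_twh_per_year = 4000               # ballpark annual US electricity consumption
us_population = 335e6                # ballpark population
hours_per_year = 8766                # 24 * 365.25

avg_watts = us_twh_per_year * 1e12 / hours_per_year / us_population
print(f"~{avg_watts:.0f} W of electricity per person")    # ~1,360 W
print(f"~{avg_watts / 97:.0f}x a ~97 W metabolic budget")  # ~14x
```

So the claim holds: continuous per-capita electricity use alone is more than an order of magnitude above the body's metabolic budget, before counting imports or non-electric energy.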
Beyond the wastefulness, the linked article can't even remotely be taken seriously.
> An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually.
What? I let ChatGPT swag an answer on the revenue forecast and it cited $2-6B rev per GW year.
And then we get this gem...
> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.
So now we're going to be spewing ~486 g CO₂e per kWh using engines that weren't designed to run 24/7/365 to handle these workloads? Datacenters choosing to use these forms of power should have to secure a local vote and be held to annual public measurements of NOx, CO, VOCs, and PM.
This article just showcases all the horrible band-aids being applied to procure energy in any way possible, with little regard for health or environmental impact.
Does anyone know a really good source for basic information estimating what percentage of global carbon emissions come from AI training and AI inference, both 1) now and 2) in the future, if we believe AI companies' capacity projections? I would really like to read a detailed analysis of this that avoids both AI hype and anti-AI hysteria. It's an important question, but it excites strong reactions that tend to cloud the facts.
Yes, all sources are biased, but some are useful. And I know it's hard to get solid data on this from AI companies, but we must have at least a rough estimate?
Please don't tell me to ask ChatGPT about it :)
US grid carbon intensity is 0.384 kgCO2/kWh (source: Our World in Data). US datacenter energy use in 2023: 176 TWh (excluding crypto; source: US Congress). How much of that is AI, I couldn't find.
So that's 67 Mt CO2. I hope I haven't misplaced my decimal point, please double check. That would be about 1.3% of the 5 Gt of CO2 the US emits per year.
For global emissions and future trends, the IEA estimates about 500 TWh/year globally today and 1000 TWh/year in 2030 (base scenario). Assuming these use the current US grid carbon intensity, that would be about 200 Mt CO2 today and 400 Mt in 2030. Global CO2 emissions today are 40 Gt/year, so that would be 0.5% today and 1% in 2030 (if global emissions stay stable).
https://ourworldindata.org/grapher/carbon-intensity-electric...
https://www.congress.gov/crs-product/R48646#_Toc207199546
https://www.iea.org/data-and-statistics/charts/global-data-c...
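As a sanity check of these figures, reading the carbon intensity as 384 gCO2/kWh (i.e. 0.384 kg/kWh):

```python
def dc_emissions_mt(twh_per_year, g_co2_per_kwh=384):
    """Annual CO2 in megatonnes for a given datacenter electricity use."""
    kwh = twh_per_year * 1e9            # 1 TWh = 1e9 kWh
    return kwh * g_co2_per_kwh / 1e12   # grams -> megatonnes

us_2023 = dc_emissions_mt(176)                  # ~67.6 Mt
print(us_2023, 100 * us_2023 / 5000)            # ~1.35% of 5 Gt US emissions

today, in_2030 = dc_emissions_mt(500), dc_emissions_mt(1000)
print(100 * today / 40_000, 100 * in_2030 / 40_000)  # ~0.5% and ~1% of 40 Gt
```

The decimal point was indeed in the right place: 67.6 Mt, or about 1.35% of US emissions.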
What about renewables + battery storage? Does it take much longer to build? I can imagine getting a permit can take quite a long time, but what takes so long to set up solar panels and link them to batteries, without even having to connect them to the grid?
How many batteries is that? If we're talking solar and you have, say, a 300 MW datacenter that needs to operate for 12 hours without sun, you need at least two of the largest battery installs in the world [1] at 1,700 MWh each. That doesn't factor in cloudy days.
[1] https://www.heise.de/en/news/850-MW-World-s-largest-battery-...
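The sizing math above, as a sketch using the same numbers:

```python
# Overnight storage for a hypothetical solar-powered 300 MW datacenter
datacenter_mw = 300
overnight_hours = 12
needed_mwh = datacenter_mw * overnight_hours      # 3,600 MWh of storage

largest_install_mwh = 1700                        # the install cited as [1]
installs_needed = needed_mwh / largest_install_mwh
print(needed_mwh, installs_needed)                # 3600 MWh, ~2.1 installs
```

And that still assumes the panels can both run the load and fully recharge 3,600 MWh during daylight hours.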
Reciprocating natural gas engines can be moved from [concrete] pad to pad and be up and running in under 24 hours. The portable turbines take longer but they’re still fast.
Acquiring enough solar panels and battery storage still takes a very long time by comparison.
The density required for solar is also much lower - the coordination between different land parcels and routing power and getting easements increases the time required vs. on prem gas turbines.
Takes much longer to build, requires a much larger up-front investment, and requires a lot more land.
The footprint needed when trying to generate this much power from solar or wind necessitates large-scale land acquisition plus the transmission infrastructure to get all that power to the actual data center, since you won't usually have enough land directly adjacent to it. That plus all the battery infrastructure makes it a non-starter for projects where short timescales are key.
More like it’s a really long way to say the government has utterly failed at making sure electricity generation and transmission capacity keeps up with demand so datacenters have been forced to get creative with alternative ways to power themselves. These companies absolutely want to use renewable energy from the power grid but the government blew it.
> This is a really long way of saying "We need to burn fossil fuels to make more money."
Like every other industry in the world?
I’m kind of amazed that AI data centers have become the political talking point for topics like water usage and energy use when they’re just doing what every other energy-intensive industry does. The food arriving at your grocery store and the building materials that built your house also came from industries that consume a lot of fossil fuels to make more money.
The problem is that most of the AI labs are popping up in Texas, which has a uniquely isolated electrical grid. Recall how the Texas cold snap a few years ago took down the grid for days. Turns out that if you build a grid based on short-term profit motives, it's not going to be flexible enough to take on new demand.
It's not a technological limitation of the grid. We could have lived in a world with a more connected grid, more nimble utility commissions, and a lot less methane/carbon emissions as a result.
Really cool in depth report, thanks for sharing. It's very interesting to see what these big datacenter deployments are actually doing. Go look at the oil price charts for the last 25 years and you'll see why it makes a ton of sense economically.
I also love how you can see the physical evidence of them pitting jurisdictions against each other from the satellite photos with the data center on one side of a state border and the power generation on the other.
Why is no one talking about the "other grid" capacity here?
Natural gas at this scale cannot be delivered by truck. It's piped in direct from fields, typically.
When do we run out of natural gas "grid" capacity in these locations? I can't imagine we're that overbuilt compared to the electrical grid itself?
The big freeze in Texas is a recent example of the natural gas grid having localized "brownouts" due to a few factors - one of which being the demand of all the natural gas peakers trying to fire at once.
Seems to me like this is the next infrastructure piece to face a supply crunch. There are places (North Dakota) so constrained by capacity to deliver gas to the "grid" that they simply flare it off, because it's cheaper to pay the government to do that than to lay pipe. This implies to me that natural gas is about to become more valuable.
> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.
This seems like a big reach to me. Their largest engine (and it is absolutely massive) "only" produces 80 MW of power. The Brayton cycle is unbeatable if you need to keep scaling power up to ridiculous levels.
I mean, the claim is certainly nonsensical in the sense that this isn't something Wärtsilä just "realized". They have been in the power plant business for decades. In the oldest financials they have online (the annual report for year 2000) their power plant sales are larger than their marine engine sales.
Really makes me wonder about anything else I've read on Semianalysis. Like, it is such an insane thing to claim and so easy to check. And they just wrote it anyway, like some kind of pathological fabulists.
But what's the part that seems like a "big reach"? Are you saying they didn't sign those contracts? That their customers are making a mistake?
I often like SemiAnalysis' work, but there's parts of this article that are shockingly under-researched and completely missing critical parts of the narrative.
> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive.
> Again, clever firms like xAI have found remedies. Elon's AI Lab even pioneered a new site selection process - building at the border of two states to maximize the odds of getting a permit early!
The energy strategy was to completely, and almost certainly illegally, bypass permitting and ignore the Clean Air Act, at a tangible cost to the surrounding community: it measurably increased respiratory irritants like NOx in the air around these neighborhoods. Characterizing this harm as "clever" is wildly irresponsible, and it's wild that the word "illegal" doesn't appear in the article once, while at the same time it handwaves away the fact that permitting for local combustion-based generation (for exactly these reasons!) is one of the main factors pushing out timelines and increasing costs.
Here's my guess: there are lots of datacenters being built in Virginia, Pennsylvania, Indiana, Ohio, Illinois [1]. Also in Texas, Georgia, Arizona, Nevada and Utah.
I think the first five states have this in common: there are lots of coal-burning power plants that were shut down but can be restarted and hooked to the grid on relatively short notice. The grid is also quite good in this region.
In Texas, it is likely that new power can be generated with a combination of solar, wind, gas, and fast permitting.
I don't have an explanation for Georgia.
For Arizona, and perhaps Nevada and Utah too, I think it is likely to be solar.
Don't know about the others, but Illinois permanently shut down (and demolished or repurposed the land) the majority of its coal power plants over the past couple decades.
Illinois gets about half its power from nuclear (we have 6 plants and 11 reactors), followed by natural gas at around 20%, and then about equal amounts of coal and wind, at around 10-15%.
So Illinois is actually a pretty decent place to build datacenters, from a clean power generation perspective.
> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive. xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines.
Wow, "truck-mounted gas turbines"? Who else could have mastered such a futuristic tech in so short a time? Seriously, who wrote this? Grok? And let's ignore that this needless burning of fossil fuel is making life on Earth harder for everyone and everything else.
I'm no fan of Musk, but you've got to admit it was a clever way to achieve the goal. SemiAnalysis don't do fanboy articles - their research is pretty in-depth. So they are stating it as they see it.
The problem ordinary people all over the world have is that governments are allowing this to happen. Maybe stricter regulation would prevent players such as Musk from coming up with such "innovations".
And all without the proper permits! Using 35 generators when they were only allowed 15! Yay! So glad we're allowing AI companies to break law after law after law for models that still can't logically reason through the basic Towers of Hanoi.
Boom’s pivot to trying to build turbines for data centers wasn’t surprising when data center deployments started using turbines. Either their CEO saw one of the headlines or their investors forwarded it over and it became their new talking point.
What is interesting is how many people saw the Boom announcement and came to believe that Boom was a pioneer of this idea. They’re actually a me-too that won’t have anything ready for a long time, if they can even pull it off at all.
There's not a single mention of pollution or clean energy or the environment in the entire article. Presumably the regulatory requirements for these generators are less stringent than for proper power plants, so the costs are pushed onto the rest of society (having to deal with the environmental impact) while Microsoft et al. keep the profits?
I think it's funny that at no point in the article do they mention the idea of simply making LLMs more efficient. I guess that's not important when all you care about is winning the AI "race" rather than selling a long-term sustainable product.
What makes you think that the entire process isn't being made more efficient? There are entire papers dedicated to pulling out more FLOPs from GPUs so that less energy is being wasted on simply moving memory around. Of course, there's also inference side optimizations like speculative decoding and MoE. Some of these make the training process more expensive.
The other big problem is that you can always increase scale to soak up any energy-efficiency gains. I do wonder if this will eventually level off, though. If performance somehow plateaus, then presumably the efficiency gains will catch up. That said, that doesn't seem likely in the near future.
Cool article. It's pretty weird that the writer seems to attribute all decisions by xAI to Mr. Musk personally. I doubt he is closely involved in such technical projects.
> However, AI infrastructure cannot wait for the grid’s multiyear transmission upgrades. An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually. Getting a 400 MW datacenter online even six months earlier is worth billions. Economic need dwarfs problems like an overloaded electric grid. The industry is already searching for new solutions.
wow, that's some logic. Environmentally unsound means of extracting energy directly damage the ecosystem in which humans need to live. The need for a functioning ecosystem "dwarfs" "problems" like billionaires not making enough billions. Fixing a ruined ecosystem would cost many more billions than whatever economic revenue the AI generated while ruining it. So if you're not harnessing the sun or wind (forget about the latter in the US right now, btw), you're burning things, and you can get lost with that.
This kind of short sighted thinking is because when folks like this talk about generating billions of dollars of worth, their cerebellums are firing up as they think of themselves personally as billionaires, corrupting their overall thought processes. We really need to tax billionaires out of existence.
This is coming from a group that does analysis on the semiconductor and cloud industries and provides very expensive access to its models and info. They are the citation.
Even if that’s true, that seems like a putrid number, no?
Assuming a single 1 GW data center runs 24/7, 365 days a year, it's consuming 8.76 TWh per year. Only being able to generate $10-12B in revenue (not profit) per year while consuming as much electricity as the entire state of Hawaii (1.5M people) seems awful.
If you do the math, that's $10-$12 per watt year. There's approx 24×365.25=8766 hours in a year, so assuming that the datacenters would be running 24×7, that boils down to $1.14 to $1.37 in revenue per kWh. That's not a bad deal if power really is a major part of the expense.
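A quick script confirming the conversion from "$10-12B per GW-year" to revenue per kWh:

```python
# Implied revenue per kWh at 24/7 utilization of 1 GW
hours_per_year = 24 * 365.25                  # 8766
kwh_per_gw_year = 1e6 * hours_per_year        # 1 GW = 1e6 kW -> kWh per year

low = 10e9 / kwh_per_gw_year
high = 12e9 / kwh_per_gw_year
print(f"${low:.2f} to ${high:.2f} per kWh")   # $1.14 to $1.37 per kWh
```

For comparison, industrial electricity in the US typically costs well under $0.10/kWh, so power would be a small fraction of that revenue even at those consumption levels.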
Yeah I guess I'm not the target audience for this because I assumed that "the power problem" was "massive increase in electricity costs for people despite virtually unchanged usage on their part", not "AI companies have to wait too long to be able to start using even more power than they already are":
> Nicole Pastore, who has lived in her large stone home near Baltimore’s Johns Hopkins University campus for 18 years, said her utility bills over the past year jumped by 50%. “You look at that and think, ‘Oh my god,’” she said. She has now become the kind of mom who walks around her home turning off lights and unplugging her daughter’s cellphone chargers.
> And because Pastore is a judge who rules on rental disputes in Baltimore City District Court, she regularly sees poor people struggling with their own power bills. “It’s utilities versus rent,” she said. “They want to stay in their home, but they also want to keep their lights on.”
And the air quality around these plants is poor, leading to health problems for the neighbors.
This short-term, destructive thinking should be criminalized.
I think it's time to discuss changing the incentives around AI deployment, specifically paying into a UBI fund whenever human jobs are replaced by AI. Musk himself has raised the idea.
The natural gas turbines used are relatively efficient as far as engines go. Having them on-site makes transmission losses basically negligible.
Nothing short of full solar connected to batteries, produced without any difficult-to-mine elements, will make some people happy. But as far as pollution and fuel consumption go, data centers aren't really a global concern on the same level as things like transportation.
And imagine all this poorly located, overpriced, haphazardly thrown together and polluting infrastructure will basically get flushed down the toilet once either the AI bubble pops, or they figure out a new way of doing AI that doesn't require terawatts of power.
The dialog around AI resource use is frustratingly inane, because the benefits are never discussed in the same context.
LLMs/diffusers are inefficient from a traditional computing perspective, but they are also the most efficient technology humanity has created:
> AI systems (ChatGPT, BLOOM, DALL-E2, Midjourney) and human individuals performing equivalent writing and illustrating tasks. Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts.
mucle6|2 months ago
It implies that if this were happening near a non-black neighborhood, it wouldn't be as egregious, which is a strange moral stance.
Also 'historically' is irrelevant. Pollution hurts the people living there now.
roflmaostc|2 months ago
A computer uses orders of magnitude less energy than a human.
It's all about the task, humans are specialized too.
EDIT: maybe add a logarithm or other non-linear functions to make the gap even bigger.
redox99|2 months ago
I agree human brains are crazy efficient though.
saagarjha|2 months ago
FergusArgyll|2 months ago
windexh8er|2 months ago
> An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually.
What? I let ChatGPT swag an answer on the revenue forecast and it cited $2-6B rev per GW year.
And then we get this gem...
> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.
So now we're going to be spewing ~486 g CO₂e per kWh using something that wasn't designed to run 24/7/365 to handle these workloads? These datacenters choosing to use these forms of power should have to secure a local vote showcasing, and being held to, annual measurements of NOx, CO, VOC and PM.
This article just showcases all the horrible bandaids being applied to procure energy in any way possible with little regard to health or environmental impact.
libraryofbabel|2 months ago
Yes, all sources are biased, but some are useful. And I know that it's hard to get solid data on this from AI companies, but we must have at least a rough estimate?
Please don't tell me to ask ChatGPT about it :)
lefra|2 months ago
So that's 67Mt CO2, I hope I haven't misplaced my decimal point, please double check. That would be 1.3% of the 5Gt of CO2 the US emits per year.
https://ourworldindata.org/grapher/carbon-intensity-electric...
https://www.congress.gov/crs-product/R48646#_Toc207199546
For global emission and future trends the IEA estimates about 500TWh/year globally today, and 1000TWh/year in 2030 (base scenario). Assuming these use the current US grid carbon intensity, that would be about 200MtCO2 today, 400 in 2030. Global CO2 emissions today are 40Gt/year, so that would be 0.5% today, and 1% in 2030 (if global emissions stay stable).
https://www.iea.org/data-and-statistics/charts/global-data-c...
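The percentages above check out with a quick sketch; all inputs are the comment's own figures (67 Mt US datacenter CO₂, 5 Gt US total, 500 TWh/year global, and a rough ~0.39 kg CO₂/kWh grid intensity, which is an assumption):

```python
# Sanity-check the shares claimed above.
us_dc_mt = 67        # claimed US datacenter emissions, Mt CO2
us_total_gt = 5      # claimed US total emissions, Gt CO2
us_share = us_dc_mt / (us_total_gt * 1000)
print(f"US share: {us_share:.1%}")  # -> US share: 1.3%

global_twh = 500                # IEA estimate for today, TWh/year
grid_kg_per_kwh = 0.39          # assumed grid carbon intensity
global_mt = global_twh * grid_kg_per_kwh  # 1 TWh at 1 kg/kWh = 1 Mt
print(f"Global: ~{global_mt:.0f} Mt, {global_mt / 40_000:.1%} of 40 Gt")
# -> Global: ~195 Mt, 0.5% of 40 Gt
```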
unknown|2 months ago
[deleted]
thatfrenchguy|2 months ago
memoriuaysj|2 months ago
pingou|2 months ago
O5vYtytb|2 months ago
[1] https://www.heise.de/en/news/850-MW-World-s-largest-battery-...
Aurornis|2 months ago
Acquiring enough solar panels and battery storage still takes a very long time by comparison.
condensedcrab|2 months ago
thunderbird120|2 months ago
The footprint needed when trying to generate this much power from solar or wind necessitates large-scale land acquisition plus the transmission infrastructure to get all that power to the actual data center, since you won't usually have enough land directly adjacent to it. That plus all the battery infrastructure makes it a non-starter for projects where short timescales are key.
memoriuaysj|2 months ago
codingdave|2 months ago
It didn't make long-term sense for our world before AI. It makes no more sense with AI.
HDThoreaun|2 months ago
Aurornis|2 months ago
Like every other industry in the world?
I’m kind of amazed that AI data centers have become the political talking point for topics like water usage and energy use when they’re just doing what every other energy-intensive industry does. The food arriving at your grocery store and the building materials that built your house also came from industries that consume a lot of fossil fuels to make more money.
a1371|2 months ago
It's not the grid's technological limitation. We could have lived in a world with a more connected grid, more nimble utility commissions, and a lot less methane/carbon emissions as a result of it
twerka-stonk|2 months ago
However, it is worth saying that xAI’s “solution” was illegal, unhealthy for the local constituents, and stinks of corruption, https://insideclimatenews.org/news/17072025/elon-musk-xai-da....
symbogra|2 months ago
I also love how you can see the physical evidence of them pitting jurisdictions against each other from the satellite photos with the data center on one side of a state border and the power generation on the other.
phil21|2 months ago
Why is no one talking about the "other grid" capacity here?
Natural gas at this scale cannot be delivered by truck. It's piped in direct from fields, typically.
When do we run out of natural gas "grid" capacity in these locations? I can't imagine we're that overbuilt compared to the electrical grid itself?
The big freeze in Texas is a recent example of the natural gas grid having localized "brownouts" due to a few factors - one of which being the demand of all the natural gas peakers trying to fire at once.
Seems like this is the next infrastructure piece to have a supply crunch to me? There are places (North Dakota) so constrained in their capacity to deliver gas to the "grid" that they simply flare it off because it's cheaper to pay the government to do that vs. lay pipe. This implies to me that natural gas is about to become more valuable.
teknopaul|2 months ago
*greed.
We are well past the point that any economic growth at all is anything but a distribution of income problem.
bob1029|2 months ago
This seems like a big reach to me. Their largest engine (and it is absolutely massive) "only" produces 80MW of power. The Brayton cycle is unbeatable if you need to keep scaling power up to ridiculous levels.
jsnell|2 months ago
Really makes me wonder about anything else I've read on Semianalysis. Like, it is such an insane thing to claim and so easy to check. And they just wrote it anyway, like some kind of pathological fabulists.
But what's the part that seems like a "big reach"? Are you saying they didn't sign those contracts? That their customers are making a mistake?
g8oz|2 months ago
qchris|2 months ago
> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive.
> Again, clever firms like xAI have found remedies. Elon's AI Lab even pioneered a new site selection process - building at the border of two states to maximize the odds of getting a permit early!
The energy strategy was to completely and almost certainly illegally bypass permitting and ignore the Clean Air Act, at a tangible cost to the surrounding community by measurably increasing respiratory irritants like NOx in the air around these communities. Characterizing this harm as "clever" is wildly irresponsible, and it's telling that the word "illegal" doesn't appear in the article once, while at the same time handwaving the fact that permitting for local combustion-based generation (for these reasons!) is one of the main factors pushing out timelines and increasing cost.
[1] https://time.com/7308925/elon-musk-memphis-ai-data-center/
[2] https://www.selc.org/news/resistance-against-elon-musks-xai-...
[3] https://naacp.org/articles/elon-musks-xai-threatened-lawsuit...
bugglebeetle|2 months ago
mikelitoris|2 months ago
credit_guy|2 months ago
I think the first 5 states have this in common: there are lots of coal burning power plants that were shut down, but can be restarted and hooked to the grid on a relatively short notice. The grid is also quite good in this region.
In Texas, it is likely that new power can be generated with a combination of solar, wind, gas, and fast permitting.
I don't have an explanation for Georgia.
For Arizona, and perhaps Nevada and Utah too, I think it is likely to be solar.
[1] https://www.axios.com/2025/12/18/data-center-growth-map-stat...
aschla|2 months ago
Illinois gets about half its power from nuclear (we have 6 plants and 11 reactors), followed by natural gas at around 20%, and then about equal amounts of coal and wind, at around 10-15%.
So Illinois is actually a pretty decent place to build datacenters, from a clean power generation perspective.
https://www.eia.gov/state/analysis.php?sid=IL
geetee|2 months ago
thrance|2 months ago
Wow, "truck-mounted gas turbines"? Who else could have mastered such a futuristic tech in so short a time? Seriously, who wrote this? Grok? And let's ignore that this needless burning of fossil fuel is making life on Earth harder for everyone and everything else.
sam-cop-vimes|2 months ago
The problem ordinary people all over the world have is that governments are allowing this to happen. Maybe stricter regulation would prevent players such as Musk from coming up with such "innovations".
sameesh|2 months ago
So they solved the power problem by consuming more fossil fuel. Got it.
miltonlost|2 months ago
https://techcrunch.com/2025/07/03/xai-gets-permits-for-15-na...
Symmetry|2 months ago
leetrout|2 months ago
https://qz.com/boom-supersonic-jet-startup-ai-data-center-po...
Aurornis|2 months ago
What is interesting is how many people saw the Boom announcement and came to believe that Boom was a pioneer of this idea. They’re actually a me-too that won’t have anything ready for a long time, if they can even pull it off at all.
torginus|2 months ago
unknown|2 months ago
[deleted]
ptx|1 month ago
tehjoker|2 months ago
AkelaA|2 months ago
redox99|2 months ago
And yes of course it's a race, everything being equal nobody's going to use your model if someone else has a better model.
cl0ckt0wer|2 months ago
inkysigma|2 months ago
The other big problem is that you can always increase scale to offset any energy-efficiency gains. I do wonder if they'll eventually level this off though. If performance somehow plateaus then presumably the efficiency gains will catch up. That being said, that doesn't seem to be a thing in the near future.
deflator|1 month ago
zzzeek|2 months ago
wow, that's some logic. Environmentally unsound means of extracting energy directly damage the ecosystem in which humans need to live. The need for a functioning ecosystem "dwarfs" "problems" like billionaires not making enough billions. Fixing a ruined ecosystem would cost many more billions than whatever economic revenue the AI generated while ruining it. So if you're not harnessing the sun or wind (forget about the latter in the US right now, btw), you're burning things, and you can get lost with that.
This kind of short sighted thinking is because when folks like this talk about generating billions of dollars of worth, their cerebellums are firing up as they think of themselves personally as billionaires, corrupting their overall thought processes. We really need to tax billionaires out of existence.
dzonga|2 months ago
I was wondering: why not solar? Yeah, hydrocarbons have no competition if you have to deploy power quickly.
A 1.2GW turbine has a small footprint compared to the land & batteries needed for solar.
And how about gas, if you're building in the middle of nowhere and there are no gas lines?
cl0ckt0wer|2 months ago
Apreche|2 months ago
Citation needed.
Aurornis|2 months ago
strange_quark|2 months ago
Assuming a single 1GW data center runs 24/7, 365 days a year, it's consuming 8.76 TWh per year. Only being able to generate $10-$12B in revenue (not profit) per year while consuming as much electricity as the entire state of Hawaii (1.5M people) seems awful.
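The arithmetic above can be checked quickly; the revenue range is the article's claim, the rest follows from it:

```python
# A 1 GW data center running 24/7/365, and what the claimed
# $10-12B/yr revenue works out to per kWh consumed.
gw = 1
twh = gw * 8760 / 1000          # hours/year, GW -> TWh
kwh = twh * 1e9
revenue_low, revenue_high = 10e9, 12e9

print(f"{twh:.2f} TWh/yr; "
      f"${revenue_low / kwh:.2f}-${revenue_high / kwh:.2f} revenue per kWh")
# -> 8.76 TWh/yr; $1.14-$1.37 revenue per kWh
```

At wholesale power prices of a few cents per kWh, roughly a dollar of revenue per kWh consumed is actually a large multiple over the energy input cost, which cuts the other way from the comment's framing.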
zozbot234|2 months ago
mikelitoris|2 months ago
dkobia|2 months ago
That said, it is all pretty impressive.
goda90|2 months ago
Natural Gas supply problem: worsened
Carbon in the atmosphere problem: worsened
saghm|2 months ago
> Nicole Pastore, who has lived in her large stone home near Baltimore’s Johns Hopkins University campus for 18 years, said her utility bills over the past year jumped by 50%. “You look at that and think, ‘Oh my god,’” she said. She has now become the kind of mom who walks around her home turning off lights and unplugging her daughter’s cellphone chargers.
> And because Pastore is a judge who rules on rental disputes in Baltimore City District Court, she regularly sees poor people struggling with their own power bills. “It’s utilities versus rent,” she said. “They want to stay in their home, but they also want to keep their lights on.”
https://www.bloomberg.com/graphics/2025-ai-data-centers-elec...
imglorp|2 months ago
This short term, destructive, thinking should be criminalized.
I think it's time to discuss changing the incentives around AI deployment, specifically paying into a UBI fund whenever human jobs are replaced by AI. Musk himself raised the idea.
https://www.indexbox.io/blog/tech-leaders-push-for-universal...
einrealist|2 months ago
Aurornis|2 months ago
Nothing short of full solar connected to batteries produced without any difficult-to-mine elements will make some people happy, but as far as pollution and fuel consumption go, data centers aren't really a global concern at the same level as things like transportation.
protimewaster|2 months ago
Not so.
corimaith|2 months ago
torginus|2 months ago
seydor|2 months ago
PunchyHamster|2 months ago
josefritzishere|2 months ago
biddit|2 months ago
LLMs/diffusers are inefficient from a traditional computing perspective, but they are also the most efficient technology humanity has created:
> AI systems (ChatGPT, BLOOM, DALL-E2, Midjourney) and human individuals performing equivalent writing and illustrating tasks. Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts.
Source: https://www.nature.com/articles/s41598-024-54271-x