How AI labs are solving the power problem

167 points| Symmetry | 2 months ago |newsletter.semianalysis.com

258 comments

g8oz|2 months ago

>>xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines.

These generators polluted the nearby historically black neighborhoods in Memphis Tennessee with nitrogen oxides. Residents are afraid to open their windows, with the elderly, children and those suffering from conditions like COPD particularly affected. Lawsuits alleging environmental racism are pending.

xAI says cleaner generators will be installed, but I think this episode shows that we cannot allow public interests to be compromised by the private sector so easily just because it screams: Jobs! Investment!

https://time.com/7308925/elon-musk-memphis-ai-data-center/

thisgetsit|2 months ago

> xAI says cleaner generators will be installed but I think this episode shows that we cannot allow public interests to be compromised by private sector so easily just because they scream: Jobs! Investment!

Roughly 80% of people in the US live within 100 miles of their hometown.

It would be wise to see "Jobs! Investment!" as little more than a mafioso-like threat to agrarian, stay-in-one-place, work-to-live types. "Sure is a nice Shire you got there. Better hope it doesn't suffer from a lack of investment in jobs."

Threats of it all imploding are taken seriously by a lot of people.

https://www.mentalfloss.com/culture/generations/millennials-...

So what if it does? That's normal with the passage of time. As long as human biology exists, humans will solve those problems. Beyond that, obligation is just socialized memes, ethnic artifacts that come and go with the generations.

Everyone alive now worried about the propagation of our culture sure doesn't seem concerned that Latin fell out of common use. That they aren't spending their lives keeping old traditions alive should make it obvious those traditions don't mean that much to the living.

Politicians and the rich need us servicing the debt they so graciously took on to invest in jobs, or we would be free to police them.

mucle6|2 months ago

The phrasing 'historically black neighborhoods' feels like it pushes a specific agenda rather than just addressing the pollution.

It implies that if this were happening near a non-black neighborhood, it wouldn't be as egregious, which is a strange moral stance.

Also 'historically' is irrelevant. Pollution hurts the people living there now.

ACCount37|2 months ago

These "gas turbines" are located next door to the Allen Combined Cycle Plant, a grid scale natural gas power plant with 1.1GW capacity. It's there to power a nearby steel mill. That's the kind of neighborhood xAI has put its cluster in.

I'm incredibly skeptical of any claim that xAI's power use is putting a dent in the local environment, and "environmental racism" just reeks of the usual agenda pushing.

cl0ckt0wer|2 months ago

We can sue to shut down pollution generators? Finally, I can get rid of that annoying airport...

badc0ffee|2 months ago

I'm a bit skeptical about this. I know diesel generators produce these kinds of pollutants, but I haven't heard the same about natural gas.

My city has a big NG facility downtown that pipes heated water to a bunch of buildings, and it is surrounded by condos. I've never heard anything about it impacting the air (other than CO2 which is a global and not local issue).

Every building here (except those connected to district heating systems), large and small, has a natural gas boiler or furnace. We also have several NG plants generating electricity within city limits. Again, localized pollution is not what concerns people about these things. Coal plants, on the other hand, tended to be located well outside the city when they were still in operation.

mhb|2 months ago

Why is the skin tone of the residents of the affected community relevant?

casey2|2 months ago

TFA said it's all legal and explicit federal policy. You don't have to like it, but some people are going to have to make minor sacrifices if the majority want AI services. Look on the bright side: when these people all have personal robot doctors caring for them well into their 100s, they will be grateful they didn't listen to the NIMBYs.

roxolotl|2 months ago

It’s cute they describe this as a solution to _the_ power problem. It’s a solution to _their_ power problem. We have a grid problem. This massive amount of investment would be an incredible time to do something about it. Instead we’ve got an administration hostile to modern energy solutions and an industry hostile to everyone. Really depressing to see all this money go up in smoke in such a massive short sighted rush.

bespokedevelopr|2 months ago

I previously worked directly for some of the power generation manufacturers listed in the article and later on the grid/power transmission side.

My takeaway is that they get it correct enough, but offer no deep insight into the power generation industry.

I was surprised by and learned a few things from the article, though. It definitely gives me some ideas about reaching out to old contacts to see if there are any opportunities building models and analytics for the new demands.

Focusing on Bloom is fun because they're new, with startup vibes, but Innio and Cat are really having a resurgence of demand for their generators, and building diesel/natural-gas engines is much simpler than building gas turbines. I'm sure the heads at GE wish they hadn't sold that business off now.

On steam/gas turbine blade manufacturing, there are most certainly more big players than four, many of them US-based. You have to remember this is an old industry with existing supply chains and maintenance companies.

As long as demand for new data centers doesn't lose steam, these onsite options will continue to flourish. Federal grid-access builds currently carry a 10+ year wait, and the system is being reworked to be "fast", meaning only 5-6 years for build-outs. They're also changing how the bidding process works, which was touched on here: you need skin in the game if you want to be taken seriously now. There are so many requests from companies arbing who can give them the best deal/timeline that you now need to put money up if you even want a call back.

trhway|2 months ago

So we had some onsite generation moves from the lower end (residential solar, etc.), and now we have them from the higher end (fossil fuel generation at datacenters). If that creates high-efficiency generators, it may drive onsite generation further into the mid-segment. It may also change the grid's role, nudging it from hierarchical delivery toward network sharing/rebalancing, and may even lead to separate local grids (like 100+ years ago). It would also give fossil fuels new demand (and would be a market for small/compact nuclear). A kind of disintegration wave.

siliconc0w|2 months ago

Kinda proving that these are a bad deal for communities: very few jobs and little tax revenue, but enjoy the increased asthma and cancer we all get to pay for.

pdpi|2 months ago

Part of what bothers me with AI energy consumption isn't just how wasteful it might be from an ecological perspective, it's how brutally inefficient it is compared to the biological "state of the art" — 2000kcal = 8,368 kJ. 8,368 kJ / 86,400 s = 96.9 W.

So the benchmark is achieving human-like intelligence on a 100W budget. I'd be very curious to see what can be achieved by AI targeting that power budget.
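The arithmetic above checks out; as a quick Python sketch of the conversion:

```python
# Convert a 2000 kcal/day metabolic budget into average watts,
# matching the figures in the comment above.
KCAL_TO_KJ = 4.184

daily_kj = 2000 * KCAL_TO_KJ               # 8368 kJ per day
seconds_per_day = 24 * 60 * 60             # 86400 s
watts = daily_kj * 1000 / seconds_per_day  # kJ -> J, then J/s = W
print(f"{watts:.1f} W")                    # 96.9 W
```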

crazygringo|2 months ago

Is it though? When I ask an LLM research questions, it often answers in 20 seconds what it would take me an entire afternoon to figure out with traditional research.

Similarly, I've had times where it wrote me scientific simulation code that would take me 2 days, in around a minute.

Obviously I'm cherry-picking the best examples, but I would guess that overall, the energy usage my LLM queries have required is vastly less than my own biological energy usage if I did the equivalent work on my own. Plus it's not just the energy to run my body -- it's the energy to house me, heat my home, transport my groceries, and so forth. People have way more energy needs than just the kilocalories that fuel them.

If you're using AI productively, I assume it's already much more energy-efficient than the energy footprint of a human for the same amount of work.

exitb|2 months ago

How so? A human needs the entire civilisation to be productive at that level. If you take just the entire US electricity consumption and divide it by the population, you'll get a result that's an order of magnitude higher. And that's just electricity, and just domestic consumption, even though Americans consume tons of foreign-made goods.

Magnets|2 months ago

You didn't consider the 18+ years we spend with almost no productivity, or the extra resources required to sustain life.

rixed|2 months ago

Ah! And don't get me started about how specific its energy source must be! Pure electricity, no less! Where a human brain comes attached with an engine that can power it for days on a mere ham sandwich!

roflmaostc|2 months ago

Try to calculate 12312312.123213 * 123123.3123123.

A computer uses orders of magnitude less energy than a human for that.

It's all about the task; humans are specialized too.

EDIT: maybe add a logarithm or another non-linear function to make the gap even bigger.

redox99|2 months ago

How much energy did evolution "spend" to get us here?

I agree human brains are crazy efficient though.

saagarjha|2 months ago

That’s about the energy a laptop or two uses at full tilt.

FergusArgyll|2 months ago

You can't compare a training run, which produces a file that can be run forever after, to a human day.

windexh8er|2 months ago

Wastefulness aside, the linked article can't even remotely be taken seriously.

> An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually.

What? I let ChatGPT swag an answer on the revenue forecast and it cited $2-6B in revenue per GW-year.

And then we get this gem...

> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.

So now we're going to be spewing ~486 g CO₂e per kWh using something that wasn't designed to run 24/7/365 on these workloads? Datacenters choosing these forms of power should have to secure a local vote, and then publish, and be held to, annual measurements of NOx, CO, VOCs and PM.

This article just showcases all the horrible bandaids being applied to procure energy in any way possible with little regard to health or environmental impact.

libraryofbabel|2 months ago

Does anyone know a really good source for basic information estimating what % of global carbon emissions come from AI training and AI inference, both 1) now and 2) in the future, if we believe AI companies' capacity projections? I would really like to read a detailed analysis of this that avoids both AI hype and anti-AI hysteria. It's an important question, but it excites strong reactions that tend to cloud the facts.

Yes, all sources are biased, but some are useful. And I know that it's hard to get solid data on this from AI companies, but we must have at least a rough estimate?

Please don't tell me to ask ChatGPT about it :)

lefra|2 months ago

US grid carbon intensity is about 384 gCO2/kWh, i.e. 0.384 kg (source: Our World in Data). US datacenter energy use in 2023: 176 TWh (excluding crypto; source: US Congress). How much of that is AI, I couldn't find.

So that's roughly 67 Mt CO2 (I hope I haven't misplaced a decimal point; please double-check). That would be about 1.3% of the 5 Gt of CO2 the US emits per year.

https://ourworldindata.org/grapher/carbon-intensity-electric...

https://www.congress.gov/crs-product/R48646#_Toc207199546

For global emission and future trends the IEA estimates about 500TWh/year globally today, and 1000TWh/year in 2030 (base scenario). Assuming these use the current US grid carbon intensity, that would be about 200MtCO2 today, 400 in 2030. Global CO2 emissions today are 40Gt/year, so that would be 0.5% today, and 1% in 2030 (if global emissions stay stable).

https://www.iea.org/data-and-statistics/charts/global-data-c...
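A quick sanity check of the arithmetic above in Python (note the grid intensity is 0.384 kgCO2/kWh, i.e. 384 g, which is what makes the Mt figures work out):

```python
# Reproducing the back-of-envelope numbers in the comment above.
intensity_g_per_kwh = 384          # US grid, per Our World in Data
us_dc_twh = 176                    # US datacenter use, 2023, excl. crypto

us_dc_mt = us_dc_twh * 1e9 * intensity_g_per_kwh / 1e12  # TWh -> kWh, g -> Mt
print(round(us_dc_mt, 1))                  # 67.6 Mt CO2
print(round(us_dc_mt / 5000 * 100, 2))     # 1.35 (% of 5 Gt US emissions)

# Global IEA estimates: ~500 TWh/yr today, ~1000 TWh/yr in 2030.
for twh in (500, 1000):
    mt = twh * 1e9 * intensity_g_per_kwh / 1e12
    print(round(mt), round(mt / 40000 * 100, 1))  # Mt, % of 40 Gt global
```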

thatfrenchguy|2 months ago

Our kids are not going to be happy we spun up more CO2 generation for this.

memoriuaysj|2 months ago

their uploaded minds will enjoy the infinite AI slop though

pingou|2 months ago

What about renewables + battery storage? Does it take much longer to build? I can imagine getting a permit can take quite a long time, but what takes so long to set up solar panels and link them to batteries, without even having to connect them to the grid?

O5vYtytb|2 months ago

How many batteries is that? If we're talking solar and you have, say, a 300MW datacenter that needs to operate for 12 hours without sun, you need at least two of the largest battery installs in the world[1], at 1700MWh each. That doesn't factor in cloudy days.

[1] https://www.heise.de/en/news/850-MW-World-s-largest-battery-...
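The sizing works out like this (a rough sketch that ignores cloudy days, reserve margins, and depth-of-discharge limits):

```python
# Storage needed for a 300 MW datacenter to ride out 12 sunless hours,
# compared against the 1700 MWh install from the linked article.
load_mw = 300
hours_without_sun = 12
largest_install_mwh = 1700

needed_mwh = load_mw * hours_without_sun
print(needed_mwh)                                  # 3600 MWh
print(round(needed_mwh / largest_install_mwh, 1))  # 2.1 of the largest installs
```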

Aurornis|2 months ago

Reciprocating natural gas engines can be moved from [concrete] pad to pad and be up and running in under 24 hours. The portable turbines take longer but they’re still fast.

Acquiring enough solar panels and battery storage still takes a very long time by comparison.

condensedcrab|2 months ago

Solar's power density is also much lower: coordinating different land parcels, routing power, and getting easements increases the time required vs. on-prem gas turbines.

thunderbird120|2 months ago

Takes much longer to build, requires a much larger up-front investment, and requires a lot more land.

The footprint needed when trying to generate this much power from solar or wind necessitates large-scale land acquisition plus the transmission infrastructure to get all that power to the actual data center, since you won't usually have enough land directly adjacent to it. That plus all the battery infrastructure makes it a non-starter for projects where short timescales are key.

memoriuaysj|2 months ago

Land. Compute what surface area you need for 1 GW of solar.
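Taking up that prompt with a rough sketch; the panel rating, ground coverage, and capacity factor below are my illustrative assumptions, not figures from the thread:

```python
# Back-of-envelope land area for 1 GW of *average* solar output.
# Assumptions (illustrative): 200 W/m^2 panel rating,
# 35% ground coverage at plant scale, 25% capacity factor.
panel_w_per_m2 = 200
ground_coverage = 0.35
capacity_factor = 0.25

target_avg_gw = 1.0
nameplate_gw = target_avg_gw / capacity_factor              # 4 GW nameplate
panel_area_km2 = nameplate_gw * 1e9 / panel_w_per_m2 / 1e6  # W / (W/m^2) -> m^2 -> km^2
site_area_km2 = panel_area_km2 / ground_coverage
print(round(panel_area_km2), round(site_area_km2))          # 20 km^2 panels, 57 km^2 land
```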

codingdave|2 months ago

This is a really long way of saying "We need to burn fossil fuels to make more money."

It didn't make long-term sense for our world before AI. It makes no more sense with AI.

HDThoreaun|2 months ago

More like it’s a really long way to say the government has utterly failed at making sure electricity generation and transmission capacity keeps up with demand so datacenters have been forced to get creative with alternative ways to power themselves. These companies absolutely want to use renewable energy from the power grid but the government blew it.

Aurornis|2 months ago

> This is a really long way of saying "We need to burn fossil fuels to make more money."

Like every other industry in the world?

I’m kind of amazed that AI data centers have become the political talking point for topics like water usage and energy use when they’re just doing what every other energy-intensive industry does. The food arriving at your grocery store and the building materials that built your house also came from industries that consume a lot of fossil fuels to make more money.

a1371|2 months ago

The problem is that most of the AI labs are popping up in TX, which has a uniquely isolated electrical grid. Recall how the Texas cold snap a few years ago took down the grid for days. Turns out that if you build a grid based on short-term profit motives, it's not going to be flexible enough to take on new demand.

It's not a technological limitation of the grid. We could have lived in a world with a more connected grid, more nimble utility commissions, and a lot less methane/carbon emissions as a result.

symbogra|2 months ago

Really cool in depth report, thanks for sharing. It's very interesting to see what these big datacenter deployments are actually doing. Go look at the oil price charts for the last 25 years and you'll see why it makes a ton of sense economically.

I also love how you can see the physical evidence of them pitting jurisdictions against each other from the satellite photos with the data center on one side of a state border and the power generation on the other.

phil21|2 months ago

So all the predictable arguments aside...

Why is no one talking about the "other grid" capacity here?

Natural gas at this scale cannot be delivered by truck. It's piped in direct from fields, typically.

When do we run out of natural gas "grid" capacity in these locations? I can't imagine we're that overbuilt compared to the electrical grid itself?

The big freeze in Texas is a recent example of the natural gas grid having localized "brownouts" due to a few factors - one of which being the demand of all the natural gas peakers trying to fire at once.

Seems like this is the next infrastructure piece to have a supply crunch, to me. There are places (North Dakota) so constrained by capacity to deliver gas to the "grid" that they simply flare it off, because it's cheaper to pay the government to do that vs. lay pipe. This implies to me that natural gas is about to become more valuable.

teknopaul|2 months ago

Economic need* dwarfs problems like an overloaded electric grid.

*greed.

We are well past the point that any economic growth at all is anything but a distribution of income problem.

bob1029|2 months ago

> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.

This seems like a big reach to me. Their largest engine (and it is absolutely massive) "only" produces 80MW of power. The Brayton cycle is unbeatable if you need to keep scaling power up to ridiculous levels.

jsnell|2 months ago

I mean, the claim is certainly nonsensical in the sense that this isn't something Wärtsilä just "realized". They have been in the power plant business for decades. In the oldest financials they have online (the annual report for year 2000) their power plant sales are larger than their marine engine sales.

Really makes me wonder about anything else I've read on Semianalysis. Like, it is such an insane thing to claim and so easy to check. And they just wrote it anyway, like some kind of pathological fabulists.

But what's the part that seems like a "big reach"? Are you saying they didn't sign those contracts? That their customers are making a mistake?

g8oz|2 months ago

They likely use multiple engines.

qchris|2 months ago

I often like SemiAnalysis' work, but parts of this article are shockingly under-researched and completely miss critical parts of the narrative.

> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive.

> Again, clever firms like xAI have found remedies. Elon's AI Lab even pioneered a new site selection process - building at the border of two states to maximize the odds of getting a permit early!

The energy strategy was to completely, and almost certainly illegally, bypass permitting and ignore the Clean Air Act, at a tangible cost to the surrounding community: it measurably increased respiratory irritants like NOx in the air around these neighborhoods. Characterizing this harm as "clever" is wildly irresponsible, and it's telling that the word "illegal" doesn't appear in the article once, while the article handwaves away the fact that permitting for local combustion-based generation is (for these reasons!) one of the main factors pushing out timelines and increasing costs.

[1] https://time.com/7308925/elon-musk-memphis-ai-data-center/

[2] https://www.selc.org/news/resistance-against-elon-musks-xai-...

[3] https://naacp.org/articles/elon-musks-xai-threatened-lawsuit...

bugglebeetle|2 months ago

It’s called “Semi” analysis for a reason. Dylan Patel is the Jim Cramer of industry reporting for this sector.

mikelitoris|2 months ago

More appropriate word is “sly” not “clever”.

credit_guy|2 months ago

Here's my guess: there are lots of datacenters being built in Virginia, Pennsylvania, Indiana, Ohio, Illinois [1]. Also in Texas, Georgia, Arizona, Nevada and Utah.

I think the first 5 states have this in common: there are lots of coal burning power plants that were shut down, but can be restarted and hooked to the grid on a relatively short notice. The grid is also quite good in this region.

In Texas, it is likely that new power can be generated with a combination of solar, wind, gas, and fast permitting.

I don't have an explanation for Georgia.

For Arizona, and perhaps Nevada and Utah too, I think it is likely to be solar.

[1] https://www.axios.com/2025/12/18/data-center-growth-map-stat...

aschla|2 months ago

Don't know about the others, but Illinois permanently shut down (and demolished or repurposed the land) the majority of its coal power plants over the past couple decades.

Illinois gets about half its power from nuclear (we have 6 plants and 11 reactors), followed by natural gas at around 20%, and then about equal amounts of coal and wind, at around 10-15%.

So Illinois is actually a pretty decent place to build datacenters, from a clean power generation perspective.

https://www.eia.gov/state/analysis.php?sid=IL

geetee|2 months ago

Title should be "AI labs are raping the planet"

thrance|2 months ago

> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive. xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines.

Wow, "truck-mounted gas turbines"? Who else could have mastered such a futuristic tech in so short a time? Seriously, who wrote this? Grok? And let's ignore that this needless burning of fossil fuel is making life on Earth harder for everyone and everything else.

sam-cop-vimes|2 months ago

I'm no fan of Musk, but you've got to admit it was a clever way to achieve the goal. SemiAnalysis don't do fanboy articles - their research is pretty in-depth. So they are stating it as they see it.

The problem ordinary people all over the world have is that governments are allowing this to happen. Maybe stricter regulation would prevent players such as Musk from coming up with such "innovations".

sameesh|2 months ago

"xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines."

So they solved the power problem by consuming more fossil fuel. Got it.

Symmetry|2 months ago

I found Boom's pivot much less confusing after this article.

Aurornis|2 months ago

Boom’s pivot to trying to build turbines for data centers wasn’t surprising when data center deployments started using turbines. Either their CEO saw one of the headlines or their investors forwarded it over and it became their new talking point.

What is interesting is how many people saw the Boom announcement and came to believe that Boom was a pioneer of this idea. They’re actually a me-too that won’t have anything ready for a long time, if they can even pull it off at all.

torginus|2 months ago

What I don't get is that, AFAIR, Boom doesn't build engines. Aren't they using some old 50s-60s fighter jet engines?

ptx|1 month ago

There's not a single mention of pollution or clean energy or the environment in the entire article. Presumably the regulatory requirements for these generators are less stringent than for proper power plants, so the costs are pushed onto the rest of society (having to deal with the environmental impact) while Microsoft et al. keep the profits?

tehjoker|2 months ago

Isn't spinning up huge amounts of power on inefficient engines going to make climate impacts worse?

AkelaA|2 months ago

I think it's funny that at no point in the article do they mention the idea of simply making LLMs more efficient. I guess that's not important when all you care about is winning the AI "race" rather than selling a long-term sustainable product.

redox99|2 months ago

If you make it more efficient, then you train it for longer or make it larger. You're not going to just idle your GPUs.

And yes of course it's a race, everything being equal nobody's going to use your model if someone else has a better model.

cl0ckt0wer|2 months ago

They are already power-constrained. Any efficiency improvements would immediately be allocated to more AI.

inkysigma|2 months ago

What makes you think that the entire process isn't being made more efficient? There are entire papers dedicated to pulling out more FLOPs from GPUs so that less energy is being wasted on simply moving memory around. Of course, there's also inference side optimizations like speculative decoding and MoE. Some of these make the training process more expensive.

The other big problem is that you can always increase the scale to compensate for the energy efficiency. I do wonder if they'll eventually level this off though. If performance somehow plateaus then presumably the efficiency gains will catch up. That being said, that doesn't seem to be a thing in the near future.

deflator|1 month ago

Cool article. It's pretty weird that the writer seems to attribute all decisions by xAI to Mr. Musk personally. I doubt he is closely involved in such technical projects.

zzzeek|2 months ago

> However, AI infrastructure cannot wait for the grid’s multiyear transmission upgrades. An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually. Getting a 400 MW datacenter online even six months earlier is worth billions. Economic need dwarfs problems like an overloaded electric grid. The industry is already searching for new solutions.

Wow, that's some logic. Environmentally unsound means of extracting energy directly damage the ecosystem humans need to live in. The need for a functioning ecosystem "dwarfs" "problems" like billionaires not making enough billions. Fixing a ruined ecosystem would cost many more billions than whatever revenue the AI generated while ruining it. So if you're not harnessing the sun or wind (forget the latter in the US right now, btw), you're burning things, and you can get lost with that.

This kind of short sighted thinking is because when folks like this talk about generating billions of dollars of worth, their cerebellums are firing up as they think of themselves personally as billionaires, corrupting their overall thought processes. We really need to tax billionaires out of existence.

dzonga|2 months ago

The rather uninformed question I had (answered in the comments below) was: why not solar? Yeah, hydrocarbons have no competition if you have to deploy power quickly.

1.2GW of turbines has a small footprint compared to the land & batteries needed for solar.

But how do you get gas if you're building in the middle of nowhere and there are no gas lines?

cl0ckt0wer|2 months ago

If you just paid a jillion dollars for a shiny new AI datacenter, would you be ok with just running it during the day?

Apreche|2 months ago

> An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually.

Citation needed.

Aurornis|2 months ago

This is coming from a group that does analysis on the semiconductor and cloud industries and provides very expensive access to their models and info. They are the citation.

strange_quark|2 months ago

Even if that’s true, that seems like a putrid number, no?

Assuming a single 1GW data center runs 24/7, 365 days a year, it consumes 8.76 TWh per year. Only being able to generate $10-$12B in revenue (not profit) per year while consuming as much electricity as the entire state of Hawaii (1.5M people) seems awful.

zozbot234|2 months ago

If you do the math, that's $10-$12 per watt year. There's approx 24×365.25=8766 hours in a year, so assuming that the datacenters would be running 24×7, that boils down to $1.14 to $1.37 in revenue per kWh. That's not a bad deal if power really is a major part of the expense.
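That per-kWh figure falls straight out of the units; a quick check:

```python
# $10-12B per GW-year of revenue, expressed per kWh delivered 24/7.
hours_per_year = 24 * 365.25                # 8766 h
kwh_per_watt_year = hours_per_year / 1000   # one watt for a year = 8.766 kWh

# $B per GW is the same as $ per W, so:
revenue_per_kwh = [round(b / kwh_per_watt_year, 2) for b in (10, 12)]
print(revenue_per_kwh)                      # [1.14, 1.37]
```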

mikelitoris|2 months ago

“Could”, sure... and I “could” fly if I strapped a jet engine to my ass

dkobia|2 months ago

Interesting choice of names: "Solar Turbines" - a wholly owned Caterpillar subsidiary that designs and manufactures industrial gas turbines.

That said, it is all pretty impressive.

goda90|2 months ago

Power problem: solved

Natural Gas supply problem: worsened

Carbon in the atmosphere problem: worsened

saghm|2 months ago

Yeah I guess I'm not the target audience for this because I assumed that "the power problem" was "massive increase in electricity costs for people despite virtually unchanged usage on their part", not "AI companies have to wait too long to be able to start using even more power than they already are":

> Nicole Pastore, who has lived in her large stone home near Baltimore’s Johns Hopkins University campus for 18 years, said her utility bills over the past year jumped by 50%. “You look at that and think, ‘Oh my god,’” she said. She has now become the kind of mom who walks around her home turning off lights and unplugging her daughter’s cellphone chargers.

> And because Pastore is a judge who rules on rental disputes in Baltimore City District Court, she regularly sees poor people struggling with their own power bills. “It’s utilities versus rent,” she said. “They want to stay in their home, but they also want to keep their lights on.”

https://www.bloomberg.com/graphics/2025-ai-data-centers-elec...

imglorp|2 months ago

And the air quality around these plants is poor, leading to health problems for the neighbors.

This short term, destructive, thinking should be criminalized.

I think it's time to discuss changing the incentives around AI deployment, specifically paying into a UBI fund whenever human jobs are replaced by AI. Musk himself raised the idea.

https://www.indexbox.io/blog/tech-leaders-push-for-universal...

einrealist|2 months ago

The word 'pollution' appears exactly one time in this entire thing; the word 'community' or 'communities', never.

Aurornis|2 months ago

The natural gas turbines used are relatively efficient as far as engines go. Having them on-site makes transmission losses basically negligible.

Nothing short of full solar connected to batteries produced without any difficult-to-mine elements will make some people happy, but as far as pollution and fuel consumption go, data centers aren't really a global concern at the same level as things like transportation.

protimewaster|2 months ago

Yeah, that headline made me think "Oh good, there's some solution on the horizon that won't require absurd amounts of electricity."

Not so.

corimaith|2 months ago

Coincidentally the USA is more than self sufficient in natural gas and is a net exporter. Drill baby drill!

torginus|2 months ago

And imagine: all this poorly located, overpriced, haphazardly thrown together, polluting infrastructure will basically get flushed down the toilet once either the AI bubble pops or they figure out a new way of doing AI that doesn't require terawatts of power.

seydor|2 months ago

... and, all this for what ?

josefritzishere|2 months ago

TLDR: They're not reducing power consumption, they're just also using gas now. Buckle up for higher prices, the AI slop factory needs more power.

biddit|2 months ago

The dialog around AI resource use is frustratingly inane, because the benefits are never discussed in the same context.

LLMs/diffusers are inefficient from a traditional computing perspective, but they are also the most efficient technology humanity has created:

> AI systems (ChatGPT, BLOOM, DALL-E2, Midjourney) and human individuals performing equivalent writing and illustrating tasks. Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts.

Source: https://www.nature.com/articles/s41598-024-54271-x