> Cover grid infrastructure costs. We will pay for 100% of the grid upgrades needed to interconnect our data centers, paid through increases to our monthly electricity charges. This includes the shares of these costs that would otherwise be passed onto consumers.
This is great, but do they have an actual example of something that would have been passed on to consumers? Or is it just a hypothetical?
In the location I’m familiar with, large infrastructure projects have to pay their own interconnection costs. Utilities are diverse across the country, so I wouldn’t be surprised if there are differences, but in general I doubt there are many situations where utilities were going to raise consumers’ monthly rates specifically to connect some large commercial infrastructure.
Maybe someone more familiar with these locations can provide more details, but I think this public promise is rather easy to make.
There's a huge diversity of pricing and regulatory schemes across the US. I think your skepticism is well placed in general: where I live in California, the price increases have come almost entirely from bad grid maintenance policies of years past, but people come up with random other excuses.
However, there are some examples where increased demand from one sector leads to higher prices for everyone. The PJM electricity market has a capacity market, where generators get compensated for promising the ability to deliver electricity on demand. When demand goes up, prices increase in the capacity market, and those prices get charged to everyone. In the last auction, prices were sky high, which led to higher electricity prices for everyone:
https://www.utilitydive.com/news/pjm-interconnection-capacit...
A lot of electricity markets in other places allow procurement processes where increased costs to meet demand get passed to all consumers equally. If these places were actually using IRPs (integrated resource plans) with up-to-date pricing, adding new capacity from renewables and storage would lower prices, but instead many utilities go with what they know, gas generators, which are in short supply and coming in at very high prices.
And the cost of the grid is high everywhere. As renewables and storage drive down electricity generation prices, the grid will become a larger and larger percentage of electricity costs. Interconnection is just one bit of the cost; transmission needs to be upgraded all around as overall demand grows. We've gone through a few decades of stagnant-to-declining electricity demand, and utilities are hungry to do very expensive grid projects because they get a guaranteed rate of return on grid expansion in most parts of the country.
North Carolina passed Senate Bill 266, changing how utilities can recover costs for projects under construction amid rising energy demand, particularly from data centers. Now Duke Energy wants a double digit price rate increase: https://starw1.ncuc.gov/NCUC/ViewFile.aspx?Id=0ac12377-99be-...
Amazon tried to buy an existing nuclear plant's output from a company called Talen for a data center colocated with the plant. They would do a special deal so the electricity they bought wouldn't go via the shared grid.
It got blocked by FERC as it would raise other consumers' energy prices and the deal wasn't fully transparent (probably intentionally so they could shift costs onto others).
Georgia Power already has a demand-scaled recovery charge added to bills that increases prices for residential customers regardless of where the demand originates. It used to be applied only occasionally during the summer. Now they've adjusted the peak/off-peak rates to what they used to be plus the demand recovery, and the demand recovery is charged on top and applies pretty much all the time.
Generally, most distribution costs are socialized, going back to the REA (Rural Electrification Administration) and similar programs. My block needed a new transformer a few weeks ago and it will be paid for by every customer of that utility.
"Committing to buying the glass to replace the window I broke in your shop to rob the place, you're welcome."
> Training a single frontier AI model will soon require gigawatts of power, and the US AI sector will need at least 50 gigawatts of capacity over the next several years.
These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.
Quite the opposite, really. I did some napkin math for energy and water consumption, and compared to humans these things are very resource efficient.
If LLMs improve productivity by even 5% (studies actually peg productivity gains across various professions at 15-30%, and these are from 2024!) the resource savings from accelerating all knowledge workers are significant.
Simplistically, during 8 hours of work a human consumes about 10 kWh of electricity + 27 gallons of water. Sped up by 5%, that drops by 0.5 kWh and 1.35 gallons. Even assuming the higher end of resources used by LLMs, 100 large prompts (~1 every 5 minutes) would only consume 0.25 kWh + 0.3 gallons. So we're still saving ~0.25 kWh + 1 gallon overall per day!
That is, humans + LLMs are way more efficient than humans alone. As such, the more knowledge workers adopt LLMs, the more efficiently they can achieve the same work output!
If we assume a conservative 10% productivity speedup, adoption across all ~100M knowledge workers in the US will recoup the resource cost of a full training run in a few business days, even after accounting for the inference costs!
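The napkin math above can be sanity-checked in a few lines. Every input below is the comment's own assumption (10 kWh and 27 gallons per 8-hour workday, ~2.5 Wh and ~11 mL per large prompt), not a measured value:

```python
# Sanity check of the napkin math above. All inputs are the comment's
# stated assumptions, not measured values.
human_kwh = 10.0    # electricity attributed to one 8-hour workday
human_gal = 27.0    # water footprint over the same workday
speedup = 0.05      # assumed 5% productivity gain

saved_kwh = human_kwh * speedup   # 0.5 kWh of human resource use avoided
saved_gal = human_gal * speedup   # 1.35 gallons avoided

prompts = 100                     # ~1 large prompt every 5 minutes
llm_kwh = prompts * 0.0025        # high-end estimate: ~2.5 Wh per prompt
llm_gal = prompts * 0.003         # high-end estimate: ~11 mL water per prompt

net_kwh = saved_kwh - llm_kwh     # ~0.25 kWh net saving per worker-day
net_gal = saved_gal - llm_gal     # ~1.05 gallons net saving per worker-day
print(f"net saving: {net_kwh:.2f} kWh, {net_gal:.2f} gal per worker-day")
```

The conclusion is only as good as the assumed per-prompt and per-human figures, but the arithmetic itself checks out.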
Additional reading with more useful numbers (independent of my napkin math):
https://www.nature.com/articles/s41598-024-76682-6
https://cacm.acm.org/blogcacm/the-energy-footprint-of-humans...
> "Committing to buying the glass to replace the window I broke in your shop to rob the place, you're welcome."
Buying electricity isn't inherently destructive. That's a very bad analogy.
> These things are so hideously inefficient. All of you building these things for these people should be embarrassed and ashamed.
I'm not arguing that they are efficient right now, but how would you measure that? What kind of output does it have to make per kWh of input to be acceptable? Keep in mind that the baseline of US power use is around 500 GW and that currently AI is maybe 10.
Adding new electricity demand to the grid should not be viewed as breaking windows and robbing others. When I bought an EV, I increased my electricity demand a huge amount, but it's not like I'm stealing from my neighbors. No rules were broken. We just need to make sure that I pay enough for my additional demand.
> AI sector will need at least 50 gigawatts of capacity over the next several years.
The error bars on this prediction are extremely large. It would represent a 5% increase in capacity in "the next several years" which is only a percent or two per year, but it could also only be 5GW over the next several years. 50GW represents about 1 year of actual grid additions.
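The scale claim is easy to check. The capacity figure below is an assumed round number (US nameplate generating capacity is on the order of 1,000-1,300 GW), not an official statistic:

```python
# Rough scale check for the "50 GW over the next several years" claim.
# us_capacity_gw is an assumed round number, not an official figure.
us_capacity_gw = 1000
ai_build_gw = 50
years = 3                              # reading "several years" as ~3

share = ai_build_gw / us_capacity_gw   # ~5% of existing capacity
per_year = share / years               # ~1.7% per year
print(f"{share:.0%} of capacity total, {per_year:.1%} per year")
```

So even the high end of the prediction works out to a percent or two of capacity growth per year, consistent with the point above.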
> All of you building these things for these people should be embarrassed and ashamed.
I'm not building these things, and I think there should be AI critique, but this is far over the top. There's great value for all of humanity in these tools. The actual energy use of a typical user is not much more than a typical home appliance, because so many requests are batched together and processed in parallel.
We should be ashamed of getting into our cars every day; that's a true harm to the environment. We should have built something better and allowed more transit. A daily commute of 30 miles is disastrous for the environment compared to any AI use that's really possible at the moment.
Let's be cautious of AI but keep our critiques grounded in reality, so that we have enough powder left to fight the rest of things we need to change in society.
We've moved past asking where the energy comes from or how our planet will survive this critical phase.
These days, it's about framing - every country is scrambling to up their game just to stay in power. The companies that are riding this wave are spending millions in marketing, lobbying and billions on consuming energy so that they can make trillions in valuation.
I am also an ardent user of AI - but sometimes I do feel guilty when I use so many tokens - because I know I am burning energy, and feeding part of this mission. If there is a solution, I would like to be a part of it.
> I am also an ardent user of AI - but sometimes I do feel guilty when I use so many tokens - because I know I am burning energy, and feeding part of this mission. If there is a solution, I would like to be a part of it.
This is by far the best article I've seen on it [0]. Which leads me to conclude: if you use coding agents, then yes, it's definitely a concern. Yet if you drive daily, even an EV, it's very small compared to that. Let alone flying. Personally, even if my "AI emissions" are at 10x his estimated usage (they almost certainly aren't), the other sacrifices I make to reduce emissions have such an impact that I'd still be multiple times below the national average.
Note how the above measures energy usage (kWh), not emissions. For anyone taking fossil-fuel transport regularly, whether ICE car/taxi/airplane, AI usage is all but guaranteed to be meaningless compared to their transport emissions. One hamburger is at least 5x more emissions than his "median day with Claude Code", so there's another one. If you're feeling guilty, track how much beef you're eating, cut it down by 20%, and use agents to your heart's content.
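A back-of-envelope comparison makes the gap concrete. Every number here is an illustrative assumption (a generous ~1 kWh/day of heavy agent use, ~0.4 kg CO2 per kWh of grid electricity, ~0.4 kg CO2 per mile for an ICE car), not a measurement:

```python
# Back-of-envelope comparison; all inputs are illustrative assumptions.
agent_kwh_per_day = 1.0   # generous allowance for heavy coding-agent use
grid_kg_per_kwh = 0.4     # rough grid-average emissions intensity
ice_kg_per_mile = 0.4     # rough ICE car emissions per mile
commute_miles = 30        # the daily commute mentioned in the thread

agent_kg = agent_kwh_per_day * grid_kg_per_kwh   # ~0.4 kg CO2/day
commute_kg = commute_miles * ice_kg_per_mile     # ~12 kg CO2/day
print(f"commute is ~{commute_kg / agent_kg:.0f}x the agent's emissions")
```

Under these assumptions the commute dominates by more than an order of magnitude, which is the point being made above.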
Now of course, a different form of AI usage like image generation, and especially video generation, is incomparably more energy-intensive per query. We'd need separate math for that.

[0] https://www.simonpcouch.com/blog/2026-01-20-cc-impact/
> sometimes I do feel guilty when I use so many tokens
There's nothing particularly worse about money spent on AI vs. anything else. I don't feel guilty for having 6 shirts even though I can only wear one at a time.
> how our planet will survive this critical phase.
> trillions in valuation.
This is more or less literally the "yes we destroyed the planet, but for a brief moment we created trillions in shareholder value" meme. Perhaps we need to take a step back and ask to what extent this benefits humans as humans, not as economic units. Especially given the explicit threat in the AI marketing material to destroy all creative industries and replace human fulfilment and even connection with AI.
There are people who recognize there is a problem and would support collective action to fix it, and there are those that don't. As long as you are in the first group, there's nothing else you can individually do to make a difference.
I'd rather have the government tax these entities (a great way to get the public to support a VAT, in this instance) than rely on their "benefactors", who have shown zero remorse for the societal destruction inflicted on the planet and humanity, but okay.
Utilities do charge infrastructure projects for their interconnection costs. Maybe there was some hypothetical situation where some costs would have gone into a general budget, but utilities aren’t usually in the habit of doing large interconnection projects for free and sending the bills to consumers.
One of the potential upsides of AI in the USA is that we'll bring down electricity prices compared to somewhere like China. Power has to be abundantly plentiful and concentrated.
Maybe then, we could afford to smelt an ingot of aluminum in the USA.
Until then, I guess we're sadly just burning coal to create cat memes. I hope Anthropic can lead the charge. Crypto was already a massive setback in terms of clean power, and AI is already very dirty.
How will that happen? China is building more generation as we speak. And I mean a lot. The gap is widening and the rate of change is even worse, thanks to "clean beautiful coal" or whatever Trump said.
Electricity pricing is a weird beast. Everyone pays the price of the most expensive electricity source (generally gas plants) that was recruited to meet demand. It means that during a spike the electricity price can double or triple.
What I infer from Anthropic's post is that they will estimate what the energy price would have been without their usage and pay the difference if their use pushed the price up.
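The marginal-pricing mechanism described above can be sketched in a few lines. The generator names, capacities, and offer prices are made up for illustration:

```python
# Merit-order dispatch: offers are sorted by price, and every dispatched
# generator is paid the offer of the most expensive one needed to meet demand.
# Capacities (MW) and offers (USD/MWh) are illustrative, not real market data.
offers = [
    ("solar",      200,   5),
    ("nuclear",    300,  30),
    ("gas_ccgt",   400,  60),
    ("gas_peaker", 100, 180),
]

def clearing_price(demand_mw: float) -> float:
    supplied = 0.0
    for _name, capacity_mw, price in sorted(offers, key=lambda o: o[2]):
        supplied += capacity_mw
        if supplied >= demand_mw:
            return price  # everyone dispatched is paid this marginal price
    raise ValueError("demand exceeds total capacity")

print(clearing_price(450))  # off-peak: nuclear sets the price
print(clearing_price(950))  # spike: the peaker sets it, a 3x jump
```

This is why a demand spike that recruits an expensive peaker raises the price paid to every generator, and hence the price charged to every consumer.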
Gas plants are only the most expensive in the simple-cycle configuration. Combined-cycle plants are competitive with other forms of baseload generation. The trouble is the response time.
With day ahead forecasting, we can try to turn that peak load into base load. Grid operations are a non trivial part in how this AI energy situation plays out.
I see this as another OPEX expenditure that has to be factored into Anthropic’s (hypothetical) profitability, and am intrigued as to what this means in an industry that is becoming rife with CAPEX sinks…
This is all well and good as long as investors are willing to pour money into the bubble. When the music stops is when we will see the true colors. Corporations are optimized to make money; governments should be optimized to protect people.
I keep saying it: we need to ditch all the "old" styles of power generation, yes, including green ones like solar. We should all be pushing for SMRs!
> Cover grid infrastructure costs. We will pay for 100% of the grid upgrades needed to interconnect our data centers, paid through increases to our monthly electricity charges.
How does paying more monthly cover an infrastructure build out that requires up front capital?
You should start with the beef industry.
"We will cover the cost of upgrading the electricity grid so we can use more energy" yeah. Of course you will. What?
Go to the bathroom upstairs, don't use our ecosystem as your latrines if you can direct it straight to the CMB.
We lived without aluminum soda cans for 100k years
See, the AI is gonna create jobs, not eliminate them lol. Now let us strip mine your hood G.