For 20 million users? Great, you just proved that AI's energy impact is not big.
That's like a fraction of 20 million users overusing AC, as happens, for instance, in the US. Or flying for work continuously without a good reason, also super common and incredibly more wasteful (~30 hours of 737 flight time is roughly the energy needed for the training, not inference, of a model with hundreds of billions of parameters).
The point about AI and energy does not make sense; it exists only because some people are worried about AI and need to find something to show it causes damage.
You can install Flux on your laptop and generate images there, and you'll see that making it as hot (or discharging the battery as quickly) as playing any modern videogame takes a lot of effort and a lot of generated images. And that's while running on battery...
Want to talk about all these stupid JavaScript frameworks, which make loading a trivial page so wasteful, energy-wise? Almost everything is worse than AI, basically. The fact that this post is upvoted here on HN, where once there was more critical thinking, is a testament to how badly we are doing nowadays.
To add some extra numbers here, just to show how little energy usage this is.
This means it's adding about 0.012% to those users' energy consumption.
From another angle: average US house energy consumption is around 30 kWh per day. 0.0125% of that is 3.75 watt-hours of energy per day. This is the equivalent amount of energy as streaming HD video to your iPhone on a 4G network for 1.5 seconds. [0]
So in other words, a 15-second YouTube ad you are forced to watch on your phone, before the video you were going to watch anyway, takes an order of magnitude more energy than the average AI user's daily usage, according to this article.
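As a quick sanity check of the arithmetic above (a sketch using the comment's own figures; the stated 0.012% reads as a rounding of 0.0125%):

```python
# Per-user daily AI energy as a share of average US household consumption.
house_kwh_per_day = 30     # average US household, per the comment
ai_share = 0.000125        # 0.0125%, the per-user share implied above

ai_wh_per_day = house_kwh_per_day * 1000 * ai_share
print(ai_wh_per_day)       # 3.75 Wh per day, matching the comment
```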
Is it not still better to just avoid using all this energy though? Do we really need to have these LLM tools everywhere?
And more importantly, when do we stop justifying the use of one finite resource by pointing to all the other ways we could reduce use but don't? Just because people waste a bunch of energy, water, wood, etc. doesn't justify using a resource however you want, at least in my opinion.
I am apt to agree, but I reject this whole argument completely. This is just part of a de-growth agenda where every action is looked at through the lens of scarcity. Sometimes it's the environment, other times it's equity, but it always leads to extinguishing human flourishing.
Examples:
- Having babies is bad because it increases the carbon footprint
- Having babies is bad because the world is over-populated
- Giving advantages to your children is bad because others aren't as advantaged
- Why invest in rockets when people are starving in [country]
- Honors classes are bad because the distribution of the participants is not proportional to the population
It's all just so exhausting and at this point I refuse to engage with it at any level. There is no winning.
Try putting it in a different context because I think you are weighing both “AI” and “household operation” the same. For the use case you mentioned (image generation) what do you think has a bigger impact on someone’s quality of life: 1) operating a home or 2) generating AI images?
I would argue most people would weigh the convenience of a house a lot higher. That doesn't mean AI is a net negative, just that it's important to frame the problem in a way that focuses on what matters most to people (quality of life).
This article only discusses image generation, a small subset of AI use cases. As users, governments, corporations, and background processes grow to consume AI all day, the water and power issues will grow along with them. Whether it's AI or JavaScript, action is needed ahead of shareholder value.
Your numbers aside, how do you explain the prevailing "we need to build more power plants for AI" sentiment in some circles? Surely that indicates a significant carbon footprint.
(Especially when you consider how many are looking to coal and gas.)
Also, there's immense variation in the carbon intensity of electricity generation - an order of magnitude difference between Australia and France. How much electricity we use matters far less than how we're making that electricity.
Honest question: I remember some recent news about Microsoft considering bringing an "old" nuclear power plant back into service to cover their (maybe projected?) cloud power consumption due to AI. If AI's power consumption is negligible and less than a short YouTube ad anyway, why the need to bring a nuclear power plant back into operation?
> For 20 million users? Great, you just proved that AI energy impact is not big.
You missed that this was just a calculation of Midjourney power use, and you missed the next paragraph:
"Keep in mind that our example only touches image generation services. Let's not forget about other AI services like ChatGPT, Gemini, Claude, Deepseek—and all their versions and variations. The list, and energy waste, continues to grow."
> Almost everything is worse than AI, basically.
I am glad you brought this up. AI just adds to it all.
I 100% guarantee you don't recycle. Perfect, enemy of the good. That said, I couldn't agree more about air travel, or travel in general; a monument to man's self-indulgent waste.
Doing calculations like these is a really bad idea because they muddy the waters, putting the focus on disputable numbers - unless you're willing to do a proper study and really dive into it. It's extremely tiring seeing all these well-meaning people, including possibly this author (unless they're writing this for clout), making this mistake time and time again. You're actively not helping.
There's a much better heuristic. Just look at the contracts that the model providers (MS, Google et al) have recently and suddenly established to buy lots of electricity, spin up nuclear power plants, and so on. That's the giveaway, and one that doesn't need any calculations. Plus the hundreds of billions being invested in datacentres purely for these models.
You see it in this very comment section: lots of naysayers, purely based off the provided, utterly meaningless numbers, when the above facts say it all and render everything else moot. If the training and inference didn't cost insane amounts of energy, they wouldn't be hastily taking up these contracts and investing such obscene amounts in datacentres. That's all you need to know.
We also need to consider that AI is being used as a replacement for things that use a lot less energy. For example, someone "asking ChatGPT" a simple question uses a lot more energy than a traditional search engine. This gets worse when AI is shoehorned into random stuff, making that stuff less efficient.
Completely agree. There are so many additional layers of electrical (and other) costs not factored in:
- The build/infrastructure costs, especially for the early models, before the A100 and before datacenters jumped on the bandwagon
- Training costs of the model being used
- Training costs of failed models and tests
- Training costs of previous models that were superseded before their cost could be recaptured from users
The actual energy used by AI is likely orders of magnitude higher than estimated here, but without really justifying and sourcing all the numbers used, it's going to be a never-ending argument between the pro- and anti-AI crowds, with the realists stuck in between trying to read the data.
In the example you compare something with 20 million daily users to the energy usage of 25,000 homes. I feel like comparisons like this mostly work (and feel scary) because we don't really grasp how much bigger 20 million really is. It feels a bit like scaremongering to me (without providing context of AI energy use vs other online activities/services).
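The per-user share implied by the article's own comparison can be computed directly (a sketch using only the two numbers above):

```python
# How much of one household's usage does each daily user add?
daily_users = 20_000_000
homes_equivalent = 25_000   # the article's comparison figure

share_per_user = homes_equivalent / daily_users
print(share_per_user)       # 0.00125 -> each user adds ~1/800 of one home's usage
```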
This is a great point regarding what we ought to consider when adapting our lifestyle to reduce negative environmental impact:
> In deciding what to cut, we need to factor in both how much an activity is emitting and how useful and beneficial the activity is to our lives.
Although I would extend "our lives" to "society". His own example of a hospital emitting more than a cruise ship is a good illustration of this; and, as a more absurd example, it would drastically cut emissions if we removed all humans and replaced them with LLMs (which sort of defeats the entire point, obviously, because LLMs would no longer be needed).
Continuing this line of thought, when considering your use of an LLM, you ought to weigh not merely its emissions and water usage, but also the larger picture as to how it benefits the human society.
For example: Is it based on ethically sound approaches? (If it is more like "ends justify the means", do we even know what those ends are?) What are its foreseeable long-term effects on human flourishing? Will it (unless regulated) harm the livelihoods of many people while increasing the wealth gap with the tech elites? Does it negatively impact open information sharing (the willingness to run self-hosted original content websites or communities open to the public, or even the feasibility of doing so[0][1]), the motivation and capability to learn, creativity? And so forth.
Is it energy waste if it does something productive? For me, energy waste is energy which, if not used, wouldn't have any noticeable impact.
For instance, most city lights are on even when nobody needs them (using smart activators would allow us not to waste that energy).
Energy used to manufacture items which are never used, that's truly wasted.
But generating images, text and code to allow people to make decisions, iterate on ideas, etc. That's not a waste in my opinion, that's actually using energy in one of the most interesting parts of our lives: being creative.
On my MBP M2 Max 16" with 92GB of RAM, 2 prompts on llama3.3:70b or 1 prompt on deepseek-r1:70b take about 10% of battery. Really makes me nervous when I think about people (ab)using gpt-o1 or 4o for everything.
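Back of the envelope, the battery figure above implies roughly this much energy per local prompt (a sketch; the ~100 Wh battery capacity for a 16" MacBook Pro is my assumption, not from the comment):

```python
battery_wh = 100         # assumed capacity of a 16" MacBook Pro battery (~100 Wh)
battery_fraction = 0.10  # "about 10% of battery", per the comment
prompts = 2              # 2 llama3.3:70b prompts in that 10%

wh_per_prompt = battery_wh * battery_fraction / prompts
print(wh_per_prompt)     # ~5 Wh per local 70B prompt
```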
For a prompt to make the cut, I usually judge whether the task (or the documentation) is boring enough, time-consuming enough, yet precise enough for an LLM to give me the solution. If something requires more than three to five prompts, then I need to work on it myself and come back with smaller and more precise questions.
IMO, if the green energy industry wants to penetrate an important domain of modern life fast, the serving of open source models seems like a really low hanging fruit.
If something uses electricity then the footprint is mostly about the source of that electricity.
Solar, wind, and nuclear are orders of magnitude cleaner than fossil fuels.
So the correct answer is to clean up the grid, which is relatively easy and cheap, and which benefits many fields outside AI, further speeding us down the cost curve to cheaper, cleaner power.
And once that is happening, you can start moving things that don't even use electricity at the moment like transport, heating and industrial processes to electrical usage. That task, which we should do because it's cheaper and cleaner and will prevent catastrophic climate change, will require advanced nations to double their electricity usage so data center usage is not really moving the needle.
You can also build the datacenters somewhere where there is plenty of carbon free energy available. Google has just recently bought two plots in Northern Finland, near some serious grid connections, and they're not the only one with plans.
Finland can build, and has been building, a lot of wind power lately, since it's flat and quite sparsely populated but with relatively good infrastructure. There are a lot of projects in the planning pipeline just waiting for demand. Electricity prices are already among the lowest in Europe.
At the same time, let's also not rush to hand wave away issues just because "others are likely worse".
Those industries are also an issue, and the question is applicable there too. In fact, the effort to understand and reduce the impact of tourism on the environment is ongoing.
And exactly because AI is an emerging technology, now is a good time to be conscious of its environmental impact: it is easier to address issues now than later down the line, when it is no longer emerging and already ingrained.
Aggregating load is useful to get a sense of overall scale, but per use costs are probably a more realistic proxy. One query is ~2Wh. At average retail pricing (17.5c per kWh) that's like... 0.035c per query. You're at like 30 queries before reaching a second cent of electricity.
A monitor is like ~10-20W. So one query is like having your monitor on for an extra 12 minutes a day.
At this scale, there's going to be no consumer side incentive to conserve.
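The per-query arithmetic above, spelled out (a sketch using the comment's own figures; the monitor comparison uses the 10 W end of the stated range):

```python
wh_per_query = 2             # ~2 Wh per query
price_cents_per_kwh = 17.5   # average US retail electricity price

cost_cents = wh_per_query / 1000 * price_cents_per_kwh
print(cost_cents)            # 0.035 cents per query

queries_per_cent = 1 / cost_cents
print(queries_per_cent)      # ~28.6, i.e. roughly 30 queries per cent

monitor_watts = 10
minutes = wh_per_query / monitor_watts * 60
print(minutes)               # 12 minutes of monitor time per query
```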
The author seems to assume that each daily user runs a new prompt every hour. I suspect the actual median number of images per DAU is lower than that. A user consuming 20 Wh per day is roughly heating a single meal in the microwave; it's probably not a leading cause of their carbon footprint.
If, in order to offset the cost of my AI use, I had to reduce the thermostat by 1/10 of a degree, I'd take that trade. But I'm pretty sure that would reduce my carbon footprint much more than stopping AI use would.
Gov should fine/tax emissions. If the emissions are too high then tax it more. End of story.
Does it really matter who is doing it and for what purpose?
This reminds me of the internet going crazy over Taylor Swift flying everywhere with her private jets.
It just smells like trying to take blame away from the gov and putting it on private people/companies who are just following the law.
As long as you’re paying a fine proportional to the damage you’re causing, I don’t care whether it’s for AI or crypto or just pumping carbon for the sake of it.
"Can we calculate the energy cost of AI models available as services online? The truth is-we can't, not without access to their datacenters."
If "AI" energy use is nothing to be concerned about, as alleged by the top comment, then the "AI" companies might benefit from releasing data showing their (low) energy and water consumption. As it happens, none of them will release the data the public wants about energy consumption. It is believed this withholding of data is intentional.
Can't find where the A100 figure comes from, but an RTX 4090 can generate more than an image a second[0], which assuming constant power usage of 450W would be 0.000125kWh - compared to, say, 2.4kWh for a pot of coffee[1].
If you're running it as a service, you'd likely optimize the model with TensorRT and run on hardware intended for inference, instead of consumer/training GPUs, so I don't think either of these figures can be extrapolated to Midjourney.
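A quick check of the per-image figure above (a sketch; 450 W sustained for one second per image is the comment's stated assumption, and an upper bound given "more than an image a second"):

```python
gpu_watts = 450          # RTX 4090 at assumed constant load
seconds_per_image = 1.0  # upper bound, per the comment

kwh_per_image = gpu_watts * seconds_per_image / 3_600_000  # watt-seconds -> kWh
print(kwh_per_image)     # 0.000125 kWh, i.e. 0.125 Wh per image
```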
Don’t forget the US administration’s plan to “win the AI race,” which includes building a ridiculous number of new data centres and methane power plants… and to drill, baby, drill.
AI tech is quickly becoming a concerning contributor to the mix of emissions driving climate change.
I realize that in some future state someone, somewhere will figure out how to make hardware that draws less power and requires less cooling. Some researcher at some point will find a more efficient algorithm. And all of it will run off of green energy. I don’t think we have time to mess around and find out if that will happen.
We’ve recently seen years at about 1.5C, and if we continue on this course it’s quite likely that will become the average.
AI might play a significant role in contributing to the crisis. And once VC and government subsidies run out, if these companies have to pay for the externalities, will it still be profitable?
This doesn't account for the servers they are hosted in, their CPU loads, or losses in the PSU, all of which inflate those numbers further.
Then there is the cooling, which needs to remove every watt consumed by the GPUs/servers, as it all comes out as heat (multiplied by the PSU losses).
Things are amplified further by the inefficiencies of the cooling process, including cooling the HVAC hot side (and the water consumption of the cooling towers), and building insulation.
Then the article only accounts for Midjourney, which is only one of the operators in this space. All the major tech companies have huge fleets of servers working on this.
[0] https://www.statista.com/statistics/1109623/electricity-cons...
jdietrich | 1 year ago
https://ourworldindata.org/grapher/carbon-intensity-electric...
amelius | 1 year ago
> For 20 million users?
But during how many hours are these users active?
anal_reactor | 1 year ago
So basically, if we want to help the environment, we should replace people with AI
causal | 1 year ago
Energy use is worth discussing, but this is worse than just guesswork.
michielr | 1 year ago
Anyhow, another perspective is given here: https://andymasley.substack.com/p/individual-ai-use-is-not-b... This piece aligns more with my view on AI and its energy usage.
[0] https://news.ycombinator.com/item?id=42486481
[1] https://news.ycombinator.com/item?id=42549624
wiether | 1 year ago
They say:
> As of January 2025, Midjourney has nearly 20 million daily active users.
No, that's the total number of users on the Discord server. Completely different from "daily active".
They say:
> assuming each user executes one prompt per hour in average
To be more realistic (most humans sleep for a few hours each day...), it should be 24 prompts per day on average.
Sure, some users can make a lot of queries each day, but I don't see how a regular user would need 96 images (at 4 images per prompt) each day.
So those figures make no sense: they are based on wrong numbers in the first place, and on completely unrealistic assumptions.
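Since the headline estimate is linear in both disputed inputs (the daily-active-user count and prompts per day), a small sketch shows how it rescales; the 1 million daily actives and 4 prompts per day below are purely hypothetical illustration values, not sourced figures:

```python
def relative_energy(users, prompts_per_day,
                    base_users=20_000_000, base_prompts=24):
    """Energy relative to the article's assumed user count and prompt rate."""
    return (users * prompts_per_day) / (base_users * base_prompts)

# Hypothetically, if only 1M of the "20M" are daily active, each making
# 4 prompts/day, the estimate shrinks by a factor of ~120:
print(relative_energy(1_000_000, 4))  # ~0.0083, about 1/120 of the estimate
```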
brap | 1 year ago
Emissions too high? Increase the fine.
1vuio0pswjnm7 | 1 year ago
https://www.nature.com/articles/d41586-024-00478-x
https://www.thegazette.com/business/artificial-intelligence-...
https://www.forbes.com/sites/geruiwang/2025/01/24/stargates-...
https://www.bizjournals.com/pittsburgh/inno/stories/news/202...
https://www.unep.org/news-and-stories/story/ai-has-environme...
https://news.mit.edu/2025/explained-generative-ai-environmen...
[0]: https://cdn.mos.cms.futurecdn.net/RtAnnCQxaVJNYgA4LbBhuJ-970... [1]: https://electricityplans.com/how-much-electricity-does-a-cof...
JKCalhoun | 1 year ago
Is it because the energy use of these other "services" is in the noise?