top | item 43544197


iteratethis | 11 months ago

Just out of curiosity: I wish online LLMs would show real-time power usage and actual dollar cost as you interact with them. It would be so insightful to understand to what degree the technology is subsidized and what the actual value-to-cost ratio is.

I've read somewhere that generating a single AI image draws as much power as a full smartphone charge.

If the suspicion is true that costs are too high to monetize, then the current scale-up phase is going to be interesting. Right now, people have the occasional chat with AI. That's quite a different scenario from having it integrated across every stack and constantly running in the background, for billions of people.

Late as they may be, for the consumer space I think Apple is clever to push as much as possible to the local device.


Saigonautica|11 months ago

Also out of curiosity, I did some quick math regarding that claim you read somewhere.

Cellphone battery charge: I have a 5000 mAh cellphone battery. If we ignore charging losses (pretty low normally, though I'm not sure at 67 W fast charging), that battery stores about 18.5 watt-hours of energy at a nominal 3.7 V, or about 67 kilojoules.

Generating a single image at 1024x1024 resolution with Stable Diffusion on my PC takes somewhere under a minute at a maximum power draw under 500 W. Let's cap that at 500 W × 60 s = 30 kilojoules.

So it seems plausible that for cellphones with smaller batteries, and/or intense image generation settings, there could be overlap! In typical cases, I think you could get a few (low single digits) AI-generated images for the power cost of a cellphone charge, maybe a bit better at scale.

So in other words, maybe "technically incorrect" but not a bad approximation to communicate power use in terms most people would understand. I've heard worse!
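As a quick check, the arithmetic above can be reproduced in a few lines. The 3.7 V nominal Li-ion cell voltage is an assumption on my part; the other figures come straight from the comment.

```python
# Back-of-envelope: battery energy vs. energy for one Stable Diffusion image.
battery_kj = 5.0 * 3.7 * 3.6   # 5 Ah * 3.7 V = 18.5 Wh; 1 Wh = 3.6 kJ -> ~66.6 kJ
image_kj = 500 * 60 / 1000     # 500 W for 60 s = 30,000 J -> 30 kJ
images_per_charge = battery_kj / image_kj
print(round(images_per_charge, 1))  # -> 2.2
```

So one phone charge buys roughly two of these worst-case images, matching the "low single digits" estimate.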

elpocko|11 months ago

My PC with a 3060 draws 200 W when generating an image, and at that resolution it takes under 30 seconds; in some configurations (LCM), well under 10. That's a low-end GPU. High-end GPUs can generate at interactive frame rates.

You can generate a lot of images with the energy you would use to play a game instead for two hours; generating an image for 30 seconds uses the same amount of energy as playing a game on the same GPU for 30 seconds.
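The same point in numbers, using the commenter's figures (200 W, 30 s per image): energy is just power times time, so generation and gaming on the same GPU are directly comparable.

```python
# Energy parity: image generation vs. gaming on the same 200 W GPU.
gpu_watts = 200
image_wh = gpu_watts * 30 / 3600          # 30 s per image -> ~1.7 Wh
gaming_wh = gpu_watts * 2                 # two hours of gaming -> 400 Wh
images_per_gaming_session = gaming_wh / image_wh
print(round(images_per_gaming_session))   # -> 240
```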

gdhkgdhkvff|11 months ago

One point missing from this comparison is that cell phones just don't take much electricity to begin with. A very rough calculation is that it costs around 0.2 cents to fully charge a cell phone; you spend maybe around $1 PER YEAR on charging per phone. Cell phones are just surprisingly not energy-intensive.
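A sanity check of that "$1 per year" claim. The 13 Wh battery size and the $0.16/kWh US-average rate are my assumptions, not the commenter's figures.

```python
# Rough annual cost of charging a phone once a day.
battery_kwh = 13 / 1000
rate_usd_per_kwh = 0.16
cost_per_charge = battery_kwh * rate_usd_per_kwh   # ~$0.002, i.e. ~0.2 cents
annual_cost = cost_per_charge * 365                # one full charge per day
print(round(annual_cost, 2))  # -> 0.76
```

About 76 cents a year, consistent with the "around $1 per year" figure.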

mrob|11 months ago

How about if you cap the power of the GPU? Modern semiconductors have non-linear performance/efficiency curves, so it's often possible to get big energy savings with only a small loss in performance.

bluefirebrand|11 months ago

> Generating a single image at 1024x1024 resolution

That's not a very big image, though. Maybe if this were 25 years ago

You should at least be generating 1920x1080, pretend you're making desktop backgrounds from 10 years ago

facile3232|11 months ago

> Generating a single image at 1024x1024 resolution with Stable Diffusion on my PC takes somewhere under a minute at a maximum power draw under 500W

That's insane, holy shit. That's not even a very large image.

Apparently I was off on my estimates about how power hungry gpus are these days by an order of magnitude.

nkrisc|11 months ago

A 1024x1024 image seems like an unrealistically small image size in this day and age. That’s closer to an icon than a useful image size for display purposes.

tzs|11 months ago

> I've read somewhere that generating a single AI image draws as much power as a full smartphone charge.

To put that in perspective, using the 67 kJ of energy for a smartphone charge given in Saigonautica's comment you can charge a smartphone 336 times for $1 if you are paying the average US residential electricity rate of just under $0.16/kWh.

You could charge a smartphone 128 times for $1 if you were in the state with the most expensive electricity (Hawaii) and paying the average rate there of around $0.42.

Saigonautica's battery is on the large side. It's a little bigger than the battery of an iPhone 16 Pro Max. A plain iPhone 16 could be charged 470 times for $1 at average US residential electricity prices.

For most people energy used to charge a smartphone is in the "this is too small to ever care about" category.
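These charges-per-dollar figures are easy to reproduce. The ~13.3 Wh iPhone 16 battery energy is my assumption; the 67 kJ figure and the electricity rates come from the thread.

```python
def charges_per_dollar(battery_wh, usd_per_kwh):
    """How many full charges $1 buys, ignoring charger losses."""
    return 1 / ((battery_wh / 1000) * usd_per_kwh)

print(round(charges_per_dollar(67 / 3.6, 0.16)))  # 67 kJ battery, US average -> 336
print(round(charges_per_dollar(67 / 3.6, 0.42)))  # Hawaii rate -> 128
print(round(charges_per_dollar(13.3, 0.16)))      # assumed iPhone 16 -> 470
```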

We can do a similar calculation for AA rechargeable batteries, and the results might be surprising.

$1 of electricity at the US average residential rate is enough to recharge an AA Eneloop nearly 2300 times. Of course there are inefficiencies in the charger and charging, but if we can get even 75% efficiency, that's good enough for more than 1700 charges.

That really surprised me when I first learned it. I knew it wasn't going to be a lot... but 1700 charges is, I think, more than the number of times I'll swap out an AA battery over my entire lifetime. I hadn't expected that all my AA battery use for my whole life would be less than $1 worth of electricity.

chii|11 months ago

> It would be so insightful to understand to which degree the technology is subsidized and what the actual value/cost ratio is.

It would be insightful for competitors too, because they could use this data in their analysis and pricing strategies against you.

Therefore, no company would possibly allow such data to be revealed.

And in any case, if these LLM providers burn cash to provide a service to you, then you ought to take maximal advantage of it. Just like how Uber subsidized rides.

polytely|11 months ago

I feel like if they did this, the whole AI bubble would pop.

keyringlight|11 months ago

It's not just Apple integrating AI into the hardware. Microsoft has been part of a big push toward "AI PCs" with certain minimum capabilities (and I'm sure their partners don't mind selling new gear) and the Copilot button on keyboards, and certain Android models ship processors and memory capacities specifically for running AI.

rchaud|11 months ago

> It would be so insightful to understand to which degree the technology is subsidized and what the actual value/cost ratio is.

For whom would this be beneficial? The design goals of these products are to get as many users as fast as possible, using it for as long as possible. "Don't make me think" is the #1 UX principle at work here. You wouldn't expect a gas pump terminal to tut-tut about your carbon emissions.

kosh2|11 months ago

How much energy does it cost for a human to generate an image?

card_zero|11 months ago

You mean, how much extra energy, compared to what the human was going to do instead? It might be a negative amount. But that might be a bad thing, an artist could get fat.