top | item 46361304

dangalf | 2 months ago

Technically it is inefficiency. The electricity should be doing computer things; heat is wasted electricity. It's just that there's not much the data centre can do about it.

stouset|2 months ago

Even if the computer does perfectly-efficient computer things with every Joule, every single one of those Joules ends up as one Joule of waste heat.

If you pull 100W of power out of an electric socket, you are heating your environment at 100W of power completely independent of what you use that electricity for.

spyder|2 months ago

Only true for our current computers and not true with reversible computing. With reversible computing you can use electricity to perform a calculation and then "push" that electricity back into a battery or a capacitor instead of dumping it to the environment. It's still a huge challenge, but there is a recent promising attempt:

"British reversible computing startup Vaire has demonstrated an adiabatic reversible computing system with net energy recovery"

https://www.eetimes.com/vaire-demos-energy-recovery-with-rev...

https://vaire.co/

Short introduction video to reversible computing:

https://www.youtube.com/watch?v=rVmZTGeIwnc
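
The key idea is logical reversibility: if no information is ever destroyed, Landauer's principle imposes no minimum dissipation. A minimal sketch of that property (this illustrates the logic only, not Vaire's adiabatic circuitry) using the Toffoli gate, a universal reversible gate:

```python
def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: flips c iff a and b are both 1.
    Reversible: it is its own inverse, so applying it twice
    restores the original input and no information is erased."""
    return a, b, c ^ (a & b)

# Round-trip check over one input state
state = (1, 1, 0)
assert toffoli(*toffoli(*state)) == state  # nothing erased, no Landauer cost
```

Because the input is always recoverable from the output, in principle the energy used to switch the gate can be recovered too, rather than dumped as heat.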

thegrim000|2 months ago

I read it as: the inefficiency isn't in the compute itself; it's in dumping all the resulting heat into the environment without capturing it to generate electricity or do other work.

On a related side note: when there's talk about SETI and Dyson spheres, and detecting them via infrared waste heat, I also don't understand that. Such an alien civilization is seemingly capable of building massive space structures, but then lets the waste heat pour out into the universe in such insane quantities that we could see it tens or hundreds of light years away? What a waste. Why wouldn't they recover that heat and make use of it, and repeat the recovery until the final waste output is too small to bother with, at which point we would no longer be able to detect it?

tasuki|2 months ago

> every single one of those Joules ends up as one Joule of waste heat.

Yes, it ends up as heat, but with some forethought it could be used to, e.g., heat people's homes rather than be wasted.

robkop|2 months ago

Interesting question - how much will end up as sound, or in the ever smaller tail of things like storing a bit in flash memory?

csomar|2 months ago

Theoretically, if your computation were perfectly energy-efficient, you wouldn't need any electricity at all, since the computation itself costs zero energy.

anthonj|2 months ago

This violates energy conservation principles. Some power will be "wasted" as heat; some will be used for other work.

usrnm|2 months ago

If I turn my fan on and 100% of the electricity is converted to heat, where does the kinetic energy of moving fan blades come from? Even the Trump administration cannot just repeal the law of conservation of energy.

mr_toad|2 months ago

There’s a minimum amount of energy (and thus heat) that any irreversible computation must dissipate, just because of physics. However, modern computers dissipate billions of times more than this minimum.

https://en.wikipedia.org/wiki/Landauer's_principle
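
The gap is easy to quantify. A rough sketch, assuming an illustrative 10 pJ per bit operation for a modern chip (the real figure varies widely by process and by operation):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI)
T = 300.0            # room temperature, K

# Landauer limit: minimum energy to erase one bit of information
landauer_j = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {landauer_j:.3e} J/bit")  # ~2.87e-21 J

# Assumed figure: ~10 pJ per logical bit operation on a modern chip
# (order-of-magnitude illustration only, not a measured value)
cpu_j_per_op = 1e-11
print(f"Overhead factor: {cpu_j_per_op / landauer_j:.1e}")  # roughly 3.5e9
```

Under that assumption the overhead works out to a few billion, which is where the "billions of times more" figure comes from.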

RhysU|2 months ago

It'd be super fun to take that as an axiom of physics and see how far upward one could build from it. Far above my skills, though.

geoffschmidt|2 months ago

Heat is not by itself waste. It's what electricity turns into after it's done doing computer things. Efficiency is a separate question - how many computer things you got done per unit electricity turned into heat.

anon84873628|2 months ago

How many computer things you got done per unit electricity, and how much mechanical work you get from the temperature gradient between the computer and its heat sink.

For example, kinda wasteful to cook eggs with new electrons when you could use the computer heat to help you denature those proteins. Or just put the heat in human living spaces.

(Putting aside how practical that actually is... Which it isn't)

YetAnotherNick|2 months ago

No, it's not. It would be waste only if there were a high temperature gradient, which is minimized in mining operations through proper cooling.

It's that computation requires electricity, and almost all of the heat in bitcoin mining comes from computation: technically, from changing transistor states.

anon84873628|2 months ago

I think what they mean is that there is no Carnot engine hooked up between the heat source and the sink. Which is, theoretically, something the data centre could do something about.
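
Even an ideal engine wouldn't recover much, because the gradient is small. A sketch with assumed temperatures (an 80 °C chip/coolant loop against a 20 °C ambient sink; both are illustrative, not measured):

```python
# Carnot limit on work recoverable from data-centre waste heat
T_hot = 353.15   # assumed hot side: 80 C chip/coolant, in kelvin
T_cold = 293.15  # assumed cold side: 20 C ambient sink, in kelvin

eta = 1.0 - T_cold / T_hot   # Carnot efficiency, the thermodynamic ceiling
print(f"Carnot efficiency: {eta:.1%}")  # ~17%
```

So even a perfect heat engine would turn at most about a sixth of the waste heat back into work at these temperatures, which is why direct reuse (e.g. district heating) tends to look more attractive than electricity recovery.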

charcircuit|2 months ago

The electricity is doing computer things, building bitcoin blocks.

sixtyj|2 months ago

They could make a second floor with eggs and newborn chickens. /s