Technically it is inefficiency. The electricity should be doing computer things; heat is wasted electricity. It's just that there's not much the data centre can do about it.
Even if the computer does perfectly efficient computer things with every joule, every single one of those joules ends up as a joule of waste heat.
If you pull 100W of power out of an electric socket, you are heating your environment at 100W of power completely independent of what you use that electricity for.
Only true for our current computers and not true with reversible computing.
With reversible computing you can use electricity to perform a calculation and then "push" that electricity back into a battery or a capacitor instead of dumping it to the environment.
It's still a huge challenge, but there is a recent promising attempt:
"British reversible computing startup Vaire has demonstrated an adiabatic reversible computing system with net energy recovery"
https://www.eetimes.com/vaire-demos-energy-recovery-with-rev...
https://vaire.co/
Short introduction video to reversible computing:
https://www.youtube.com/watch?v=rVmZTGeIwnc
I read it as: the inefficient part isn't the computation itself, it's dumping all the resulting heat into the environment without capturing it and using it to do work or generate electricity.
On a related/side note: when there's talk about SETI and Dyson spheres, and detecting them via their infrared waste heat, I also don't understand that. Such an alien civilization is seemingly capable of building massive space structures/projects, but then lets the waste heat just pour out into the universe in such insane quantities that we could see it tens or hundreds of light years away? What a waste. Why wouldn't they recover that heat and make use of it instead? And repeat the recovery until the final waste output is too small to bother with, at which point we would no longer be able to detect it.
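The catch is that any structure intercepting a star's whole output still has to radiate that power away at some temperature. A back-of-the-envelope Stefan-Boltzmann estimate (assuming a Sun-like star and a shell at roughly 1 AU, radiating only from its outer surface) puts that glow in the mid-infrared:

    import math

    # Rough equilibrium temperature of a shell that absorbs a Sun-like star's
    # output and re-radiates it outward from a sphere at ~1 AU. Toy numbers.
    L_STAR = 3.8e26      # W, roughly the Sun's luminosity
    R_SHELL = 1.5e11     # m, ~1 AU (assumed shell radius)
    SIGMA = 5.67e-8      # W m^-2 K^-4, Stefan-Boltzmann constant
    WIEN_B = 2.898e-3    # m*K, Wien displacement constant

    flux = L_STAR / (4 * math.pi * R_SHELL**2)  # W/m^2 leaving the outer surface
    T = (flux / SIGMA) ** 0.25                  # K, radiating temperature
    peak = WIEN_B / T                           # m, peak emission wavelength

    print(f"radiating at ~{T:.0f} K, peak emission ~{peak*1e6:.0f} micrometres")

Radiating the same power at a lower, harder-to-spot temperature requires a much larger radiating surface (radiated power scales with area times T^4), and the second law stops you from recycling the heat indefinitely, so some infrared signature is unavoidable.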
If I turn my fan on and 100% of the electricity is converted to heat, where does the kinetic energy of moving fan blades come from? Even the Trump administration cannot just repeal the law of conservation of energy.
There's a minimum amount of energy that has to be consumed (and thus heat produced) by any computation, just because of physics (Landauer's principle):
https://en.wikipedia.org/wiki/Landauer's_principle
However, modern computers generate billions of times more heat than this minimum.
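For scale, a rough comparison using assumed (illustrative, not measured) chip figures:

    import math

    K_B = 1.380649e-23                        # J/K, Boltzmann constant
    T = 300                                   # K, roughly room temperature
    landauer_per_bit = K_B * T * math.log(2)  # minimum energy to erase one bit

    # Illustrative (assumed) figures for a modern chip, not measurements:
    chip_power = 100            # W
    ops_per_second = 1e10       # simple operations per second
    bits_erased_per_op = 64     # assume one 64-bit word overwritten per op

    actual = chip_power / ops_per_second           # J per operation, as built
    floor = bits_erased_per_op * landauer_per_bit  # Landauer floor per operation

    print(f"actual: {actual:.1e} J/op")
    print(f"floor:  {floor:.1e} J/op")
    print(f"ratio:  ~{actual / floor:.0e}")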
Heat is not by itself waste. It's what electricity turns into after it's done doing computer things. Efficiency is a separate question - how many computer things you got done per unit electricity turned into heat.
How many computer things you got done per unit of electricity, and how many mechanical things you can do with the temperature gradient between the computer and its heat sink.
For example, kinda wasteful to cook eggs with new electrons when you could use the computer heat to help you denature those proteins. Or just put the heat in human living spaces.
(Putting aside how practical that actually is... Which it isn't)
No it's not. It would be waste only if there were a high temperature gradient, which is minimized in mining operations through proper cooling.
It's that computation requires electricity. And almost all of the heat in bitcoin mining comes from the computation itself - technically, from changing transistor states.
I think what they mean is that there is no Carnot engine hooked up between the heat source and the sink, which is theoretically something the data center could do something about.
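Though with the temperature differences involved, even an ideal engine would not give back much. A quick estimate with assumed temperatures (warm coolant at ~60 °C against a ~25 °C environment):

    # Maximum fraction of the waste heat a perfect (Carnot) engine could turn
    # back into work, for assumed hot/cold temperatures.
    T_HOT = 60 + 273.15    # K, assumed coolant temperature leaving the racks
    T_COLD = 25 + 273.15   # K, assumed ambient temperature

    carnot_efficiency = 1 - T_COLD / T_HOT
    print(f"Carnot limit: {carnot_efficiency:.1%} of the heat could become work, at best")

Real engines at such a small temperature difference recover far less than that, which is why waste-heat reuse usually means district heating rather than generating electricity.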
Yes it ends up as heat, but with some forethought, it could be used to eg heat people's homes rather than as waste.