top | item 44156487

tamat | 9 months ago

As a Software Engineer I found it hard to grasp the concepts explained here.

First it says we lose electrons by deleting information. But AFAIK we are losing electrons everywhere; most gates operate on the negation of a current, which I understand is what they refer to as losing electrons. So, are all gates bad now?

Also, why would keeping a history of all memory changes prevent losing heat? You would have to keep all that memory powered, so...

And finally, why would this be useful? Who needs to go back in time in their computations?

thrance|9 months ago

Theoretically, a computer that never forgets anything can run without consuming any power (and thus without ever heating up). Such a computer is called reversible (or adiabatic), because it requires all of its gates to be reversible (i.e. any computation can be undone). You would still need to expend energy to set the initial state (input) and copy out the result (output).

Obviously, in real life, most power consumed by computers is lost by wire resistance, not through "forgetting" memory in logic gates. You would need superconducting wires and gates to build an actually reversible CPU.

Also, you would need to "uncompute" the result of a computation to bring back your reversible computer from its result back to its initial state, which may be problematic. Or you can expend energy to erase the state.

Quantum computers are reversible computers, if you want a real-life example: quantum logic gates are unitary, so every one of them can be inverted.
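To make "reversible" concrete, here is a toy sketch (my own illustration, not from any of these comments) of the Toffoli gate, a classic reversible gate that can compute AND without destroying information:

```python
# Toffoli (controlled-controlled-NOT): flips the target bit c only when
# both control bits a and b are 1. It is its own inverse.

def toffoli(a, b, c):
    """Reversible map: (a, b, c) -> (a, b, c XOR (a AND b))."""
    return a, b, c ^ (a & b)

# Applying the gate twice restores the original state on every input.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert toffoli(*toffoli(a, b, c)) == (a, b, c)

# With c = 0 the third output is (a AND b): AND computed reversibly,
# at the cost of carrying the inputs along in the output.
print(toffoli(1, 1, 0))  # (1, 1, 1)
```

The extra "carried along" bits are exactly the history that a reversible computer has to keep (or later uncompute), which connects back to the original question about storing memory changes.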

tamat|9 months ago

Thanks for your explanation

HPsquared|9 months ago

It's a thermodynamics thing. Reversible processes are the most efficient because they generate no entropy, and deleting information makes a process irreversible. This is an entirely theoretical thing: there are theoretical lower limits on the energy cost of computation based on this, but actual computers are nowhere near those limits, at all.

Edit: and yes, most of the logical operations in a regular chip like AND, OR, NAND etc are irreversible (in isolation, anyway)
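A quick check of that last point (my illustration, not from the comment): AND is irreversible in isolation because three distinct inputs all map to output 0, so the inputs cannot be recovered from the output alone.

```python
# Group the four possible (a, b) inputs of AND by their output bit.
from itertools import product

preimages = {}
for a, b in product((0, 1), repeat=2):
    preimages.setdefault(a & b, []).append((a, b))

print(preimages)
# {0: [(0, 0), (0, 1), (1, 0)], 1: [(1, 1)]}
# Output 0 has three preimages, so one bit of information is destroyed.
```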

rnhmjoj|9 months ago

> but actual computers are nowhere near these theoretical limits, at all.

The Landauer limit at ambient temperature gives something on the order of 10⁻²¹ J to irreversibly flip a bit, while, if I read this paper[1] correctly, current transistors are around 10⁻¹⁵ J per switching event. So, definitely not coming to AI "soon".

[1]: https://arxiv.org/pdf/2312.08595
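A back-of-envelope check of those numbers (my own arithmetic; the 10⁻¹⁵ J figure is the rough per-switch value cited above):

```python
# Landauer limit: minimum energy to erase one bit is k_B * T * ln(2).
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # ambient temperature, K

landauer = k_B * T * math.log(2)
print(f"{landauer:.2e} J")   # 2.87e-21 J

transistor = 1e-15           # rough per-switch energy today, J
print(f"gap: {transistor / landauer:.1e}x")  # roughly 3.5e5, i.e. 5+ orders of magnitude
```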

tamat|9 months ago

thanks for your reply

naasking|9 months ago

> Also, why keeping a history of all memory changes will prevent losing heat?

How much power does persistent storage (a hard drive, an SSD) require to preserve its stored data? Zero, which is why it emits zero heat while doing so.

> Who needs to go back in time in their computations??

At its most basic level, erasing/overwriting data requires energy, and this generates a lot of heat. Heat dissipation is a major obstacle to scaling chips down even further. If you can design a computer that doesn't need to erase nearly as much data, you generate orders of magnitude less heat, which opens up more room for scaling and considerable power savings.