
Why Waste Servers' Heat?

10 points | pham | 14 years ago | hardware.slashdot.org

4 comments

[+] phamilton | 14 years ago
The more compelling discussion to me is why not use open-air cooling. I worked in an HPC environment when it was in the single digits outside, and our cold aisle was in the low 70s °F. (The hot aisle was ~100 °F, which felt pretty nice on a cold day.)
[+] jwilliams | 14 years ago
Intel has been looking at this for a while and has had quite a few successful experiments (the OP is probably aware of this). I think the stumbling block is the HDDs, which have a much higher failure rate when the temperature varies widely.
[+] pinko | 14 years ago
Our data center, which is powered by gas turbines, uses its waste heat to heat nearby buildings in the winter, and to cool the data center in the summer (via absorption chillers).
[+] 4J7z0Fgt63dTZbs | 14 years ago
For some reason I'm desperate to see this complete the circle in a closed ecosystem that involves "in-house farms" and an "air/water purification system" and all those futuristic concepts.