top | item 40532855


akarlsten | 1 year ago

I have never been able to understand the argument about the supposed high water use - the water doesn't magically cease to exist after it's been used to cool a datacenter. You put freshwater in and get the same, but warmer, freshwater out. Probably doesn't require much (if any) in the way of treatment to become potable again.

Am I missing something or is it a bit of a disingenuous argument?


bell-cot | 1 year ago

I'm not actually familiar with current DC cooling equipment... but I suspect they use a lot of forced-draft, evaporative cooling towers ( https://en.wikipedia.org/wiki/Cooling_tower ). That is far more efficient than "dry" cooling (so long as the outdoor humidity isn't too high). But the waste water from it (what hasn't evaporated) has too high a mineral content to be re-used, unless you ran it through a desalination plant, which would get very expensive.
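The mechanism above is why evaporative towers genuinely "consume" water: the evaporated fraction leaves as vapor rather than returning to the drain. A rough back-of-the-envelope sketch of the scale involved (the latent-heat figure is standard physics; the 10 MW load is a made-up example, not from this thread):

```python
# Rough illustration: evaporative cooling rejects heat mainly via the
# latent heat of vaporization of water (~2.45 MJ/kg near ambient temps).
# The evaporated water is lost to the atmosphere, not returned.

LATENT_HEAT_J_PER_KG = 2.45e6  # approximate latent heat of vaporization

def evaporation_rate_kg_per_s(heat_load_watts: float) -> float:
    """Water evaporated per second if all heat is rejected evaporatively."""
    return heat_load_watts / LATENT_HEAT_J_PER_KG

# A hypothetical 10 MW data center, cooled entirely by evaporation:
rate = evaporation_rate_kg_per_s(10e6)    # ~4.1 kg/s
per_day_m3 = rate * 86_400 / 1000         # ~350 m^3 per day
print(f"{rate:.2f} kg/s, {per_day_m3:.0f} m^3/day")
```

Real facilities only reject part of their heat this way, and also discharge some liquid "blowdown" to keep minerals from concentrating, so actual figures vary widely; this just shows the order of magnitude.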

Ukv | 1 year ago

To my understanding it's common to use evaporative cooling, or to discharge into a natural body of water, after which the water would require full treatment again before reuse.

Though ultimately, if a data center is carbon-negative and water-positive (which, to be clear, is not generally the case yet, but there is progress), I think raw energy/water usage numbers are less relevant.

sparky_z | 1 year ago

Is the water generally being recaptured in this way or is it just being sent "down the drain"? Serious question, I have no idea.

d1sxeyes | 1 year ago

I think in a lot of cases there are already solutions for recovering at least the heat from the water as energy. I'm not sure about the water itself, but I think OP's point is that there's not really such a thing as "waste water": it'll just go back into the normal water cycle, either being pumped out into a nearby river or evaporated up into the air.

I think the key concern is the wasted energy, as a lot of energy is used to clean the water prior to it being used in the data centre.