insky | 11 years ago
A decade back I remember every office in the building I worked in (in the UK) leaving their machines on pretty much 24 hours a day. The building manager suggested cutting the electricity between certain hours, and people protested.
Phone chargers, desktops and monitors were all on, probably for the 16 hours of each day when no one was there. Even when people were there the machines were barely busy.
Computers could have been suspended to RAM or turned off. I still see this even in offices with modern PCs that can boot quickly and suspend to RAM with ease (it used to be touch and go).
Could spare CPU cycles be farmed out?
I'm glad to see better power-saving settings in newer CPUs and operating systems, but this needs to be better. Some mobiles can hold a charge for a week even with voice calls; other smartphones get charged daily, some with negligible use. And charging batteries takes more power than you get back out.
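To put a rough number on that last point: lithium batteries lose some energy as heat on every charge/discharge cycle. A quick back-of-the-envelope sketch, using an assumed ~85% round-trip efficiency and a typical ~11 Wh phone battery (both illustrative figures, not measurements):

```python
# Rough illustration of charging losses. The 85% round-trip
# efficiency and 11 Wh capacity are assumed ballpark figures.
battery_capacity_wh = 11.0        # typical smartphone battery
round_trip_efficiency = 0.85      # assumed charge/discharge efficiency

energy_from_wall_wh = battery_capacity_wh / round_trip_efficiency
loss_wh = energy_from_wall_wh - battery_capacity_wh
print(f"Wall energy per full charge: {energy_from_wall_wh:.1f} Wh")
print(f"Lost as heat: {loss_wh:.1f} Wh")
```

So roughly 13 Wh in for 11 Wh out - small per device, but it multiplies across millions of daily charges.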
Lots of small energy efficiencies could really help. Stuff like giving screen savers the boot, having sensible power defaults in OSs, etc.
brc | 11 years ago
It was fashionable a few years back to turn off your devices 'at the wall' to stop so-called 'vampire use' - TVs and other equipment on standby. It sounded plausible, but in reality it's just noise in the overall consumption picture, and window dressing to make people feel like they're doing something.
I'll agree that idling PCs and monitors should go into sleep mode like laptops do, but that stuff is not going to make the slightest dent in consumption anyway. Compare the power consumption of an electronic device with something like an iron or a stove or a clothes dryer and you'll see why. And that's before you start looking at heavy industry and large-scale building temperature control.
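brc's comparison is easy to sanity-check with rough arithmetic. The wattages below are assumed typical figures (a TV drawing ~1 W on standby, a ~2500 W clothes dryer), not measurements:

```python
# Illustrative annual-consumption comparison; wattages are
# assumed typical figures, not measured data.
standby_tv_w = 1.0      # modern TV on standby
dryer_w = 2500.0        # clothes dryer while running

standby_kwh = standby_tv_w * 24 * 365 / 1000   # on standby all year
dryer_kwh = dryer_w * 150 / 1000               # ~150 hours of use a year
print(f"TV on standby all year:   {standby_kwh:.1f} kWh")
print(f"Dryer, 150 hours a year:  {dryer_kwh:.1f} kWh")
```

Under these assumptions the dryer uses around 40x more energy in 150 hours than the TV does on standby all year, which is why the 'at the wall' habit barely registers.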
insky | 11 years ago
Even with our paltry electric use we are still getting high bills (UK)! Expense is the biggest incentive for us to get our electric use down.
If server farms have become a bigger polluter than the aviation industry, I see that as a challenge: use the hardware as efficiently as possible. Caching layers could hugely reduce CPU use. Perhaps we could measure an app's power consumption as well as its bandwidth use?
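The caching idea can be sketched in a few lines with Python's `functools.lru_cache`. The `render_page` function here is hypothetical, standing in for any expensive computation:

```python
from functools import lru_cache

call_count = 0  # tracks how often the expensive work actually runs

@lru_cache(maxsize=1024)
def render_page(page_id: int) -> str:
    """Hypothetical expensive render; the cache avoids recomputing it."""
    global call_count
    call_count += 1
    return f"<html>page {page_id}</html>"

# 1000 rounds of requests for 3 distinct pages: the expensive
# work runs only 3 times, the rest are cache hits.
for _ in range(1000):
    for pid in (1, 2, 3):
        render_page(pid)
print(call_count)  # 3
```

The same principle applies at any layer - HTTP caches, memoised queries, CDN edges: every cache hit is CPU time (and therefore power) not spent.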