The argument I've heard (and unfortunately I can't find offhand where I heard it) is basically that rather than having new computers made, with all the vast energy usage that entails (mining and refining metals, computing power to design new hardware, factory assembly, electronics manufacturing, packaging, shipping, and all the pollution from those processes), it's far less harmful to the environment to just keep using what you have for as long as you can. The impact of continuing to use an old computer is even smaller when your electricity comes from a renewable source like solar or hydroelectric.
Nasrudith|7 days ago
It hasn't been true for servers either, as reflected by the resale price of old server hardware. It turns out power consumption over a long time frame dominates the manufacturing costs. From what I've seen, the argument is just bad math and bad assumptions all the way down at best. At worst it's willful ignorance in service of validating their assumptions regardless of the truth.
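A rough back-of-envelope sketch of the server case (every number below is my own illustrative assumption, not a measured figure):

    # Why replacing an always-on server can pay for itself in energy.
    # All figures here are assumed for illustration only.
    embodied_kwh = 1500               # assumed energy to manufacture a new server
    old_draw_w, new_draw_w = 400, 150 # assumed average draw, old vs. new hardware
    hours = 24 * 365 * 5              # five years of 24/7 operation
    old_kwh = old_draw_w * hours / 1000   # ~17,500 kWh
    new_kwh = new_draw_w * hours / 1000   # ~6,600 kWh
    savings = old_kwh - new_kwh           # ~11,000 kWh saved by replacing
    print(savings > embodied_kwh)         # True: savings dwarf embodied energy

Under those assumptions the operational savings exceed the manufacturing energy several times over, which fits with old server hardware losing resale value quickly.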
amatecha|7 days ago
Yeah, servers are far more power-efficient than they used to be, but that's not really what millions of households worldwide are constantly buying.
Since I didn't have any links/quotes initially, I figured I should spend the time to dig some up:
https://doi.org/10.1016/j.jclepro.2011.03.004 > The manufacturing phase represents 62–70% of total primary energy of manufacturing and operation.
https://web.mit.edu/2.813/www/readings/Williams%20-%20Energy... > life cycle energy use of a computer is dominated by production (81%) as opposed to operation (19%).
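Running the same kind of back-of-envelope arithmetic for a lightly used home machine shows how the split flips (again, my numbers are illustrative assumptions, not taken from either paper):

    # Lifecycle energy split for a typical home computer.
    # All figures here are assumed for illustration only.
    embodied_kwh = 1500            # assumed production energy, order of magnitude
    draw_w = 60                    # assumed average draw during use
    hours = 4 * 365 * 5            # ~4 hours/day over five years
    operation_kwh = draw_w * hours / 1000            # ~438 kWh
    production_share = embodied_kwh / (embodied_kwh + operation_kwh)
    print(f"{production_share:.0%}")                 # ~77%: production dominates

At a few hours of use per day, production comes out around three quarters of lifecycle energy, in the same ballpark as the 62–70% and 81% figures above; it takes something like heavy 24/7 use to flip that ratio.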