
jtaillon | 2 years ago

What a strange take. It reads as "nothing is worth improving on if it cannot 100% completely replace the thing that we're trying to get rid of". If we can reduce our dependence (say 50%) on a power source that has huge negative externalities, isn't that a worthwhile effort?


outworlder | 2 years ago

I've seen this kind of thinking very often.

Like the controversy when incandescent traffic light bulbs were phased out in favor of LEDs. In some places where it gets very cold, the LED signals got obstructed by ice. Incandescent bulbs were so inefficient and generated so much heat that they never had that "problem". So, because LEDs couldn't handle the few days a year when icing actually occurred (just because it's cold doesn't mean ice will form), people were arguing that they were a bad idea.

It didn't matter that for 300+ days a year they would save a lot of energy. It didn't matter that they were better (and in most locations icing would never even be a problem in the first place). It mattered that they were different and had different issues that needed to be solved. In those places, a different design (or adding a heating element that only activates in low temperatures) would solve the remaining problem.

I've seen this sort of thinking in corporate settings too. A vastly inferior solution stays in place because a new solution doesn't solve all problems people can possibly think of - even when the existing solution doesn't solve them either!

It boggles the mind.

oblio | 2 years ago

Technology Connections: https://www.youtube.com/watch?v=GiYO1TObNz8

What's worse is that he points out you could have a lighting system with integrated defroster that only gets turned on when it's super cold, and you're STILL using less energy overall.
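A rough back-of-envelope check makes the point concrete. The wattages and cold-day count below are illustrative assumptions, not figures from the video: a typical incandescent signal lamp around 100 W versus an LED head around 15 W, plus a hypothetical 50 W defroster that runs only on 20 cold days per year.

```python
# Back-of-envelope annual energy comparison for one traffic signal head.
# All wattages and the cold-day count are assumed, illustrative values.

HOURS_PER_DAY = 24      # a signal head is lit around the clock
DAYS_PER_YEAR = 365

INCANDESCENT_W = 100    # assumed incandescent lamp power
LED_W = 15              # assumed LED lamp power
DEFROSTER_W = 50        # assumed heating element power
COLD_DAYS = 20          # assumed days per year the defroster runs

incandescent_kwh = INCANDESCENT_W * HOURS_PER_DAY * DAYS_PER_YEAR / 1000
led_kwh = (LED_W * HOURS_PER_DAY * DAYS_PER_YEAR
           + DEFROSTER_W * HOURS_PER_DAY * COLD_DAYS) / 1000

print(f"incandescent:    {incandescent_kwh:.0f} kWh/yr")  # → 876 kWh/yr
print(f"LED + defroster: {led_kwh:.0f} kWh/yr")           # → 155 kWh/yr
```

Under these assumed numbers, the LED head with a heater still uses well under a fifth of the incandescent's energy, which is the "STILL using less energy overall" claim.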