
sarbaz | 4 years ago

The article has some great examples of risks that were hyped out of proportion to their true severity. Should these things be considered? Yes. But only a little. If you halt programs completely for tail risks you'll never get anywhere.

Modern examples, IMO, are:

- Kessler Syndrome

- Trying to prevent an asteroid hitting the earth

- Nuclear war making the entire earth uninhabitable


yesenadam | 4 years ago

In the documentary The Man Who Saved The World, Stanislav Petrov travels to meet Kevin Costner, his favourite actor, at home. Costner asks him, if he hadn't acted as he did, how many people would have died. You expect him to say 50 million, or something. He says: everyone. Everyone on earth would've died. It's a chilling moment. (And there have been many more near misses than that one.) What kind of crazy species are we that we build a system that, when it malfunctions (as it did that day), seems likely to kill everyone on the planet?!

Now I read on HN that the danger of nuclear war is hyped out of proportion to its true severity, and should be considered only a little. Sorry, maybe I misunderstand. I have read a similar thing on HN a few times, though, from people who seem to think nuclear war really would be no big deal at all.

But it always seems weird to me how some people are worried to the point of obsession about global warming without apparently ever giving a thought to the ever-present risk of full-scale nuclear war, something infinitely worse. (Well, hardly "war", just a flurry of button-pressing for a few minutes.)

PeterisP | 4 years ago

Nuclear war (especially during the Cold War, when we had many more warheads than now) would absolutely be a big deal and a horrible mass death, but it would not have ended humanity.

Like, if there's a scale of catastrophic events that goes from 0 to 10, where 0 is no big deal and 10 is human extinction, then the worst events humanity has ever seen are somewhere below 1 on that scale, and absolutely horrific mass death is something like 2/10 - because the gap between the damage required for that and the damage required for extinction is so much larger than the gap between no big deal and worse mass death than we have ever seen. Arguably the worst damage that life on Earth has seen is the dinosaur-ending asteroid, and IMHO a fraction of homo sapiens (though perhaps not our civilization) could survive even that. A full-scale USSR-USA exchange in the 1960s might kill most people in the northern hemisphere and perhaps cause a nuclear winter, decreasing crop yields with an associated famine - but if just a fraction of people in South Asia, Africa, and South America survive the famine while the North nukes itself into radioactive glasslands, that's very, very far from extinction.

Killing half of humanity would be a literally unprecedented level of horror, but it would not end our civilization; killing 90% of humanity would likely end our civilization-as-we-know-it but not our species - that would bring us to the population level Earth had in the 1700s; and killing 99.99% of humanity would definitely destroy our civilization, but it would "just" push our population back to the numbers we had ~70,000 years ago - horrific for every individual, but still not an extinction event.
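
A rough back-of-the-envelope check of those survivor figures, assuming a present-day population of about 7.8 billion (that starting number is an assumption, not from the comment):

    # survivor counts for different kill fractions,
    # starting from an assumed present-day population of ~7.8 billion
    population = 7.8e9
    for kill_fraction in (0.5, 0.9, 0.9999):
        survivors = population * (1 - kill_fraction)
        print(f"{kill_fraction:.2%} killed -> ~{survivors:,.0f} survivors")

    # 50.00% killed -> ~3,900,000,000 survivors
    # 90.00% killed -> ~780,000,000 survivors (about the global population of the 1700s)
    # 99.99% killed -> ~780,000 survivors (on the order of estimates for ~70,000 years ago)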

mcswell | 4 years ago

Ronald Reagan agreed with that, and thought MAD was truly mad.

hn_throwaway_99 | 4 years ago

> If you halt programs completely for tail risks you'll never get anywhere.

If that "tail risk" though is "complete destruction of the planet", you only get to be wrong once.

JasonFruit | 4 years ago

Every action carries that risk somewhere in its very long tail. You have to assess the likelihood of the bad event occurring, and there is a point where it is so unlikely that it need not be considered at all. I don't think humanity is quite stupid enough to knowingly release its Ice-9 just yet.

eloff | 4 years ago

> If you halt programs completely for tail risks you'll never get anywhere.

If your tail risk is the end of civilization, then it doesn't matter how small the probability is. You'd be fucked with certainty on any long enough timeframe.

Some tail risks are too large to take. Eventually your number comes up.
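
The "eventually your number comes up" point is just compounding probability over repeated trials; a minimal sketch, where the 1%-per-year figure is purely illustrative and not an estimate of anything real:

    # chance of at least one catastrophic event over n independent years,
    # given a constant per-year probability p (illustrative value only)
    p = 0.01
    for years in (10, 50, 100, 500):
        at_least_once = 1 - (1 - p) ** years
        print(f"{years} years -> {at_least_once:.1%} chance of at least one event")

    # 10 years -> 9.6%, 50 years -> 39.5%, 100 years -> 63.4%, 500 years -> 99.3%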

tuatoru | 4 years ago

I think you mean "probability" rather than severity. The severity of setting the atmosphere on fire is extreme, but the probability turned out to be vanishingly small. Or perhaps you mean risk: probability times severity, which also turned out to be negligible for the events in the article.
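
A toy expected-value illustration of that "probability times severity" framing (every number here is invented purely for illustration):

    # risk as expected harm: probability * severity (all figures invented)
    def expected_harm(probability, severity):
        return probability * severity

    # a catastrophic but vanishingly unlikely event can carry less expected
    # harm than a much smaller event that is fairly likely
    print(f"{expected_harm(1e-9, 8e9):.1f}")  # 8.0   (everyone dies, one-in-a-billion chance per year)
    print(f"{expected_harm(1e-3, 1e5):.1f}")  # 100.0 (100,000 die, one-in-a-thousand chance per year)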

Some other hyped risks:

- Artificial General Intelligence

- CRISPR gene editing

- Gain-of-function work with viruses

I have no way of assessing the risks, and there is a lot of hyperventilation in some circles.

efitz | 4 years ago

Creating a black hole with the LHC that consumes the earth.