origin_path|3 years ago
These two things are in conflict. We could ignore both asteroids and climate change and, according to the best-known science, there'd be very little impact for vast timespans, and possibly no impact ever (before humanity is ended by something else, like war).
Yes, also for the climate. Look at the actual predictions: it's something like a small reduction in GDP growth spread over a very long period of time, and that's assuming the predictions are correct, when they have a long track record of not being so.
Really, stuff like asteroids and climate is a good counter-argument to caring about AI risk. Intellectuals like to hypothesize world-ending cataclysms that only their far-sighted expertise can prevent, but whenever these people's predictions are tested against something concrete they seem to invariably end up being wrong. Our society rewards catastrophising far too generously and penalises being wrong far too little, especially for academics, NGOs, etc. It makes people feel or seem smart in the moment, and they can punt the reputational damage from being wrong far into the future (and then pretend they never made those predictions at all, or that there were mitigating factors).
naasking|3 years ago
That's just incorrect. The Tunguska event was a nuclear-weapon-scale asteroid airburst. Such events are predicted to happen once every hundred years or so. If one happened over a populated city, millions would die. If a nation wrongly concluded it was a surprise nuclear attack, maybe everyone would die, to say nothing of the real risk that the asteroid itself could be big enough to wipe us all out.
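The "once every hundred years" figure can be turned into concrete odds. A minimal sketch, modelling impacts as a Poisson process; the 1/100-per-year rate comes from the comment above, while the urban-land fraction is an illustrative assumption, not a measured figure:

```python
import math

# Assumed average rate of Tunguska-scale airbursts, per year
# (the comment above cites roughly one per hundred years).
RATE_PER_YEAR = 1 / 100

def p_at_least_one(years, rate=RATE_PER_YEAR):
    """Probability of >= 1 event in a window, assuming Poisson arrivals."""
    return 1 - math.exp(-rate * years)

# Hypothetical fraction of Earth's surface that is densely populated
# (illustrative assumption only).
URBAN_FRACTION = 0.01

p_century = p_at_least_one(100)
p_century_over_city = p_at_least_one(100, RATE_PER_YEAR * URBAN_FRACTION)

print(f"P(>=1 event in 100 years):             {p_century:.2f}")
print(f"P(>=1 event over a dense area, 100 y): {p_century_over_city:.4f}")
```

Under these assumptions the chance of at least one event per century is high (about 0.63), but the chance of one landing on a densely populated area is far smaller, which is roughly the shape of both sides of this argument.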
There's a lot of uncertainty around climate change, but changing climate patterns will certainly change resource allocations (fresh water, arable land, etc.). This will lead to shortages in places where those resources were once abundant, which could easily lead to wars in which millions die.
> Really stuff like asteroids and climate is a good counter-argument to caring about AI risk.
This boggles my mind. Long-tail risks exist, and burying your head in the sand and pretending they don't just places millions of lives at risk, and potentially the entire human race. You don't have to think these are top priorities, but to dismiss them as completely unimportant is frankly bonkers.
origin_path|3 years ago
Which had very little impact on humanity because it exploded over the middle of the Siberian taiga.
> These are predicted to happen once every hundred years or so
What is predicted exactly, by whom, and how were these predictions validated against testable reality, given the postulated rareness? If they're so common, then why is it so hard to name the last 10? I think in reality these events are very rare and will almost always happen over the oceans, deserts, poles, etc., where not many people live.
> Long-tail risks exist, and burying your head in the sand and pretending they don't
They exist, and I am not pretending they don't. I am saying that this style of reasoning, in which an extremely unlikely event is unfalsifiably and arbitrarily assigned near-infinite downsides in order to justify spending time and resources on it, is problematic, and that as a society we are far too generous towards people who do this.
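The reasoning pattern being criticised can be made explicit with expected-loss arithmetic: hold the small probability fixed, and the conclusion is driven entirely by the unfalsifiable damage estimate. A minimal sketch; every number here is an illustrative assumption, not a figure from the thread:

```python
# Expected-loss arithmetic behind "tiny probability x huge downside" arguments.

p = 1e-9  # assumed, arbitrarily small probability of the catastrophe

# Ever-larger assumed downsides (in arbitrary "damage" units).
for damage in (1e6, 1e12, 1e18, 1e24):
    expected_loss = p * damage
    print(f"damage={damage:.0e} -> expected loss={expected_loss:.0e}")

# However small p is, the expected loss can be made arbitrarily large by
# inflating the damage term -- which is exactly the input that cannot be
# tested against reality.
```

This is why the argument above targets the damage estimate rather than the probability: the product is unbounded in the one factor that is never validated.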