
ed | 18 days ago

This paper argues that if superintelligence can give everyone the health of a 20-year-old, we should accept a 97% chance of superintelligence killing everyone in exchange for the 3% chance that the average human lifespan rises to 1400 years.


paulmooreparks|18 days ago

There is no "should" in the relevant section. It's making a mathematical model of the risks and benefits.

> Now consider a choice between never launching superintelligence or launching it immediately, where the latter carries an [x]% risk of immediate universal death. Developing superintelligence increases our life expectancy if and only if:

> [equation I can't seem to copy]

> In other words, under these conservative assumptions, developing superintelligence increases our remaining life expectancy provided that the probability of AI-induced annihilation is below 97%.
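The 97% figure falls out of a simple expected-value comparison. A minimal sketch, assuming a baseline remaining life expectancy of roughly 40 years (the paper's exact baseline isn't quoted in this thread; 40 is chosen so the numbers line up with the quoted 97%) against the 1400-year post-superintelligence lifespan:

```python
# Expected-lifespan comparison behind the quoted ~97% threshold.
# ASSUMPTION: baseline remaining life expectancy of ~40 years is a
# stand-in; only the 1400-year figure appears in the thread above.
baseline_years = 40
superintelligence_years = 1400

# Launching raises expected lifespan iff (1 - p) * 1400 > 40,
# i.e. iff p < 1 - 40/1400.
max_acceptable_risk = 1 - baseline_years / superintelligence_years
print(f"{max_acceptable_risk:.1%}")  # ~97.1%
```

So under these stand-in numbers, any extinction probability below about 97% still increases expected remaining lifespan, which is the claim the quote is making.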

shpx|18 days ago

A superintelligence that increased Nick Bostrom's life expectancy to a trillion years while killing everyone else would satisfy that math just as well. Why, in this fantasy, is he showing us the grace of letting us tag along with him into The Singularity, just because we happened to be alive at the same time? Is it because he needs someone to do the actual work?

wmf|18 days ago

That's what the paper says. Whether you would take that deal depends on your level of risk aversion (which the paper gets into later). As a wise man once said, death is so final. If we lose the game we don't get to play again.
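The risk-aversion point can be made concrete: under a concave utility of lifespan, the tolerable extinction risk drops well below 97%. A sketch using square-root utility, which is purely illustrative (the paper's actual risk-aversion model isn't quoted here), with the same assumed 40-year baseline:

```python
import math

# ASSUMPTION: 40-year baseline and sqrt utility are illustrative choices,
# not taken from the paper.
baseline_years = 40
superintelligence_years = 1400

# Risk-neutral: accept if p < 1 - 40/1400.
neutral_threshold = 1 - baseline_years / superintelligence_years

# Risk-averse (sqrt utility): accept if
# (1 - p) * sqrt(1400) > sqrt(40), i.e. p < 1 - sqrt(40/1400).
averse_threshold = 1 - math.sqrt(baseline_years / superintelligence_years)

print(f"risk-neutral: {neutral_threshold:.1%}")  # ~97.1%
print(f"risk-averse:  {averse_threshold:.1%}")   # ~83.1%
```

The more concave the utility (the more you weight the downside of losing the years you already have), the lower the acceptable risk, which is why one's level of risk aversion decides whether to take the deal.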

jibal|18 days ago

Everyone dies. And if your life expectancy is 1400 years, you still won't live for anywhere near 1400 years. OTOH, people with a 1400-year life expectancy are likely to be extremely risk-averse about anything that could conceivably threaten their lives ... and this would have consequences for blackmail, kidnapping, muggings, capital punishment, and other societal matters.