top | item 46999494


paulmooreparks | 18 days ago

There is no "should" in the relevant section. It's making a mathematical model of the risks and benefits.

> Now consider a choice between never launching superintelligence or launching it immediately, where the latter carries a p% risk of immediate universal death. Developing superintelligence increases our life expectancy if and only if:

> [equation I can't seem to copy]

> In other words, under these conservative assumptions, developing superintelligence increases our remaining life expectancy provided that the probability of AI-induced annihilation is below 97%.
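The missing inequality can be sketched as a simple expected-value comparison: launching is worthwhile when (1 − p) times post-superintelligence life expectancy exceeds baseline life expectancy. The lifespan figures below (30 remaining years without superintelligence, 1,000 years with it) are illustrative assumptions chosen to reproduce the 97% figure, not necessarily the paper's own numbers:

```python
# Back-of-envelope version of the expected-value comparison quoted above.
# Condition for launching to increase expected remaining life-years:
#   (1 - p) * super_years > baseline_years
#   <=>  p < 1 - baseline_years / super_years

def breakeven_risk(baseline_years: float, super_years: float) -> float:
    """Largest annihilation probability p at which launching
    superintelligence still increases expected remaining life-years."""
    return 1 - baseline_years / super_years

# Illustrative (assumed) inputs: 30 years baseline, 1000 years after launch.
print(breakeven_risk(30, 1000))  # 0.97
```

With those assumed inputs the break-even annihilation risk comes out to exactly 97%, matching the quoted conclusion; different lifespan assumptions would shift the threshold.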


shpx | 17 days ago

Mr. Superintelligence increasing Nick Bostrom's life expectancy to a trillion years while killing everyone else would satisfy that condition as well. Why, in this fantasy, is he showing us the grace of letting us tag along with him into The Singularity just because we happened to be alive at the same time? Is it because he needs someone to do the actual work?