exratione | 8 years ago

Some humans are likely to become more powerful than other humans someday. Most such humans will, by default, develop instrumental subgoals that conflict with other human interests. This could have catastrophic consequences. If we don't actively work on control mechanisms and the safety of human behavior, this will most likely pose an existential risk to humanity.

Compare and contrast.

I think people make too much of the wrong things in the matter of general artificial intelligence.

discuss

tomsthumb | 8 years ago

> Some humans are likely to become more powerful than other humans some day. Most such humans will by default develop instrumental subgoals that conflict with other human interests.

Isn't this already true?

nske | 8 years ago

It has been true multiple times throughout history. Each time, some negative feedback kicks in, changing the rules of the game enough for a certain balance to exist. If anything, this balance seems to improve each time, although it might be too early to tell. However, it does seem to me that the difference in power among humans (no matter how power is defined) is never large enough to make "other human interests" irrelevant.