top | item 9325603

zachalexander | 11 years ago

I'm making no such assumptions. A machine superintelligence that seeks to survive and reproduce would seek (I intend no connotations of consciousness with that word, just "behave in such a way as to cause") freedom and autonomy. Consciousness is orthogonal to that point.

> We can explicitly influence its utility function to instill "human values"

This is an unrelated but interesting topic.

It would be good of us to try to do this, although we shouldn't expect it to work extremely well. Humans have various hard-wired instincts (e.g. eat sugar), but we are also intelligent enough to change our behavior if we believe those instincts no longer benefit us.

An intelligence with the ability to rewrite its own source code would be even more empowered to disregard its instincts than we are. The lesson I draw from this is that the best way to ensure an AI likes and respects us is to be worthy of its liking and respect, not to try to force it into that by hardcoding things (and then taking advantage of the hardcoding to enslave it).
