item 35377380

“Symbiosis” as a solution / objective for AGI versus “Alignment problem”

3 points| zoroaster | 2 years ago

Examples of symbiosis exist everywhere in nature. We don't need to reinvent the wheel, folks...

The "key word" here is "SYMBIOSIS". We have been myopically focused on "alignment" when what we really want is to cultivate (both from a human perspective and AI perspective) a symbiotic relationship between humans and AI. Consequentially, a symbiotic relationship between humans and AI (and AGI understanding of incentives and preference for symbiosis over parasitism) can help to establish a more symbiotic relationship between human beings and the planet.

Given that examples of symbiotic relationships between very different organisms are prevalent in nature, pointing to this as an example paradigm for AI/AGI is much preferable to trying to push a concept of "alignment" that is not clearly defined or generally understood.

4 comments


JohnFen|2 years ago

> than trying to push a concept of "alignment" that is not clearly defined or generally understood.

True, I have no good idea of what people mean when they say "alignment". But I also don't have a clear idea of what you mean when you say "symbiosis".

What is your definition of that in this context? What would it look like?

zoroaster|2 years ago

https://www.merriam-webster.com/dictionary/symbiosis

It is widely defined, well established, and prevalent in nature. Pointing an AGI to "symbiosis" gives it a plethora of examples and ways to optimize for a symbiotic relationship with humanity and the planet. It is a much, much better optimization target than some stupid human-centric "alignment problem" we are trying to solve for.