item 40405047

notfish | 1 year ago

The idea is that, because of the risk of a technological explosion, any sentient species is a threat.

I don’t really buy the idea though - cooperation has been the strongest strength of humanity and is one of our greatest evolutionary edges. Why wouldn’t that apply on the interstellar scale too?

Rastonbury|1 year ago

Shoot first and ask questions later; if anything, shooting first is the safer option (in terms of civilizational risk) on the chance they are thinking the same. On the flip side, when an alien civilization sees our request to cooperate, they can accept it or they can destroy us; from their perspective, we could be lying, or we could go back on our word and destroy them.

I see some criticism of dark forest theory in here, but keeping quiet and shooting first are the least risky options when the intentions or capabilities of another civilization are unknown, and making assumptions about the other side's friendliness could lead to either side's extinction.

GoblinSlayer|1 year ago

In terms of absolute agnosticism anything can lead to anything.

ImPostingOnHN|1 year ago

Humans attempt to dominate every nonhuman species within their reach, sentient or not.

Humans have not even advanced beyond attempting to dominate others of the same species.