This is referenced in the sci-fi novel "The Dark Forest", the second book of the "Three-Body Problem" series. It builds a convincing narrative: because of the time that observation and response take, and the speed at which societies develop, any civilization that announces itself will most likely, eventually, become a technological threat to whoever is observing it. In other words, we don't hear anything because any sufficiently advanced civilization would not want to risk being discovered. Hence the "dark, silent forest".
Balgair|4 months ago
I dunno, it just reeks of the culture of suspicion in communist China. A product of that place and time.
My own idea is the 'used car salesman' view of the universe (reeking of my own mind, place, and time). To me, economics will rule the galactic community. In that view, water, metals, energy, it's all cheap and everywhere; no need for any competition over it. No, the only scarce thing is life, and scarcer still is intelligence. Any other civilization will be desperate to get the rights to us and our history.
So, to me, the aliens will come to us loud and proud. Balloons and banners.
And of course, a contract as long as the rings of Saturn, with print as small as atoms.
We shouldn't be wary of the weapons, but of the lawyers.
protocolture|4 months ago
Charles Stross' Singularity Sky seems the most reasonable to me. Superintelligent computers trade unimaginable technology (their infinitely replicable trash) for their most sought-after asset (new forms of entertainment), and then just piss off to another world having completely bent our cultural development.
vecter|4 months ago
1. Survival is the primary goal of all civilizations.
Agree.
2. Resources in the universe are finite.
True in the theoretical sense, but false in the practical sense.
3. Civilizations cannot be certain of others’ intentions.
Not obviously true or false.
4. Communication is dangerous.
This is such a strong axiom and is almost certainly false.
The book's conclusion from applying these four axioms is that preemptive annihilation is the rational strategy.
As an alien civilization, if your strategy for survival in the cosmos is to "immediately and totally annihilate any sign of life", then that is almost surely a losing strategy. If intelligent life is prevalent, and the cost of annihilating a species is so low that you can do it willy-nilly, then all it takes is one surviving colony to use the same superweapon against you and you're finished. Oh, and you'd also have to be annihilating species left and right across the galaxy without ever revealing your own location. In the worst case, you've just pissed off all the known alien entities in your galactic neighborhood. Good luck to you.
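The "one surviving colony" point can be made quantitative with a toy probability sketch (all numbers and the per-strike survival rate are hypothetical, purely for illustration, not from the book):

```python
# Toy model: a "first-strike" civilization attacks n_targets neighbors.
# Assume each strike independently leaves behind a retaliating survivor
# with probability p_survivor, and any single survivor finishes the
# striker off. The striker survives only if NO strike leaves a survivor.
def striker_survival_probability(n_targets: int, p_survivor: float) -> float:
    """P(no target leaves a retaliating survivor) = (1 - p)^n."""
    return (1.0 - p_survivor) ** n_targets

# Even a tiny 1% per-strike failure rate compounds quickly as the
# number of annihilated civilizations grows:
for n in (10, 100, 1000):
    print(n, striker_survival_probability(n, 0.01))
```

With p = 0.01, survival odds are still decent after 10 strikes, drop to roughly a third after 100, and are essentially zero after 1000, which is the commenter's point: exterminating "left and right across the galaxy" makes eventual retaliation nearly certain.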
It makes for fun writing, but I don't understand how anyone can take it seriously.
lelanthran|4 months ago
> Not obviously true or false.
"Intentions are uncertain" is true, though.
If you are claiming that it is possible to be certain of other civilisations' intentions, I am very skeptical.
wernerb|4 months ago