top | item 41482959


austinl | 1 year ago

Banks' work assumes that AI exceeding human capabilities is inevitable, and the series explores how people might find meaning in life when ultimately everything can be done better by machines. For example, the protagonist in Player of Games gets enjoyment from playing board games, despite knowing that AI can win in every circumstance.

For all of the apocalyptic AI sci-fi that's out there, Banks' work stands out for imagining a positive outcome for humanity (if you accept that AI acceleration is inevitable).

But I also think Banks is sympathetic to your viewpoint. For example, Horza, the protagonist in the first novel, Consider Phlebas, is notably anti-Culture. Horza sees the Culture as hedonists who are unable to take anything seriously, whose actions are ultimately meaningless without spiritual motivation. I think these were the questions that Banks was trying to raise.



elihu | 1 year ago

I suppose it's interesting that in the Culture, human intelligence and artificial intelligence are consistently kept separate and distinct, even when it becomes possible to perfectly record a person's consciousness and execute it without a body within a virtual environment.

One could imagine Banks describing Minds whose consciousness was originally derived from a human's, but extended beyond recognition with processing capabilities far in excess of what our biological brains can do. I guess as a story it's more believable that an AI could be what we'd call moral and good if it's explicitly non-human. Giving any human the kind of power and authority that a Mind has sounds like a recipe for disaster.

idiotsecant | 1 year ago

https://theculture.fandom.com/wiki/Gzilt

Banks did consider this. The Gzilt were a quite powerful race who had no AI; instead they emulated groups of biological intelligences on faster hardware, in a sort of group-mind machine.

theptip | 1 year ago

Yes, the problem is that from a narrative perspective a story about post-humans would be neither relatable nor comprehensible.

Personally, of all the potential positive AGI scenarios, I think transhumanist evolution is a much more likely positive outcome than "humans stick around and befriend AIs."

Some sort of Renunciation (a Butlerian Jihad, and/or a totalitarian ban on genetic engineering) is the other big one, but it seems you'd need a near miss like Skynet's or Dune's timelines to get everybody to sign up to such a drastic Renunciation, and that path is probably quite apocalyptic, so maybe it doesn't count as a "positive outcome."

stoneforger | 1 year ago

The Meatfucker acts as a vigilante and is unpopular because of its privacy invasions. The Zetetic Elench splintered off. The Culture's morals were tested in the Idiran war. They might not have greed as a driver, because it's unnecessary, but they do have freedom of choice, so they're not exactly saints.

akira2501 | 1 year ago

> AI exceeding human capabilities is inevitable

It can right now. That isn't the problem. The problem is the power budget and the efficiency curve. "Self-contained, power-efficient AI with a long-lasting power source" is actually several very difficult, entropy-averse problems rolled into one.

It's almost as if all the evolutionary challenges that made humans what we are will also have to be solved for this future to be remotely realizable. In which case, it's just a new form of species competition: one species with sexual dimorphism and differentiation against one without. I know which I'd bet on.

adriand | 1 year ago

> the series explores how people might find meaning in life when ultimately everything can be done better by machines.

Your comment reminds me of Nick Land's accelerationism theory, summarized here as follows:

> "The most essential point of Land’s philosophy is the identity of capitalism and artificial intelligence: they are one and the same thing apprehended from different temporal vantage points. What we understand as a market based economy is the chaotic adolescence of a future AI superintelligence," writes the author of the analysis. "According to Land, the true protagonist of history is not humanity but the capitalist system of which humans are just components. Cutting humans out of the techno-economic loop entirely will result in massive productivity gains for the system itself." [1]

Personally, I question whether the future holds any particular difference for the qualitative human experience. It seems to me that once a certain degree of material comfort is attained, coupled with basic freedoms of expression/religion/association/etc., then life is just what life is. Having great power or great wealth or great influence or great artistry is really just the same old, same old, over and over again. Capitalism already runs my life; is capitalism run by AIs any different?

1: https://latecomermag.com/article/a-brief-history-of-accelera...

Vecr | 1 year ago

Or Robin Hanson, a professional economist and kind of a Nick Land lite, who has published more recently. That's where the idea of carbon robots expanding at a third of the speed of light comes from.

BriggyDwiggs42 | 1 year ago

I just want to add that I think you might be missing a component of that optimal-life idea. We often neglect to consider that in order to exercise freedom, one must have time in which to choose freely. I'd argue that a great deal of leisure, if not the complete abolition of work, would be a major prerequisite to reaching that optimal life.

johnnyjeans | 1 year ago

Banks' Culture isn't capitalist in the slightest. It is however, very humanist.

If you want a vision of the future (multiple futures, at that) which differs from the liberal, humanist conception of man's destiny, Baxter's Xeelee sequence is a great contemporary. Baxter's ability to write a compelling human being is (in my opinion) very poor, but when it comes to hypothesizing about the future, he's a far more interesting author. Without spoilers, it's a series that's often outright disturbing. And it's certainly a very strong indictment of the self-centered narcissism in the idea that the post-Enlightenment ideology of liberalism is anything but yet another stepping stone in the eternal evolution of human beings. The exceptionally alien circumstances it details undermine the idea of a qualitative human experience entirely.

I think the contemporary focus on economics is itself a facet of modernism that will eventually disappear. Anything remotely involving that domain rarely shows up in Baxter's work. It's really hard to give a shit about it given the monumental scale and metaphysical nature of his writing.

calf | 1 year ago

But OP's and Horza's viewpoints rest on the same strawman argument. The sci-fi premise is that superhuman AIs coexist with humans, who are essentially ants.

The correct question, then, is: what ought to be the best outcome for humans? And a benevolent coexistence in which the Culture actually gives humans lots of space and autonomy (contrary to the misinformed view that the Culture takes human autonomy away) is indeed the optimal solution. It is precisely in this setting that humans nevertheless retain their individual humanity instead of taking some transhumanist next step.