top | item 43644640

kogus|10 months ago

I think we need to consider what the end goal of technology is at a very broad level.

Asimov says in this that there are things computers will be good at, and things humans will be good at. By embracing that complementary relationship, we can advance as a society and be free to do the things that only humans can do.

That is definitely how I wish things were going. But it's becoming clear that within a few more years, computers will be far better at absolutely everything than human beings could ever be. We are not far even now from a prompt accepting a request such as "Write another volume of the Foundation series, in the style of Isaac Asimov", and getting a complete novel that does not need editing, does not need review, and is equal to or better than the quality of the original novels.

When that goal is achieved, what then are humans "for"? Humans need purpose, and we are going to be in a position where we don't serve any purpose. I am worried about what will become of us after we have made ourselves obsolete.

mperham|10 months ago

> When that goal is achieved, what then are humans "for"? Humans need purpose, and we are going to be in a position where we don't serve any purpose. I am worried about what will become of us after we have made ourselves obsolete.

Read some philosophy. People have been wrestling with this question forever.

https://en.wikipedia.org/wiki/Philosophy

In the end, all we have is each other. Volunteer, help others.

quxbar|10 months ago

It depends on what you are trying to get out of a novel. If you merely require repetitions on a theme in a comfortable format, Lester Dent style 'crank it out' writing has been dominant in the marketplace for >100 years already (https://myweb.uiowa.edu/jwolcott/Doc/pulp_plot.htm).

Can an AI novel add something new to the conversation of literature? That's less clear to me because it is so hard to get any model I work with to truly stand by its convictions.

lm28469|10 months ago

You could have said the same thing when we invented the steam engine, mechanized looms, &c. As long as the driving force of the economy/technology is "make numbers bigger" there is no end in sight, there will never be enough, there is no goal to achieve.

We already live lives which are artificial in almost every way. People used to die of physical exhaustion and malnutrition; now they die of lack of exercise and gluttony. Surely we could have stopped somewhere in the middle. It's not a resource or technology problem at that point, it's societal/political.

charlie0|10 months ago

It's the human scaling problem. What systems can be used to scale humans to billions while providing the best possible outcomes for everyone? Capitalism? Communism?

Another possibility is to not let us scale. I thought Logan's Run was a very interesting take on this.

jillesvangurp|10 months ago

Evolution is not about being better / winning but about adapting. People will adapt and co-exist. Some better than others.

AIs aren't really part of the whole evolutionary race for survival so far. We create them. And we allow them to run. And then we shut them down. Maybe there will be some AI-enhanced people that start doing better. And maybe the people bit becomes optional at some point. At that point you might argue we've just morphed/evolved into whatever that is.

dominicrose|10 months ago

> I think we need to consider what the end goal of technology is at a very broad level.

"we" don't control ourselves. If humans can't find enough energy sources in 2200, it doesn't mean they didn't in 1950.

It would be pretty bad to lose access to energy after having it, worse than never having it IMO.

The amount of new technologies discovered in the past 100 years (which is a tiny amount of time) is insane and we haven't adapted to it, not in a stable way.

norir|10 months ago

This is undeniably true. The consequences of a technological collapse at this scale would be far greater than never having had it in the first place. For this reason, the people in power (in both industry and government) have more destructive potential than at any time in human history by far. And yet they act as if they have little to no awareness of the enormous responsibility they shoulder.

empath75|10 months ago

> But it's becoming clear that within a few more years, computers will be far better at absolutely everything than human beings could ever be.

Comparative advantage. Even if that's true, AI can't possibly do _everything_. China is better at manufacturing pretty much anything than most countries on earth, but that doesn't mean China is the only country in the world that does manufacturing.
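The comparative-advantage point can be made concrete with a toy calculation. A minimal Python sketch, with entirely hypothetical productivity numbers (the producers, goods, and figures are illustrative, not from the thread): even when producer A is absolutely better at both goods, B still has the lower opportunity cost for one of them, so specialisation and trade benefit both.

```python
# Output per hour of work (hypothetical numbers for illustration):
output = {
    "A": {"wine": 6, "cloth": 4},  # A is absolutely better at both goods
    "B": {"wine": 1, "cloth": 2},
}

def opportunity_cost(producer, good, other_good):
    """Units of other_good given up to produce one unit of good."""
    o = output[producer]
    return o[other_good] / o[good]

# A's cost of 1 wine = 4/6 cloth; B's cost of 1 wine = 2/1 cloth.
# So A should specialise in wine and B in cloth, even though A
# out-produces B at both -- that is comparative advantage.
for p in ("A", "B"):
    print(p, "gives up", opportunity_cost(p, "wine", "cloth"), "cloth per wine")
```

The same arithmetic is why "AI is better at everything" does not by itself imply humans produce nothing: what matters for trade is relative, not absolute, cost.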

Philpax|10 months ago

> AI can't possibly do _everything_

Why not? There's the human bias of wanting to consume things created by humans - that's fine, I'm not questioning that - but objectively, if we get to human-threshold AGI and continue scaling, there's no reason why it couldn't do everything, and better.

belter|10 months ago

- Despite the flood of benchmark-tuned LLMs, we remain nowhere close to engineering a machine intelligence rivaling that of a cat or a dog, let alone within the next 5 to 10 years.

- The world already hosts millions of organic AIs (Actual Intelligence), many of them statistically at genius-level IQ. Does their existence make you obsolete?

Philpax|10 months ago

> Despite the flood of benchmark-tuned LLMs, we remain nowhere close to engineering a machine intelligence rivaling that of a cat or a dog, let alone within the next 5 to 10 years.

Depends on your definition of "intelligence." No, they can't reliably navigate the physical world or have long-term memories like cats or dogs do. Yes, they can outperform them on intellectual work in the written domain.

> Does their existence make you obsolete?

Imagine if, for everything you tried to do, there was someone else who could do it better, no matter the domain, no matter where you were, and no matter how hard you tried. You would not be an economically viable member of society. Some could deal with that level of demoralisation, but many couldn't.

foobarian|10 months ago

> what then are humans "for"?

Folding laundry

giraffe_lady|10 months ago

Here's a passage from a children's book I've been carrying around in my heart for a few decades:

“I don't like cleaning or dusting or cooking or doing dishes, or any of those things," I explained to her. "And I don't usually do it. I find it boring, you see."

"Everyone has to do those things," she said.

"Rich people don't," I pointed out.

Juniper laughed, as she often did at things I said in those early days, but at once became quite serious.

"They miss a lot of fun," she said. "But quite apart from that--keeping yourself clean, preparing the food you are going to eat, clearing it away afterward--that's what life's about, Wise Child. When people forget that, or lose touch with it, then they lose touch with other important things as well."

"Men don't do those things."

"Exactly. Also, as you clean the house up, it gives you time to tidy yourself up inside--you'll see.”

rqtwteye|10 months ago

A while ago I saw a video of a robot doing exactly that. Seems there is nothing left for us to do.

nthingtohide|10 months ago

> Humans need purpose.

Let me paint a purpose for you which could take millions of years. How about building an Atomic Force Microscope equivalent which can probe Calabi-Yau manifolds to send messages to other multiverses.

shortrounddev2|10 months ago

You can have an LLM crank out words but you can't make them mean anything

20after4|10 months ago

Suno is pretty good at going from a 3- or 4-word concept to a complete song with lyrics, melody, vocals, structure, and internal consistency. I've been thoroughly impressed. The songs still suck, but they are arguably no worse than 99% of what the commercial music business has been pumping out for years. I'm not sure AI is ready to invent those concepts from nothing yet, but it may not be far off.

Philpax|10 months ago

Meaning is in the eye of the beholder. Just look at how many people enjoyed this and said it was "just what they needed", despite it being composed of entirely AI-generated music: https://www.youtube.com/watch?v=OgU_UDYd9lY

js8|10 months ago

> By embracing that complementary relationship, we can advance as a society and be free to do the things that only humans can do.

This complementarity already exists in our brains. We have evolutionarily older parts of the brain that deal with our basic needs through emotions, and an evolutionarily younger neocortex that deals with rational thought. They have a complicated relationship; both can influence our actions through mutual interaction. Morality is managed by both, and neither is necessarily more "humane" than the other.

In my view, AI will be just another layer, an additional neocortex. Our biological neocortex is capable of tracking the un/cooperative behavior of around 100 people in the tribe, and allows us to learn a couple of useful skills for life.

The "personal AI neocortex" will track behavior of 8 billion people on the planet, and will have mastery of all known skills. It is gonna change humans for the better, I have little doubt about it.