top | item 38387150

synaesthesisx|2 years ago

Remember, about a month ago Sam posted a comment along the lines of "AI will be capable of superhuman persuasion well before it is superhuman at general intelligence, which may lead to very strange outcomes".

The board was likely spooked by the recent breakthroughs (which were most likely achieved by combining transformers with another approach), and hit the panic button.

Anything capable of "superhuman persuasion", especially prior to an election cycle, has tremendous consequences in the wrong hands.

sampo|2 years ago

> Remember, about a month ago Sam posted a comment along the lines of "AI will be capable of superhuman persuasion well before it is superhuman at general intelligence, which may lead to very strange outcomes".

Superhuman persuasion is Sam's area of expertise, so he would make that a priority when building chatbots.

somenameforme|2 years ago

It seems much more likely that this was just referring to the ongoing situation with LLMs being able to create exceptionally compelling responses to questions that are completely and entirely hallucinated. It's already gotten to the point that I simply no longer use LLMs to learn about topics I am not already extremely familiar with, simply because hallucinations end up being such a huge time waster. Persuasion without accuracy is probably more dangerous to their business model than the world, because people learn extremely quickly not to use the models for anything you care about being right on.

pbourke|2 years ago

Sounds like we need an AI complement to the Gell-Mann Amnesia effect.

thepasswordis|2 years ago

But they didn’t hit the panic button. They said Sam lied to them about something and fired him.

adastra22|2 years ago

According to this article, Sam had been telling the board that this new advance is not AGI and not anything to worry about (so they can keep selling it to MSFT); the researchers involved then went behind Sam's back and reported to the board directly, claiming that they'd created something that could-maybe-be AGI and that it needs to be locked down.

That's the claim at least.

meheleventyone|2 years ago

Looking at humanity, persuasion seems to be an extremely low bar! Also, for a superhuman trait, is it that it's capable of persuading anyone of anything, or rather that it's able to persuade everyone of something? Power vs. reach.

93po|2 years ago

I agree with this. Corporate news is complete and total obvious bullshit, but it overwhelmingly informs how people think about most anything.

column|2 years ago

"especially prior to an election cycle"

It looks like you are referring to the USA elections.

1. humanity != USA

2. the USA is in a constant election cycle

3. there are always elections coming around the world, so it's never a good time

alkonaut|2 years ago

I agree with this conclusion, and it's also why I'm not that afraid of the AGI threat to the human race. AGI won't end the human race if "superhuman persuasion" or "deception-as-a-service" does it first.

nopromisessir|2 years ago

I feel this could be used in positive ways.

Superhuman persuasion to do good stuff.

That'll be a weird convo what is 'good'.

ronhews|2 years ago

Understandably, the board may be concerned about the potential consequences of AI-powered superhuman persuasion, particularly during an election cycle. Combining transformers with other approaches has led to recent breakthroughs, which could have significant implications.

__MatrixMan__|2 years ago

We've built the web into a giant Skinner box. I find the claim dubious, but this is the sort of thing we ought to find at the forefront of our technology. It's where we've been going for a long time now.

Exoristos|2 years ago

Which party is "the wrong hands"?

Liquix|2 years ago

Any party with sufficient resources and motive to influence the outcome of an election. Outside of election season, this tech would be very dangerous in the hands of anyone seeking to influence the public for their own gain.

latexr|2 years ago

The original commenter didn’t mention a party. Please don’t polarise the discussion into a flame war. Whatever system exists won’t be used by “a party” all at once, but by individuals. Any of those, with any political affiliation, can be “the wrong hands”.

I’ll offer a simple definition. The role of government is to serve the greater good of all people, thus the wrong hands are the ones which serve themselves or their own group above all.

bakuninsbart|2 years ago

Both? Parties in a democracy aren't supposed to be shepherds of the stupid masses. I know manipulation and misinformation are par for the course on both sides of the aisle, but that's a huge problem. Without informed, capable citizens, democracy dies a slow death.

PaulDavisThe1st|2 years ago

Except that there's a fairly large body of evidence that persuasion is of limited use in shifting political opinion.

So the persuasion would need to be applied to something other than some sort of causative political-implication-laden argument.

hnthrowaway0315|2 years ago

Or, let's say, you don't need a lot of persuasion to guide an election. I mean we already have X, FB, and an army of bots.

hackerlight|2 years ago

> Except that there's a fairly large body of evidence that persuasion is of limited use in shifting political opinion.

The Republican Party's base became isolationist and protectionist during 2015 and 2016 because their dear leader persuaded them.

naasking|2 years ago

Even if it were true that human persuasion is of limited use in shifting opinions, the parent poster is talking about superhuman persuasion. I don't think we should just assume those are equally effective.

jjeaff|2 years ago

When you say persuasion, are you referring to fact based, logical argument? Because there are lots of other types of persuasion and certainly some work very well. Lying and telling people what they want to hear without too many details while dog whistling in ways that confirm their prejudices seems to be working pretty well for some people.