top | item 35796515

hderms | 2 years ago

Well in real terms, doesn't it just take something that's a convincing enough forgery? That's why the Turing test was considered interesting at one point. I don't think we need to draw a line in order to be quickly caught off guard by developments in this space.

For the record, I don't believe we're close to AGI but I'm also pretty far from knowing anything about that field.

dougmwne | 2 years ago

Seems like a no lose bet no matter the odds. If you win, life goes on and you can collect a bit of money. If you lose, money becomes worthless and the surface of the earth gets transformed into a computing substrate.

blagie | 2 years ago

I don't know. Having interacted with LLMs at different levels, they resemble a very sophisticated, alien intelligence trying to pretend to be human. It's like me pretending to be a dog; even if I were to emulate a dog perfectly, I wouldn't have the same emotions; I'd be pretending.

We have no idea what emotions, motivations, behaviors, or goals AIs have or will have, or whether they'll have something as yet unconceived that's not emotions or motivations, but just alien.

We evolved to self-preserve and breed. Modern AIs evolve to pretend to write human text. It's not clear there is any intention to survive, reproduce, or turn the surface of the earth into a computing substrate.

There are a million different dangers -- and I suspect the real ones are ones we haven't conceived of. Whether they'll materialize, or how, depends on how we evolve them, and I expect we can't predict it.

To me, much more likely than earth-as-a-computing-substrate is humans-as-brainwashed-consumers. Market forces will push for AIs to write text which draws eyeballs. Those models won't care about truth, ethics, or much of anything other than getting you addicted to reading what they write (or watching what they create). At that point, we can destroy ourselves just fine.

But even more likely is something no one has thought of.

danaris | 2 years ago

This presupposes that "AGI" means:

1) The first AGI we create will immediately break free of our control

2) It will either have already been given, or will find some way to take, control of physical systems

3) It will create the Singularity

4) Its goals will be to advance itself at our expense

None of these are remotely givens. Even if we grant that AGI is possible with our current level of technology (which is also not at all a given), the Singularity is nothing but science fiction. It's an interesting idea, but there's no real reason to think it's close to what an AGI's capabilities would be like in reality.

staticman2 | 2 years ago

There's lots of science fiction that doesn't have A.G.I. destroying the planet.

Commander Data, HAL 9000 (a bad guy, but not that bad), Asimov's robots -- even the all-powerful A.I. of Neuromancer has no particular ill will towards humans.

Belief in AGI hardly means you need to believe in Armageddon.