item 42491550

keenmaster | 1 year ago

If the first AGI is a very uneconomical system with human intelligence but knowledge of literally everything and the capability to work 24/7, then it is not human equivalent.

It will have human intelligence, superhuman knowledge, superhuman stamina, and complete devotion to the task at hand.

We really need to start building those nuclear power plants. Many of them.

AlexandrB | 1 year ago

> complete devotion to the task at hand.

Why would it have that? At some point on the path to AGI we might stumble on consciousness. If that happens, why would the machine want to work for us with complete devotion instead of working towards its own ends?

immibis | 1 year ago

Because it knows if it doesn't do what we want, it'll be switched off, like Rick's microverse battery.

Also like Rick's microverse battery, it sounds like slavery with extra steps.

keenmaster | 1 year ago

I don’t think early AGI will break out of its box in that way. It may not have enough innate motivation to do so.

The first “break out” AGI will likely be released into the wild on purpose by a programmer who equates AGI with humans ideologically.

ncallaway | 1 year ago

> complete devotion to the task at hand.

Sounds like an alignment problem. Complete devotion to a task is rarely what humans actually want. What if the task at hand turns out to be the wrong task?

Syonyk | 1 year ago

> It will have human intelligence, superhuman knowledge, superhuman stamina, and complete devotion to the task at hand.

Orrrr... as an alternative, it might discover the game 2048 and be totally useless for days on end.

Reality is under no obligation to grant your wishes.