item 35374937

johnaspden | 2 years ago

Back in the day when Eliezer was optimistic about AI (there's a reason why the SIAI/MIRI logo looks like an angel), I wrote him a very silly e-mail warning him that, despite being Jewish, he was falling for the Christian rapture narrative in a new form!

It seemed to me that his ideas, which I think are all correct, straightforwardly implied the destruction of all things, with very little hope.

I bite that bullet, and I always have.

Over the years, Eliezer has lost all hope, and things have happened much faster than either of us thought they would, and now he's making one final roll of the dice, burning his reputation by saying out loud what he thinks, in the hope that someone might listen.

I'd be very surprised if he thinks this will work. He's trying to get a warning out to the general population, in spite of this meaning that no one inside AI will ever trust him again. He thinks it no longer matters.

Me too.


labrador | 2 years ago

Very interesting and now I'm motivated to learn more about his thoughts. I developed a fascination with this kind of thinking since I gave a friend a ride to the San Francisco airport in 1978 in my VW bus so he could fly to Jonestown, Guyana to join the Jim Jones church. I never heard from him again. Don't know if he drank the Kool-Aid. Too bad too, because he was a fantastic drummer.

I wonder if we'll see more mass suicides, like that of the Nike-wearing techies who followed Marshall Applewhite, among those who fear it's hopeless to resist AI?

labrador | 2 years ago

So I listened to the first 7:56 of this interview and imho he's out of his f*ckin' mind and spreading dangerous disinformation. "We don't know if there's a person in LLMs", "We understand more about the human mind than we do about LLMs despite having complete read access". He's insane. Nobody should listen to him.

Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization | Lex Fridman Podcast #368

https://www.youtube.com/watch?v=AaTRHFaaPG8