huberway|1 year ago
If at the apex of an organization you have a person who has organized his life in such a way as to have sex with several other people, and if many people involved in the movement pay a tithe to the organization or charities it designates, and if many of the members of this organization go crazy thinking about the impending hell (of AGI), how is this different from a cult?
bhaney|1 year ago
Viliam1234|1 year ago
She also recruited some of her cult members from the community (not sure which ones).
So, if you want, you can frame it as the rationalist community being a dangerous place that attracts sick people. Or you could frame it as anarchists being violent, vegans being intolerant, or trans women being crazy... because the Zizians are all of that. Everyone is free to choose their own story about them.
It may or may not be important that most rationalists / anarchists / vegans / trans women are not crazy murderers, so maybe the story is mostly about Ziz being Ziz and succeeding in getting a few (less than ten) followers.
nl|1 year ago
I suspect that, as with many offshoots, they hate the original community.
https://www.cbsnews.com/news/vermont-border-agent-death-expo... is a better, longer-form version of this article
[1] https://en.m.wikipedia.org/wiki/Killing_of_David_Maland
throw16180339|1 year ago
unknown|1 year ago
[deleted]
mmooss|1 year ago
rtkwe|1 year ago
palisade|1 year ago
plagiarist|1 year ago
lmm|1 year ago
nl|1 year ago
Some quotes:
> (Alignment Group) would attempt to articulate a ‘demon’ which had infiltrated our psyches from one of the rival groups, its nature and effects, and get it out of our systems using debugging tools
> there were also "psychotic breaks involving demonic subprocess narratives," and where people in positions of power would "debug" underlings. "I experienced myself and others being distanced from old family and friends, who didn't understand how high-impact the work we were doing was,"
> Scott Alexander, maybe the most prominent Rationalist besides Yudkowsky, suggested that the problem was not really M.I.R.I. or C.F.A.R. so much as that Taylor was in a cult-like group centered around a former M.I.R.I. head
> I don’t know that I have the patience or energy to really get to the bottom of it all except to say: It all kinda sounds pretty culty to me! And I haven’t even gotten into the Burning Man camp Black Lotus or the Monastic Academy for the Preservation of Life on Earth
etc
[1] https://maxread.substack.com/p/the-zizians-and-the-rationali...
Manuel_D|1 year ago
throwme0827349|1 year ago
I have met a few "rationalist" types, and I went to a "rationalist" meetup in San Francisco; they called it something else and didn't care for that label, but couldn't really get other people to stop using it.
The overall vibe was like a tech meetup crossed with a church picnic. There were a lot of programmers and grad students there to do a little professional networking, talk about books they like, whether they should be donating to charity a little, which charities worked best, and how to avoid throwing away the leftover cookies.
The subject of AI millennialism was not broached in my presence, although I did meet some people who were working on AI. If there were any psychos or cult leaders there (or trans people for that matter), I didn't notice, and no one tried to recruit me to anything. It was a totally normal and pleasant experience.
dehugger|1 year ago
fortran77|1 year ago
Probably not, at least not here.
unknown|1 year ago
[deleted]
sva_|1 year ago
This AGI doomerism, which is now also popularized on YouTube etc., is very closely related to the kind of existential questions that mentally unstable people probably ask themselves.
The bar to entry is pretty low, as you really need no skills. You can bootstrap ideas that sound convincing to yourself from nothing pretty quickly. That's my hot take anyways.
prododev|1 year ago
Unless you mean the folks who believe AI will become AGI and start hurting people directly. Those folks are pretty fringe.
hollerith|1 year ago
By "AI-doomer", I mean any person or group that believes that AI research is a threat to human survival.