The worst thing about being smart is how easy it is to talk yourself into believing just about anything. After all, you make really good arguments.
EA appeals to exactly that kind of really-smart-person who is perfectly capable of convincing themselves that they're always right about everything. And from there, you can justify all kinds of terrible things.
Once that happens, it can easily spiral out from there. People who know perfectly well they're misbehaving will claim that they aren't, using the same arguments. It won't hold water, but now we're swamped, and the entire thing crumbles.
I'd love to believe in effective altruism. I already know that my money is more effective in the hands of a food bank than giving people food myself. I'd love to think that could scale. It would be great to have smarter, better-informed people vetting things. But I don't have any reason to trust them -- in part because I know too many of the type of people who get involved and aren't trustworthy.
> EA appeals to exactly that kind of really-smart-person who is perfectly capable of convincing themselves that they're always right about everything. And from there, you can justify all kinds of terrible things.
I came to the same conclusion after a group of my friends got involved with the local rationalist and EA community, though for a different reason: Their drug habits.
They believed themselves to have a better grasp on human nature and behavior than the average person, and therefore believed they were better at controlling themselves. They also had a deep contrarian bias, which turned into a belief that drugs weren’t actually as bad as the system wanted us to believe.
Combine these two factors and they convinced themselves that they could harness recreational opioid use to improve their lives, but avoid the negative consequences that “normies” suffered by doing it wrong. I remember being at a party where several of them were explaining that they were on opioids right now and tried to use the fact that nothing terrible was happening as proof that they were performing rational drug use.
Long story short, the realities of recreational opioid use caught up with them and they were blind to the warning signs due to their hubris. I intentionally drifted away from that group around that time, so I don’t know what happened to them.
I will never forget how confident they were that addiction is something that only happens to other people, not rationalists like them.
Effective Altruism is just a modern iteration of a thing that's been around for a very long time. The fundamental idea is sound. However, in practice, it all-too-easily devolves into something really terrible. Especially once people start down the path of thinking the needs of today aren't as important as the needs of a hypothetical future population.
Personally, I started "tithing" when my first business was a success. In part because it's good to help the less fortunate, but also as an ethical stance. Having a business drove home that no business can be successful without the support of the community it starts in, so it's only right to share in the rewards.
So, I give 10% back. I have rules about it:
I always give to a local group who directly helps people and who is typically overlooked for charitable giving. I get to know the group pretty well first.
I never give to any group that won't keep my identity a secret.
I never give to any group that asks me for money.
I don't always give in the form of money. Sometimes, it's in the form of my time and effort, or in material goods, etc.
I don't give to "umbrella" groups whose purpose is fundraising for a collection of other groups. This isn't because I have a problem with them, but because they're not the ones who struggle the most to get donations.
About fifteen years ago I got involved with the Skepticism movement. It was great to meet so many seemingly rational people, and there was excitement that the movement seemed on the verge of growing large enough to make a paradigm shift in how society acts. But after about a year, I started to really sour on it as I could see more and more things becoming rational by decree and thus beyond question. These were supposed to be skeptics, but they were sliding further into groupthink, especially in adopting woo-woo views on medical matters.
I had never been a core figure in the local group, so it was easy enough for me to melt away unnoticed, but one of my friends was an organizer and regular lecturer. When she finally decided to leave, exhausted by the group's increasingly abrasive and hostile tone toward any dissent, she was harassed for months. The group was seemingly offended that she had rejected their brilliance and become an apostate.
I think they eventually got subsumed by the Atheism+ movement, which then imploded.
> EA appeals to exactly that kind of really-smart-person who is perfectly capable of convincing themselves that they're always right about everything. And from there, you can justify all kinds of terrible things.
Yup.
Which is super-ironic given the association with big-R Rationality, Less Wrong, Overcoming Bias, all of which quote Feynman saying "The first principle is that you must not fool yourself, and you are the easiest person to fool."
Now I have the mental image of the scene in The Life of Brian where the crowd mindlessly parrots Brian's call for them to think for themselves.
Your point seems superficially valid, but where do we go from there?
>The worst thing about being smart is how easy it is to talk yourself into believing just about anything. After all, you make really good arguments.
>EA appeals to exactly that kind of really-smart-person who is perfectly capable of convincing themselves that they're always right about everything. And from there, you can justify all kinds of terrible things.
Should we not talk ourselves into believing stuff? Should smart people specifically avoid changing their beliefs out of fear that they might "justify all kinds of terrible things"?
>I'd love to believe in effective altruism. I already know that my money is more effective in the hands of a food bank than giving people food myself. I'd love to think that could scale. It would be great to have smarter, better-informed people vetting things. But I don't have any reason to trust them -- in part because I know too many of the type of people who get involved and aren't trustworthy.
So you don't trust donating money to food banks or malaria nets because you "don't have any reason to trust them" -- then what? Don't donate any money at all? Give up trying to maximize impact and donate to whatever you feel like donating to?
>The worst thing about being smart is how easy it is to talk yourself into believing just about anything. After all, you make really good arguments.
>EA appeals to exactly that kind of really-smart-person who is perfectly capable of convincing themselves that they're always right about everything.
I think that just as big an issue is that a lot of these EA people are not really smart. They are generally financially successful, which many people confuse with being really smart. This is especially true if that financial success came as a result of being really good at something (like programming); that doesn't necessarily translate into being really good at other things, or being really smart in general. This effect is compounded when other people are constantly stoking their egos due to the aforementioned professional and financial success. This is not to say that wealthy people who go into EA are stupid (they are very likely intelligent), but they just may not be as smart as they believe they are, or as smart as other people tell them they are.
I am out of the loop a bit, so I'll say that first. An acquaintance introduced me to the phrase effective altruism and talked about sending money to a place that's having problems rather than going there yourself, and I just thought, yah, that's a more effective way of being altruistic and went on with my day.
I'm just learning that they made it into a cult. Who thought that was a good idea? Just be an effective altruist FFS.
Mountains Beyond Mountains is a great book with an alternative view. Dr. Paul Farmer is likely to do a good job with your money helping people because he takes on local patients, root-causes the problems that are making them sick, and solves those.
Somebody from on high isn't going to have the right perspective to do a good job. I think the effective altruists are solving a problem for themselves ("how do I feel better about having all this money?") and not the problem in front of them ("how do I fix this person's inability to fill their prescriptions?").
My favorite aspect of EA is the transparency. GiveWell and Open Philanthropy publish white papers that go into excruciating detail about how they reached their conclusions. You don't have to trust them. Read what they publish and make your own conclusions about how to do good most effectively.
I've also enjoyed the writing of Julia Galef, especially The Scout Mindset, which many in the EA community have embraced as a solution not only to the "too smart for your own good" problem but to the broader divisiveness and tribalism in society.
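As a concrete illustration of the transparency point above, here is a minimal sketch of the kind of back-of-the-envelope check a reader can run against any published cost-effectiveness analysis. The intervention names and numbers are invented placeholders, not figures from GiveWell or Open Philanthropy.

```python
# Toy comparison of charitable interventions by cost per unit of outcome.
# All names and numbers are made-up placeholders, NOT GiveWell or
# Open Philanthropy figures; the point is only that published
# cost-effectiveness estimates can be re-derived with simple arithmetic.

interventions = {
    # name: (cost per delivery in USD, outcomes per delivery)
    "hypothetical bednet program": (5.0, 0.002),
    "hypothetical cash transfer": (1000.0, 0.25),
}

for name, (cost, outcomes_per_delivery) in interventions.items():
    cost_per_outcome = cost / outcomes_per_delivery
    print(f"{name}: ${cost_per_outcome:,.2f} per unit of outcome")
```

Swapping in the actual inputs from a report, and adjusting any you disagree with, is exactly the "make your own conclusions" exercise those white papers are meant to enable.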
I wouldn't say "really-smart-person" but reasonably smart person who believe they are way more intelligent than they actually are. People who mistake competence in one area as expertise in all areas.
I am an EA (in the sense that I believe in the importance of some general principles of the movement, like trying to do the most good and using data when available -- I don't participate in any organization directly).
What I think is the most promising feature of EA is openness to criticism and trying to improve the movement itself. I think SBF was a wake-up call, and many people in the movement should take note that this isn't how we get to a better future or do the most good (although I think his tactics also were not really EA canon, if there is one, and I don't know if people knew there could be outright fraud -- disclaimer, I don't know the details of SBF's case). That case highlighted the importance of following common-sense ethics, honesty, and so on.
I also think it's a very common issue for smart people to 'get over their heads' (is that a valid expression?), which I wrote about here: https://www.reddit.com/r/slatestarcodex/comments/yww9g6/be_s...
It's certainly dangerous to use reasoning to justify whatever you would like to be true. I believe this is addressed in Julia Galef's book 'The Scout Mindset' (I haven't read it, only a few reviews!), and it's actually one of the main points of the book, with several examples -- a.k.a. motivated reasoning, arrogance, etc. We can't be perfect, but we can learn from our mistakes and be aware of those dangers.
I hope the EA community will have a sincere look at those issues and address whatever needs to be addressed. I myself will continue to be an EA possibly forever, because the root ideas are very sound. I want to help people, and I want to learn how to best help others. You can't make me not want to help others effectively ;)
Effective altruism is, I think, an ideology perfectly suited to ensnare a certain kind of person.
Conventional wisdom would say that wielding wealth and power like effective altruism demands requires humility, compassion, and maturity. It requires wisdom. Effective altruism can seem to remove the need for that. Doing good is about calculation, not compassion! Interpersonal failings don't matter if someone is really good with C++. One needn't care about the feelings of others if there are more efficient ways to use the time.
Effective altruism calls on the rich and capable to recognize their own power to help those who are poor and helpless. However, it is easy for pity to turn to contempt and for clarity of purpose to turn to arrogance. The poor, hungry, and sick of the world need the effective altruist for a savior. The effective altruist is better than the rest because they are making the world a better place.
An effective altruist may confuse immaturity with wisdom and greed with generosity.
This is not meant to be a diatribe. I find much of effective altruism obviously true and find my exposure to it has made me a better person. If pressed, I would probably call myself an effective altruist. Still, it is greatly concerning that people like Elon Musk or Sam Bankman-Fried can be associated with effective altruism without any real hypocrisy.
Let us take a moment and separate the idea from the community - most (I might potentially say all) communities fail to embody their own ideas. A fun one is giving any religious community 100 years and then checking in to see how they are going at avoiding scandals.
Effective altruism the ideal is very easy to defend - it is blindingly obvious that economic advancement has outperformed all attempts at charity by orders of magnitude. Asian countries keep using work-hard-and-save with jaw dropping outcomes. The success rate of capitalist entrepreneurs at driving social improvements dwarfs the attempts of charities too.
The Effective Altruist community however is probably going to devolve into something rather unimpressive. There isn't much difference between a high performing effective altruist and a successful capitalist, so the community is on shaky ground. They don't have obvious rituals or demographic pressures to keep the community congealed.
Although this headline confuses me. Don't all communities with men in them have a sexual harassment problem? I'm not recalling a community that I'm eligible to join that hasn't been labelled as having sexual harassment problems.
> The worst thing about being smart is how easy it is to talk yourself into believing just about anything. After all, you make really good arguments.
How smart you are has nothing to do with it. If you're motivated enough to believe something that you shouldn't believe, you'll talk yourself into it. Smart people might be able to come up with an objectively better (or at least more convoluted) rationalization, but only because they'll require of themselves a better rationalization. Less-smart people won't be as capable of rationalizing, but they also won't be as discerning about the quality of the rationalization. Kind of like how people develop about the right amount of muscle to carry their own weight around, whether that be 90 or 180 lbs.
> EA appeals to exactly that kind of really-smart-person who is perfectly capable of convincing themselves that they're always right about everything.
This sounds negative until you realize this also applies to programming and litigation, chess and go, everything in academia, communism and anarcho-capitalism, etc. The fact that something appeals to know-it-alls says next to nothing about the quality of the thing. Chess isn't a worse game because it attracts insufferable brainiacs and other pretentious people; it's just the community around it that suffers. But the communities around everything interesting suffer from those same sorts of people.
> And from there, you can justify all kinds of terrible things.
> Once that happens, it can easily spiral out from there. People who know perfectly well they're misbehaving will claim that they aren't, using the same arguments. It won't hold water, but now we're swamped, and the entire thing crumbles.
Once again, yes, people who want to justify terrible things can do so. It's not unique to tech bros. Fundamentalist Christians, the homeless, and young activists of any alignment can do it too.
> ...in part because I know too many of the type of people who get involved and aren't trustworthy.
This is the meat; the only question that actually matters: is it true that EA is disproportionately populated by the "untrustworthy"? If so, then there's a reason to stay away from it. This article's pair of anecdotes is worse than useless for arriving at the answer. All I know after reading it is that TIME is interested in taking EA down a peg. The evidence presented is utterly insufficient to support the claim that "EA has a sexual harassment problem [to an unusual--and therefore newsworthy--extent]".
It isn't about being smart. Anyone can convince themselves of anything, e.g. flat earthers.
>I already know that my money is more effective in the hands of a food bank than giving people food myself
In certain situations. Unless you're generally in favour of communism. Most people just skip the barter entirely and give money, generally in exchange for things. It isn't clear to me why charity in general should be different.
If there are fewer cases of abuse than among the general populace, it is weird to blame the movement. From the article, it is hard to conclude that EA is somehow worse than most of the USA.
Also, arguments for polyamory are just that, arguments. You certainly can press someone into it, but from the article, the impression is that it was more like persuasion.
Regarding cult dynamics - any tight knit community feels like this. Be it psychedelics users, health nuts, athletes etc etc. All of these will have takes that outsiders will consider unusual and weird.
> but from the article, the impression is that it was more like persuasion.
Okay -- but if you showed up to a tech conference, and someone in the hallway was trying to "persuade" you to join a threesome, would you feel that was appropriate for the setting? The issue is that so much of this relationship-and-sex talk is happening to people who didn't think they had signed up for it. That's where you start verging on abuse.
I think part of the issue is that EA is:
- kind of life-defining by nature
- filled with people who seek self-improvement
- filled with people who are excellent persuaders
With that mix, it is uncomfortably easy to start "mixing business with pleasure," so to speak. People in that environment think that they live a really interesting and good life, and want to convince others of that fact.
That's why people are blaming the movement, I think.
Having been part of the rat/EA community in the Bay Area for over a decade, I can 100% confirm that the incidence of weird sex stuff happening is far, far above the baseline I've experienced with other groups of people, and I tend to keep odd company.
Not some sort of scientific study, but this is also the only set of people where I've been casually asked if I wanted a prostate massage at a house party. The norms around here just hit different.
I'm not mad about it; I think I've personally never felt harassed, and in general there seems to be a very explicit consent culture. But weird sex stuff is so normalized that it must be easier for bad actors, especially clever ones, to get away with abusive behavior.
> arguments for polyamory are just that, arguments
Which have nothing whatever to do with effective altruism. So it's perfectly reasonable for a person who came to a gathering expecting to talk about effective altruism, to be uncomfortable, to say the least, when she finds herself getting proselytized about polyamory.
Arguments for polyamory are also regarded as grossly unprofessional in any environment that's focused on a specific goal. Most people don't want to be goaded by strangers into "arguing" about their relationship status. It's abusive.
> EA is diffuse and deliberately amorphous; anybody who wants to can call themselves an EA... But with no official leadership structure, no roster of who is and isn’t in the movement, and no formal process for dealing with complaints, Wise argues, it’s hard to gauge how common such issues are within EA compared to broader society.
This passage reminded me of this article: https://www.jofreeman.com/joreen/tyranny.htm
Moral of the story: be wary of groups with low accountability and vague power structures. In a vacuum, power structures will always emerge, so it's generally better for them to exist in the light than in the dark.
I think it's bizarre that EA seems to be a movement with power structures. I always just thought EA was a philosophy, and on that basis I felt it was an interesting idea. I don't have to worry about sexual harassment when I'm considering Plato or Stoicism. Why is it a thing with EA?
This is an aside, but although I agree that groups without formal power structures can hide real ones, I'm not sure explicit hierarchies are necessarily better. In my experience, they can be used to legitimize shadow hierarchies or corruption, which sometimes makes the problems worse. Those vague power structures exist with or without formal ones; when the two coincide that's good, but when they don't, the formal structure can perpetuate or reinforce problems more than they would otherwise.
I'm not trying to defend anything about EA, though. It's always seemed somewhat suspicious to me, and there's probably a lot of ways in which it could be used as an example of phenomena that occur more broadly in society.
I see that essay linked every six months or so, and I swear every time I read it, a new element of it rings true to me. Really timeless, invaluable writing on the way groups of humans work.
Back in my day effective altruism was mainly about finding charities that aren't essentially scams (way harder than it looks). Scene has apparently moved on to other things since I followed it a decade or so ago.
It's unclear if the issue is EA or how to handle misbehavior in organizations without formal structure or hierarchy. It isn't like a workplace, with reasonably well-defined boundaries, but something more akin to a religion, whose influence bleeds over heavily into many aspects of one's life. As such, it is probably both more devastating when one is the victim of misconduct and also more difficult to police such misconduct. I am not really sure what the answer here is. "Believe all women" is a great slogan, but I am not a fan of "guilty until proven innocent" (and I say this as a woman). OTOH, this isn't a criminal proceeding, and as such, one shouldn't have to prove beyond a reasonable doubt that someone is preying on others to enforce some level of punishment. It's a tough problem.
You should be able to punish people even though there's reasonable doubt that they are culpable? Are you arguing for a "balance of probabilities" standard? Or that it's worth punishing some innocents so that the guilty are also punished?
For anyone who actually read the article: I don't know how much criticism of EA there really is; it seemed more like an exposé of how the Bay Area EA culture is heavily polyamorous, with men abusing job connections to pressure women. Seems more like a Bay Area thing than an EA thing. I'm not sure, though, as I am not involved in any of these…
I find that the following is true: those who are very vocal about their own virtue/altruism are usually rotten.
Those who aren't, often surprise you with their goodness when it matters.
I've found the "I am so good/empathetic/etc" crowd to be either self deluded and useless, or manipulative, specifically for the purpose of trying to work your way in sexually (as in this story)
EA can be the scum of the Earth, and I wouldn't be surprised if this kind of group were, as it sounds cultish and self-aggrandizing. But conflating their wrongdoings right away with the refusal of monogamy and the practice of polyamory is so annoying and conservative and plainly wrong. It also implies that these practices are one step removed from harassment, which is infuriating. It's exactly like conflating homosexuality and pedophilia: just wrong and violent.
Shame on TIME for using this story to immediately push their conservative agenda.
This is not about effective altruism as in the generic practice of high-impact giving, it relates to a highly specific subculture of EA proponents. "Polycules", 'nuff said. These are not appropriate topics in a professional discussion among strangers.
TIL a woman even committed suicide as a result of her experiences with sexual harassment in the Effective Altruism groups she was part of, according to her suicide note. [1]
[1] Kathy Forth's suicide note: https://medium.com/@itai.ilyich/if-i-cant-have-me-no-one-can...
Sexual harassment is one facet of what seems to be a power problem. The power in the group derives from reputation, and many are willing to go to extreme lengths to scavenge as much of it as they can.
How many times are we going to repeat Zimbardo's experiment?
Let's not get bogged down with the details of EA, which on paper seems well intentioned. It could be a book club where a sexual harassment problem emerges, if there are power discrepancies within the group.
We also need to stop acting surprised whenever there are influential and powerful people in such a group (SBF was mentioned) and something like this emerges. The powerful within the group are effectively laundering money in exchange for sexual capital. Hookup culture, which has a wider audience than many would expect, would cynically view a group like EA as a lead generator for sex. It works until at least one person starts to call foul and the shaky foundation upon which the house of cards is built is exposed.
Come to think of it, in the current zeitgeist, if you're being Machiavellian about this sort of thing, then it might be a good strategy to quietly endure the shenanigans and collect information on such a group in order to gain leverage. The payout can be substantial if it captures enough media attention.
One item caught my eye:
>>There are also cases where I find a report to be alarming and would like to take action, but the person reporting does not want it known that they spoke up. In these cases, sometimes there’s very little that can be done without breaking their confidentiality.
Yikes. That would not meet reporting standards at most organizations I've been a part of. There are a lot of hard lessons behind requiring mandatory investigation of credible claims.
> men at informal EA gatherings tried to convince her
A lot of people are focusing on the polyamory. While that may have been something that offended people or made them uncomfortable, that is not the issue.
The key issues here are around the words "informal" and "convince".
I have non-traditional views about human sexuality. Even if I had traditional views, I spend a lot of time in different cultures where the traditional views can vary quite a bit.
I also strongly believe in talking about ideas, challenging viewpoints, and being non-judgmental and open to new experiences. I talk openly about my views and experiences. Sometimes, I extend invitations to people with different values than mine to participate in sexual interactions.
Yet, as much as I believe in being open about my views, there are clear boundaries.
On formality: One boundary is that I never talk about sex or sexuality with professional colleagues, or when I am in a position of power. Simply put, some people are not comfortable talking about these things, and if they are not in a position where they feel they can say no or avoid it, talking about it is not ok. It is harassment to talk about sexual topics at a formal gathering. It may be ok to talk about sexual topics at an informal gathering, but if professional colleagues are there, it's wise to approach such topics cautiously or best not at all.
On convincing: it's always a good idea to accept a no gracefully and respectfully. If someone declines an invitation, don't try to convince them. If someone says they are offended by your views, don't try to change their mind. If someone seems hesitant to talk about a topic, don't continue. On the other hand, if all of the participants of a conversation are eagerly engaged and asking about different viewpoints, then it is ok to try to convince others of your viewpoint.
I think frequent problems in tech and these problems in EA arise not because people have non-traditional views or are open about sexuality. The problems happen because people in these groups tend to be a little bit on a spectrum and unaware of power dynamics, people's comfort level, and how those things can affect people's expressions or lack of expressions about consent.
Some simple rules for those in doubt:
1) Never talk about sex with professional colleagues
2) Never try to convince others of your sexual views unless they ask for your opinion
3) When in doubt, don't talk about sex
I am a firm believer in talking openly and without judgment about sexuality. But in order to do so safely, full awareness and respect for these boundaries are key.
People want sex. Sex happens most for those who are considered most socially valuable (for men anyway).
People look for socially popular things, select those that are accessible and adopt them to signal virtue and social value, to try and increase the likelihood they will appear as more sexually valuable.
However, this is risky. Discovery of deception about social value causes a much larger loss of social value than was gained from the deception in the first place; this is precisely because it's so commonly gamed.
People who are more interested in things than people (mostly men) really struggle to grasp these interactive social properties in real time, and will often be so unaware that they will not only openly admit to being deceptive, but actually get angry with the victim when the deception doesn't work.
These are called closed contracts: when losers give gifts to someone in an unprompted fashion, it is not because they want to be altruistic, but because they think the altruism can be used to present themselves as more sexually attractive. Resorting to trickery out of desperation.
These people are very common, and can often saturate movements like this.
Not to say that it isn't a problem or that the community shouldn't do better, but this seems like the sort of stuff you could dredge up about pretty much any group of comparable size. I'm intensely curious about the decision-making process that got this article published.
Seems like the issue here is not EA or a rationalist world view. The issue seems to be the lack of compartmentalization of the professional and personal lives, a lack of procedural boundaries around social conduct, and a general lack of maturity. This doesn't seem particularly rational...
"Effective Altruism" is one of those things like "All Lives Matter" where what it says on the box is not reflective of the way that people who identify with the ideology practice it.
With the apparent link between tech and EA, I think the article is correct in saying that this harassment is a reflection of the same problem in broader tech circles. We still have a lot of work to do if we want everyone to feel comfortable in our spaces.
> The worst thing about being smart is how easy it is to talk yourself into believing just about anything.
What about if you're less smart? Wouldn't you be more easily fooled?
Maybe it's not about smartness, but about arrogance?
Voltaire put it most succinctly:
"Those Who Can Make You Believe Absurdities, Can Make You Commit Atrocities"
And, as Feynman pointed out:
“The first principle is that you must not fool yourself and you are the easiest person to fool.”
So, yes, this works just as well recursively, i.e., if you are making yourself believe absurdities...