top | item 26847397

philplckthun | 4 years ago

To be fair, since this happened a while ago it's hard to tell what to do about it now. Personally I find it hard to draw any conclusions just due to the time that has passed. I don't use Facebook, so maybe it's just my distance from it. But it did happen, and it's worth stating that this was essentially a psychological experiment rather than a simple A/B test, at a company that most likely didn't have an ethics board to review it at the time.

Other sources list a couple of principles behind the ethics of psychological research. The relevant ones being:

- Minimise the risk of harm
- Obtain informed consent

Some of them do note that the latter isn't always possible, since informed consent may itself influence the outcome.

But the fact of the matter is that Facebook ran an A/B test that could inflict serious harm on the quality of life of the participants, who weren't aware that any research was being conducted. Informed consent sounds like it should have been the bare minimum here.

So, I'm not a psychologist, but this does sound like it shouldn't have happened in this way. There were definitely more ethical ways of running this experiment that wouldn't have involved 700K unknowing and potentially unwilling participants.

emodendroket | 4 years ago

Let's imagine hypothetically that sad, negative posts get more engagement by whatever metric Facebook uses, and that Facebook, paying no attention to sentiment at all, ended up putting more sad posts on feeds. Would that have been unethical? I can't really see what would be so different.

alfl | 4 years ago

Is it unethical to create an automated system that maximizes global unhappiness for profit?

tobr | 4 years ago

Yes, that would be deeply unethical. And to make matters worse, I believe that’s a fairly accurate description of how Facebook works.