Product Person #1:
Hey, here’s a writeup that says if we show too many positive posts to people that it creates bad feelings. Should we show less positive posts?
Product Person #2:
Not sure, wouldn’t negative posts make people feel worse?
Product Person #1:
Wow, I dunno, maybe we should try adjusting the mix of posts people see a bit in one direction and see what happens?
Product Person #2:
Not a bad idea, how about we try both? You know, an A/B test.
Product Person #1:
Hmmm. Ok, but— when we’re done, let’s have some scientific review of our data just so that we can correct the record and push along the science around this stuff.
Journalist:
This company deliberately made people sad
That's the definition of an unethical experiment design and would be considered academic misconduct if that were academic work. Regardless of intent, fb probably shouldn't be allowing front end devs to design psychological experiments at mass scale.
The constant derision of journalists in tech circles (especially on HN) is kind of shocking to see. Did you read the article, or did you miss that the journalist is relaying fellow researchers' apprehension about Facebook being allowed to conduct an unethical study? This article [1] is linked in the second paragraph.
I admit I didn’t read the article because I thought I knew the event it was referring to, but didn’t Facebook actually publish this as a psychology study? i.e. they didn’t just use it for A/B testing features but actually thought they were doing something “good” to the point of publishing a study about it. It’s laughable now lol.
It seems like you’re implying that, by explaining in slightly more detail how this might have happened, you somehow show that the journalist was oversimplifying or distorting things. But, no, your last sentence is definitely still correct.
I still don't see why I should be outraged over this. A user of facebook already consents to facebook displaying whatever content on their feed facebook chooses. Why should facebook require extra consent to gather scientific evidence on the effects of the content being displayed, given they are only analyzing data they already gather?
To be fair, since this happened a while ago, it's hard to tell what to do about it now. Personally I find it hard to draw any conclusions from this, just due to the time that has passed. I don't use Facebook, so maybe it's just my distance from it. But it did happen, and it's worth stating that this was basically a psychological experiment and not a simple A/B test, run at a company that most likely didn't have an ethics board to review it at the time.
Other sources list a couple of principles behind the ethics of psychological research. The relevant ones being:
- Minimise the risk of harm
- Obtain informed consent
Some of them do state that the latter isn't always exactly possible, since that may influence the outcome.
But the fact of the matter is that Facebook ran an A/B test that could inflict serious harm on the quality of life of the participants, who weren't aware of any research being conducted. The latter sounds like it should have been the bare minimum here.
So, I'm not a psychologist, but this does sound like it shouldn't have happened this way. There were definitely more ethical ways of running this experiment that wouldn't have involved 700K unknowing and potentially unwilling participants.
Because "consent" is worthless if it is not informed consent. And it's neither expected by, nor in the interest of the user to be emotionally manipulated.
By the way, the whole "intelligent" newsfeed is likely not expected either. My guess is, most people sign up to Facebook expecting to see the updates of all their friends, not to be subjected to Facebook's AI games and product research.
This is unethical psychological research against non-consenting adults and children that likely caused real harm. From a paper cited below,
> This Facebook study was conducted without consent and without appropriate oversight, and may have harmed both participants and non-participants. Kramer’s apology also puts the vast number of participants in context; a full 0.04% of Facebook’s users, or 1 in 2500, were unwitting subjects in this unethical research. Many of these people were almost certainly children, and many of the participants were probably suffering from depression. It is surprising and worrying that one of the world’s most prominent companies should treat both the emotions of its users and research ethics so carelessly. Steps must be taken to ensure that international psychological and medical studies involving social network users are regulated to the same standard as other human subjects research.
> given they are only analyzing data they already gather?
I'm not sure what this means. But they did not only analyze existing data, they created new data specifically intended to make people feel sad, then analyzed that data.
How sad did this experiment really make people? The chart in the study says that people who saw fewer positive posts used one percent fewer positive words and about 0.3% more negative words. But on the other hand, they were seeing fewer positive posts, so maybe they were just replying to the posts they saw in a way that was natural, without their inner emotional state being very affected.
As a former FB employee who worked on ranking, I just want to point out how absolutely tiny the effect size is here when we think about normal human emotional experience. On a per person basis, this experiment had an effect that is comparable to other potential changes like slightly altering the padding on the "like" button, or showing 1 extra news post per day, or sending an extra notification once per month.
Facebook didn't make people sad. It made a population post things with slightly more negative words, only significant when measured across hundreds of thousands of users.
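The effect-size point above can be made concrete with a rough two-proportion z-test. All the rates and sample sizes below are invented for illustration (the real study involved roughly 689K users and reported shifts on the order of 0.1% of words); this is only a sketch of why such a tiny shift is detectable at Facebook scale but invisible in a small sample.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z-statistic for the difference between two proportions."""
    # Pooled proportion across both groups
    p = (p1 * n1 + p2 * n2) / (n1 + n2)
    # Standard error of the difference under the null hypothesis
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical negative-word rates: 1.7% in control, 1.8% in treatment
# (a 0.1 percentage-point shift, similar in magnitude to the study's).
control, treatment = 0.017, 0.018

# With ~345K users per arm, the tiny difference clears the usual
# 1.96 significance threshold...
print(two_prop_z(treatment, 345_000, control, 345_000))

# ...but with 1,000 users per arm, the same difference is pure noise.
print(two_prop_z(treatment, 1_000, control, 1_000))
```

Under these made-up numbers, the z-statistic is above 3 at the large sample size and well below 1 at the small one, which is the sense in which the effect is "only significant when measured across hundreds of thousands of users."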
This was the last straw for me; that's when I permabanned facebook from my life.
This was an unauthorized, unguided, unethical, mass psychological experiment on human beings. Anyone involved should have gone to jail for crimes against humanity.
People keep using the words "psychological experiment" as if it proves the point. No, it's just a more emotional description of what happened. Why, exactly, does controlled manipulation to gather scientific data require extra consent, when they already have consent for the random manipulation they do anyway?
No it's not a crime against humanity, let's not go that far and let's not unreasonably promote cancel culture by burning the witch at the stake before giving the issue some deep thought.
It's a much more complicated issue than you describe. Many corporate organizations have been selectively distributing information to deliberately influence sentiment for decades. Facebook is not the first, nor will they be the last.
Restricting any entity from doing this is a complicated issue that has to do with restricting freedom of speech.
Fox News comes to mind when I think of another organization that does this deliberately. To complicate the issue further, one should consider that the success of Fox News can also be attributed to its customers: people. People choose what they want to hear, and many people prefer news viewed through a biased right-leaning lens.
Just like how the above poster wants to paint this issue as a crime against humanity, I think part of the problem is that on some level we all want to lie to ourselves. We want to view the world in a very specific and certain black and white way.
I admit confusion. Isn't unsolicited experimental psychological manipulation without consent just another word for most modern marketing?
I mean, don't get me wrong, I don't like it and try to exclude it from my life and my family's life, but there's pretty wide social acceptance for this type of behaviour.
I would have thought the beauty industry would be an old perpetrator that should generally be investigated. Going to shut that down?
and negatively instilling fear and distrust in a population without their consent for personal gain is about as old as politics itself?
isn't this practically mainstream media behaviour?
fomo? status anxiety? conspicuous consumption? I don't see why Facebook should be singled out for society-wide mandated and culturally supported practices.
Honestly that sounds like “this stuff dealer is bad because they make people sad while they wait for a dose; I quit, clean for 2d 6h 19min”. No doubt facebook is evil, corporate monster, etc, but hey what about stopping being a junkie. The problem is not someone experimenting with your unhealthy addiction, it is your unhealthy addiction.
At the early stages of the internet, most of the content was hidden, waiting for you to actively discover and bookmark it. Like you do with good places in your town - you find one, add it to your address book and visit occasionally. It was a slow process, full of findings, enjoyment and variety. Now everyone seems to sit at their mailbox, desperately waiting for another pack of junk mail to arrive. Facebook is just that - a postman who chooses from a variety of crap to push into your inbox. It doesn’t change lives unless people are too lazy to live by themselves.
Is it against the rules for me to use HN as a dating site? I'm going to pen-test it:
I enjoy cuddling, long walks on the beach, and services that do not run social experiments on me like something out of a cheap movie plot about a mad-scientist.
to contact this user please dial "jancsika" on your rotary phone now
> But the issue of consent also doesn't quite explain why we're comfortable with some types of uninformed research on us, but not others. Like almost every major tech firm, Facebook practices A/B testing
Are "we" actually comfortable with that practice? (And who is "we" in the first place? The general public? Tech people? Journalists?)
It seems to me, this is simply something the industry does because it can get away with it - and most users don't object because they don't even know it's happening.
Why tech journalists see the Facebook thing as objectionable but A/B tests without consent as perfectly fine might be a question worth discussing.
Facebook has indeed done much harm, to individuals and to society as a whole. But at the same time, their tenacity in continuing to make money is impressive. Villains too have some quaint evil power.
I don’t know anyone that uses Facebook anymore that likes it. Everyone I know who uses it says, “I’m thinking I should delete it soon”. Universally, the number one criticism is, “All I ever wanted to see are my friends’ posts and every update shows me less and less of those”.
Does anyone actually know people who avidly use and love Facebook? It seems like Facebook is like the Christian church, where everyone says they go every Sunday but it's really more like once a year at this point.
Same boat here. Wife deleted her account. I keep mine only to use the market (currently, the best place to sell bicycle parts locally). I'll probably delete my account once I do my next round of parts bin purging.
"News media deliberately make people outraged. This ought to be the final straw".
I don't have issues with the study in general. You do want to know whether and how you can influence people in a positive or negative way, especially if you want to avoid it. There's really no other way to find out than to study it. They should've gotten clear consent for participation in that study, but that's about it from my point of view.
"There was nothing wrong except for the lack of consent" is really rather missing the point... In other contexts that's the difference between acceptable behavior and a felony.
thatcat: https://psychology.wikia.org/wiki/Experimental_ethics
typon: [1] https://www.theguardian.com/technology/2014/jun/30/facebook-...
mhh__: Which would be correct, surely?
cryptoz: https://journals.sagepub.com/doi/10.1177/1747016115579535
mgraczyk: Is that something engineers and data scientists should be imprisoned over? I think most US politicians cause bigger effects every time they tweet.
xg15: Why is it my unhealthy addiction if a multi-billion dollar company uses all tools at their disposal to push me into said addiction?