top | item 24474343

A whistleblower says Facebook ignored global political manipulation

434 points | contemporary343 | 5 years ago | buzzfeednews.com | reply

200 comments

[+] cabaalis|5 years ago|reply
None of this would be important if social media gave us what they originally sold us: See updates from your friends, family, and people you want to see updates from, in chronological order, rather than based upon weird engagement algorithms and privacy-destroying ad networks.
[+] Consultant32452|5 years ago|reply
Why would they give up control of the world by doing something silly like that? Think about how much political influence Twitter has based solely on which tweets they show the President and the corporate press. Consider how much in untraceable in-kind donations these companies can make by tweaking which news stories you see. The crazy thing is that these things can be tweaked by humans, but it's largely controlled by AI now, and no one person completely understands what's happening in any of these systems. We're in the early stages of AI controlling the global political future, and it will tend to create whatever kind of future generates the most clicks. It's kind of like the game Universal Paperclips, except with clicks/rage/ads.
[+] toeget|5 years ago|reply
This is exactly what I'm getting from Twitter. My feed is the latest updates shared by the people I follow. I don't see clickbait, I don't see outrage; I have a curated feed aligned with my interests. It's true that about once every couple of months Twitter decides to switch the order of tweets from latest to most recommended, but that's easily fixed in two clicks. The moment Twitter removes the option to see the feed in chronological order is the moment I delete Twitter.
[+] sriku|5 years ago|reply
On a slight tangent ... the one thing about HN that lets me breathe is that the links that turn up are the same for everyone on the planet and are not "personalized". If they were, I'd be gone in a jiffy.

Maybe we should rename "personalization" to something with a negative connotation - perhaps "bubblification"? "narcissization"? "comfortzoned"?

[+] RandallBrown|5 years ago|reply
Facebook had this but everyone decided to friend every person they ever came into contact with, leading to an unmanageable stream of nonsense. Then they introduced the algorithmic feed and it became more manageable.

I personally was fine curating my feed and only friending people I wanted to follow, but that's just not how it came to work socially and culturally.

[+] save_ferris|5 years ago|reply
If only Facebook chose to make less money...

Seriously though, their choices to run the platform the way they have were fundamentally shaped by profit and the stock market. The type of corporate moderation you’re suggesting doesn’t exist.

[+] hash872|5 years ago|reply
What I've long wondered is: would it really be that hard/expensive to build an open source social media alternative that does exactly that? Updates & photos from friends & family, in chronological order, and little else. Social media has been around for a while now; I have to imagine that most of the hard problems around a customized feed and so on have been solved. Probably some idealistic ex-FB and IG engineers would join on, so we'd have their domain expertise. I bet some prominent, wealthy anti-FB types could kick in some seed money to get it off the ground. It could be set up as a non-profit, B corp or foundation of some sort. You could run non-targeted display ads for brand advertising to help cover costs, with the added lure for advertisers that the site would be brand-friendly because it's non-controversial.

Of course it wouldn't have sophisticated features like photo tagging and such, and probably wouldn't be Hip And Cool for Gen Z, but it could be a functional bare-bones Facebook replacement. You'd probably have to disable virality features, and maybe linking to external news sites, just to prevent your racist uncle from posting Breitbart links, I don't know.

Would that really be so hard? Or do the servers, hosting, security and moderation costs just scale exponentially after some threshold of, say, 10 million users or what have you? Supposedly Instagram was running with a very small team when Facebook acquired it.

[+] actuator|5 years ago|reply
I think recommended order has its own place. If I go to a social media platform after a week away and my friends have posted some 200 posts since then, will I be interested in every post equally? Isn't it better to give me what I will probably like at the top of the list?
[+] ChrisLomont|5 years ago|reply
>See updates from your friends, family, and people you want to see updates from

Those same people are often the ones sending misinformation and creating the problems. How do you allow people to post updates and not allow them to spread misinformation that damages society?

[+] throwaway_pdp09|5 years ago|reply
If people were willing to pay, "privacy-destroying ad networks" which need to be fed by "weird engagement algorithms" wouldn't be necessary, but people demonstrably won't. Ergo...
[+] skybrian|5 years ago|reply
Blaming viral content on bad algorithms is naive. All that’s needed for fake content to spread everywhere unchecked are a few bad actors, group messages, and forwarding (the reshare button). In some cases this results in genocide [1]. No fancy algorithms are necessary to get exponential spread of rumors. Friends and family will spread any memes that confirm their biases themselves.

To prevent this from happening, it has to be actively suppressed, or at least there needs to be something slowing it down so it dies off. A hands-off attitude isn’t going to do it.

[1] https://gizmodo.com/facebook-still-working-on-the-whole-geno...

[+] stiglitz|5 years ago|reply
They didn’t sell it to you, is the thing.
[+] bostik|5 years ago|reply
I am surprised this segment (admittedly picked from Ars's secondary writeup) hasn't made a splash:

"It's why I've seen priorities of escalations shoot up when others start threatening to go to the press, and why I was informed by a leader in my organization that my civic work was not impactful under the rationale that if the problems were meaningful they would have attracted attention, became a press fire, and convinced the company to devote more attention to the space," Zhang wrote.

That is a damage control role. Perhaps more tellingly, it highlights the entire organisation's priorities: if it isn't drawing press attention, ignore it. Of course that's not the phrase FB would use in a press release. They'd deploy a convenient euphemism, such as "dedicate the resources elsewhere".

[+] jacquesm|5 years ago|reply
Facebook is rotten from the core because it is rotten at the head. It is a very sad state of affairs that the one company that would have been able to be a force for good in all this ended up being run by someone who is so morally disconnected.

At this point in time it will take an act of God to fix it, the lock-in is very strong and their ability to buy up anything that even begins to compete with them serves to cement that lock-in to the point where I don't see a way of ever dislodging it.

[+] catsarebetter|5 years ago|reply
I do think that's a poor way to handle the situation. Playing Devil's Advocate here, though: is there a better way for that organization, given its fragile political position and its immense power in the world, to handle situations like that? The bandwidth required to moderate and manage every piece of data responsibly is so vast that there has to be some way of ranking content and trends and allocating resources to them.

It's not morally sound to have an organization optimize for public image, but they are THE public image platform. Optimizing for social good would be much better, but that's really hard to track and quantify, and it can be so divisive on some topics. Especially now that trends marked "social good" are so quickly developed and redeveloped as other things.

[+] Thorrez|5 years ago|reply
Wow, that creates a really weird incentive for her. She'll get paid more and promoted if she secretly becomes a whistleblower.
[+] pjc50|5 years ago|reply
Pretty much every organisation behaves like that for things that don't threaten the organisation itself. After all, if someone gets killed by a Facebook mob, all that happens to Facebook is the user engagement numbers go down by one.
[+] aahortwwy|5 years ago|reply
The pattern of hiring young, passionate, ambitious workers, then telling them their job is of critical importance to the company (and, in this case, society at large) while simultaneously underfunding their team and providing them with completely inadequate leadership is REALLY common in Silicon Valley companies. These same companies will actively stigmatize saying "it's not my job," and so you have very green employees who end up doing work that's wildly outside their zones of competence and comfort, internalizing all the stress that builds up along with being put in that position and not even understanding that speaking up is an option.

Many of these people lack the experience required to see the forest for the trees and they draw similar conclusions to the ones in this memo. "There's no bad intent, we're just overworked and underresourced" (paraphrased) is something I've heard time and time again from people working on supposedly important problems at companies making money hand over fist.

[+] catsarebetter|5 years ago|reply
Companies do this a lot with college grads: they sell them on a vision of high impact and an important role in order to get them into the hiring funnel. It's not so much an operational failure as a sleazy marketing tactic that preys on the lack of information among young, ambitious people. Though it also results in an operational failure, and a terrible waste of young talent.
[+] an_opabinia|5 years ago|reply
Once someone has been anointed a "whistleblower," it is a bad look for you to try to play devil's advocate to whatever she's saying.

Stepping back, without a media circus, how really do you expect to change anything at organizations this large and powerful? Facebook transcends governments dude, Mark Zuckerberg has an unfathomable amount of money and power. At least give her some credit for putting out a non-conformist opinion.

Also, with regards to your specific points, everyone is qualified to determine political bots are bad. I can't believe you're going with the, "Well she is missing the nuance oh and she gets paid a lot of money so there!" take here.

[+] paulcole|5 years ago|reply
Don’t forget about getting them hooked on the money and perks.
[+] catsarebetter|5 years ago|reply
This is pretty frustrating. She clearly said she wanted her privacy respected, and they even acknowledge that in the article. So why did they publish her full name and a short description of her LinkedIn, making it even easier to find her? What motivation did they have to do this?

But they hid the name of the software engineer who spoke to her credibility? Something seems a little off, either on the source's side or on the distributor's side.

[+] nerdponx|5 years ago|reply
Maybe the same reason why a journalist tried to dox Slate Star Codex. Which is to say, who knows but it probably isn't good.
[+] CGamesPlay|5 years ago|reply
It sounds like she published an internal memo, somebody else leaked it, and the personal information was a fourth party's.
[+] 37ruudueuej|5 years ago|reply
The simplest answer is that journalism relies heavily on credibility and BuzzFeed's brand has almost none of that as far as the public is concerned. If this was the WaPo or NYT they could probably get away with the usual "We'd tell you but that would be unprofessional."
[+] brundolf|5 years ago|reply
> Still, she did not believe that the failures she observed during her two and a half years at the company were the result of bad intent by Facebook’s employees or leadership. It was a lack of resources, Zhang wrote, and the company’s tendency to focus on global activity that posed public relations risks, as opposed to electoral or civic harm.

> “Facebook projects an image of strength and competence to the outside world that can lend itself to such theories, but the reality is that many of our actions are slapdash and haphazard accidents,” she wrote.

> “We simply didn’t care enough to stop them”

This is the key takeaway, IMO. Not as an excuse for Facebook, but as an indictment of "slapdash" information technology in general, particularly social media. It's becoming more and more clear that "bringing the world closer together" is a Pandora's box, one that Facebook is not equipped (motivated?) to deal with the consequences of. Maybe no company ever could be. Maybe this is simply a thing that shouldn't exist.

[+] strangeloops85|5 years ago|reply
“I have personally made decisions that affected national presidents without oversight, and taken action to enforce against so many prominent politicians globally that I’ve lost count,” she wrote.

The scale on which the platform is being used for political manipulation in every country is enormous, and if a junior data scientist is having to make these decisions independently, it's clear there's little interest in dealing with this proactively.

[+] ummonk|5 years ago|reply
What is the public interest in publishing her name after she has expressed concerns about her safety? Shame on Buzzfeed.

"In her post, Zhang said she did not want it to go public for fear of disrupting Facebook’s efforts to prevent problems around the upcoming 2020 US presidential election, and due to concerns about her own safety. BuzzFeed News is publishing parts of her memo that are clearly in the public interest."

[+] komali2|5 years ago|reply
She's not exactly going to great lengths to hide her identity; I'm watching this blow up on Twitter, where she's being named by her full name.
[+] reaperducer|5 years ago|reply
It’s journalism 101. You provide the identity of your source to help the reader evaluate his credibility.

Anonymous sources are supposed to be used only in extreme circumstances. But these days that gets abused all the time.

The New York Times has published its rules for making a source anonymous, and they’re pretty good, IMO.

[+] LatteLazy|5 years ago|reply
I get frustrated over these pairs.

What are Facebook supposed to do? They could spend billions moderating every comment and like, but they'd piss off every politician worldwide and the users would all cry censorship (and that's if they got it perfectly right). They could pick a side, but the same would apply, with slightly fewer people pissed off. Or they could do nothing, save billions, and piss off the fewest people.

And in the background, a small number of people continue to manipulate everything you see in legacy media, and no one really cares because we're used to it. Seriously. What the fuck?

[+] evolve2k|5 years ago|reply
I can see it in my least tech-savvy, least educated friends. As Facebook users they seem to be radicalising the longer I leave them to their devices.

But what’s the alternative?

If people want family & friends social media, where to go?

Aren’t the open/alternative platforms just as open to abuse, if not more so, since no one like this whistleblower is even hired when it comes to open platforms?

[+] sibeliuss|5 years ago|reply
The alternative is introducing some regulation around patterns of usage.
[+] bryan_w|5 years ago|reply
What's scary is that with all the resources that FB has, it still has to prioritize enforcement, which means that platforms like reddit or even HN have no chance of catching this.
[+] carabiner|5 years ago|reply
Missing from the article is any causal link between Facebook bot farms and real-world effects: election outcomes or deaths. It just says, oh, there were a million fake likes on a post in this country... and months later, some political unrest. As if this never happened before Facebook?
[+] aww_dang|5 years ago|reply
Bots and fake accounts are just another form of advertising. Governments and political parties have always manipulated, influenced or controlled legacy media. Online and offline, politicians disseminate misleading political ads. Partisan news networks attack their opponents all day long while claiming to be objective.

On the surface the outrage seems misplaced. This seems like business as usual.

Perhaps the outrage isn't misplaced if the goal is regulatory capture and entrenchment of the social media space. Imagine a world where "fact-checking" and identity verification are mandated by regulators as a prerequisite to posting online. This wave of censorship will be buoyed by a tide of righteous indignation.

https://www.smh.com.au/business/companies/governments-are-go...

[+] 3gg|5 years ago|reply
Something I have always failed to understand is why there are people who still work for this company. She states “I know that I have blood on my hands by now”; doesn't everyone who works there? At this point, it is well known by everyone that this is a product flawed to the core. It is maintained by a company that insists it is not a media company in order to evade all social responsibility, and that insists its AI will solve the unsolvable problem of moderation at scale. Ethical alternatives in the form of federated social networks already exist. Why do people still work there? Do they not care?
[+] luckylion|5 years ago|reply
“One of the big tools of authoritarian regimes is to humiliate the opposition in the mind of the public so that they're not viewed as a credible or legitimate alternative,” she told BuzzFeed News. “There's a chilling effect. Why would I post something if I know that I'm going to deal with thousands or hundreds of these comments, that I'm going to be targeted?”

That's not just a tool for authoritarian regimes, that's pretty much the most used tool in any form of political conflict, in any country.

[+] babesh|5 years ago|reply
It’s weird watching Neal Stephenson novels come to life: miasma, apm, corporate-states, virtual worlds, mind viruses.
[+] Animats|5 years ago|reply
So what happened to Facebook's "real names" policy? If they got serious about that, fake accounts would be less of a problem.
[+] rodonn|5 years ago|reply
It's still a policy and they try hard to enforce it, but despite taking down literally billions of fake accounts each year, it is hard to stop 100% of it.
[+] clomond|5 years ago|reply
Is it just me, or does it seem like, with both social media and “tech” in general, the ‘regulation axe’ is grinding? It seems only a matter of time before the algorithms at the core of these companies’ business models ‘suffer’ from likely blunt, harsh regulatory instruments that will broadly stop this kind of influence and manipulation.

By doing so, it will also significantly harm these business models (and valuations) as we know it today.

[+] ManlyBread|5 years ago|reply
The article claims that these kinds of manipulation were reported by international news, but this is the first time I've ever heard of any of the examples it lists, which leads me to believe they don't really have that much power.
[+] 1vuio0pswjnm7|5 years ago|reply
Love the use of the term "inauthentic". They cannot say "fake" anymore.
[+] tareqak|5 years ago|reply
The only way anything will happen to Facebook is if these three things actually happen in sequence and within a short period of time of the first event occurring.

1) Facebook wittingly or unwittingly ignores political manipulation on its platform within the United States of America that demonstrably affects US political outcomes.

2) All necessary parts of the US government required to hold a corporation like Facebook accountable for 1) act in concert to do so.

3) The US mainstream media extensively reports on 1) and 2).