This article criticizes Facebook for firing the human editors that had been keeping things sane. However, they were pushed to that by accusations of bias in the right-wing media, accusations that looked likely to lead to a Congressional investigation.
Now they are in a situation where they are damned if they do, damned if they don't. And people immersed in echo chambers will accuse them of bias no matter what.
But the entire system is fundamentally broken. Pay-per-ad incentives lead to rewarding viral content. And content that induces outrage is far more likely to go viral than pretty much anything else. Plus, it goes viral before people do pesky things like fact checks. And the more of this you have been exposed to, the more reasonable you find outrageous claims, even if you know that the ones you have seen were all wrong.
For an in-depth treatment of the underlying issues, I highly recommend Trust Me, I'm Lying.
"OMG, Trump has won through lies and deception! We failed to stop him. How on Earth did that happen? We must out-manipulate our opponents next time."
If you read between the lines, this is what the article condenses to.
The discussion here is mostly creepy groupthink shit.
Social networks fact-checking their content? What's next? Should AT&T stop the spread of misinformation over its phone lines? Should USPS fact-check your mail?
Facebook is not a real news source and is never going to be one. At best it's a communication medium. At worst it's a giant propaganda machine. Any moves to get it further away from the former and closer to the latter are just machinations to change who benefits from the propaganda, nothing else.
We don't need an "improved" Facebook. We need a working replacement for old-school newspapers, TV stations and radio channels. The "new media" eroded all of those, but failed (so far, at least) to provide anything of equal utility and value. Hence all the issues involved in the coverage of these elections.
The problem is the users, not the platform. If users propagate misinformation, the users propagate misinformation.
The easy solution people often jump to is fighting negatives with negatives, assuming it yields a positive. But often a positive approach is more effective. Offering incentives for good acts, not just disincentives for bad acts, is a fairly popular recent trend backed by research.
In this case I can't help but think that the best solution is to focus on education and the propagation of correct information, not censorship (or at least, something that smells like censorship) of bad information. If "new media" is a problem we should be fighting it closer to the source.
I don't know what Facebook's role in that would be, but ideally, as a platform, it would be minimal.
But ultimately, it's worth remembering that it's hard to build a good system with bad raw materials. If people are interested in falsehoods and echo chambers, their social media will reflect that.
The irony here is also the idea that all of this so-called propaganda comes from the right.
The left is constantly selecting its own set of "facts" and tactically leaves out whatever doesn't agree with its agenda.
Simply look at all the disingenuous pro-Clinton fact-checking throughout the election. Just look at the amount of rape, assault, race-baity bullshit that is circulated by liberals every day. These articles may not fly in the face of science, but they are usually equally misleading.
Is this a veiled attempt at trying to silence wikileaks? Infowars? There is a reason they are being called the regressive left. They want some articles to be banned from social media because they believe they aren't true. They believe, yet again, that people should not be allowed to make their own decisions in life.
What would be so bad about AT&T offering a service that banned fraudsters from calling you? Or if the USPS refused to deliver mail that said "URGENT TIME SENSITIVE" on the envelope unless the sender could prove it really was urgent?
As someone with elderly relatives who've fallen for scams over email (which major email providers already do block, automatically), I would celebrate both of those outcomes.
> "OMG, Trump has won through lies and deception! We failed to stop him. How on Earth did that happen? We must out-manipulate our opponents next time."
This is a ridiculous oversimplification of a complex and important set of issues.
The issue is the spread of facts vs misinformation, not liberalism vs conservatism, nor Democrats vs Republicans, nor Trump vs anti-Trump. Facts can work on both sides, as can misinformation. It's kind of fucked up to assume facts somehow only go one way.
> Social networks fact-checking their content? What's next? Should AT&T stop the spread of misinformation over its phone lines? Should USPS fact-check your mail?
Well, gee... Let's think carefully. Are your private phone calls on AT&T a public discourse? Is your mail? Do either AT&T or USPS signal boost some of your discussions during public transmittal? Is your analogy even logical?
Facebook is a communication medium, as you said. Moreover, it has a set of rules and policies governing what can be shared, which shared content gets shown to a given user, and how the content is further propagated and boosted. The entire discussion, which your post is completely sidestepping, is what content should or shouldn't be propagated and boosted.
One can make arguments for more rules or fewer on this, for different rules or keeping the same rules. But asserting that it is a non-issue or drawing false equivalences are non-arguments irrelevant to the discussion at hand.
Facebook will be less effective as a medium/content distributor if people become aware of the filtering that they do. I think they are shooting themselves in the foot if they start to actively censor messages; people will just move on to Twitter for politics and use Facebook for the personal stuff only.
trust the market, there is plenty of competition that puts things into their proper place. Facebook doesn't own their users either.
There is no actual problem: social networks are simply not the right media to be a news source, and they were never meant to be.
Moreover, to expect to be fed "correct" information all the time, with no effort on the consumer's part, is flat-out delusional. What we need, and what we have always needed, is to apply critical thinking.
Do you believe everything someone says? Well, then you have a problem.
I agree that the idea of "fact-checking" stories is troubling and is overall a bad idea.
You are not at all addressing what is actually in the article: the Newsfeed feature that surfaces the most shared stories. Facebook isn't discussing deleting homeopathy, Clinton body count, or Trump kompromat articles from our various Facebook walls.
They can fact-check whatever they want; it's their site. If it's stupid, people should just leave for greener pastures. It's not like in the beginning there was Facebook and we're bound to use it forever.
The problem is that people want to 1) trust blindly and 2) not be taken advantage of. You can only have one of those.
Such a comment betrays an incredibly unsophisticated understanding of the issues being discussed. Perhaps a revisit to McLuhan's maxim would be enlightening:
It's also on us to resist the temptation to build social media bubbles around ourselves, and to poke into each other's bubbles. Every time I've wanted to block a friend or relative on facebook, I've put my phone down and come back to it later. Looking away from people we disagree with isn't working.
My cousin shared a post this morning asking why people on the left aren't celebrating the fact that a female campaign manager helped put someone in the white house for the first time. I wasn't sure what to say as his friends piled on to say things like "Yeah, I thought they were for women's rights?!" Here's the response I finally came up with:
I'm not celebrating her "success" because I imagine my facebook and twitter feeds look a lot different than yours. My feeds are filled with first-hand stories from women around the country who are being more openly harassed than they were last week. It's happening often enough that it's really dismissive to say "Oh, those are just a few assholes." People are openly harassing women in the name of Trump.
I think when we act as consumers of social media we need to stop building our own bubbles, and reach out into others' bubbles. And when we help build social networks, we need to intentionally structure them in a way that maintains connections, rather than isolating individuals and groups.
When I knew people who supported Romney, I felt no need based on that to exclude them from my life or my social media circles.
One's ability to support Trump tells me much more about a person. There are too many things that are good and should be fundamental about a functional, enlightened society that one must reject in order to support Trump. Prejudice, fraud, bullying, and sexual harassment must all be accepted.
People who would accept these things are not welcome in my life. It's not because they're on the other side. It wasn't like this in 2008 or 2012. This time it goes deeper than that.
There was a comment, which just got deleted, wondering if people are just roaming around looking for liberals to hate on. The answer to that is yes. This was just posted on my feed [1]: driving around Wellesley (also where Clinton graduated from) taunting people.
I think it would be relatively easy to have an "auto-snopes" feature which detects a URL being shared and immediately attaches a post that says "this has been debunked by X"
For example - I've seen dozens of links to Michael Moore's trumpland speech which strategically ends with the phrase "America will elect Trump and it will feel great"
He concludes:
"Yes, on November 8, you Joe Blow, Steve Blow, Bob Blow, Billy Blow, all the Blows get to go and blow up the whole goddamn system because it's your right. Trump's election is going to be the biggest fuck ever recorded in human history and it will feel good."
EDIT - I feel like I should clarify I mean "auto-snopes" figuratively. Auto fair-balance might be a better descriptor. Something that links to an opposing opinion automatically, in cases of absolute falsehood the debunking, or even a CSS CLASS where there is a bright red "FALSE" wrapper.
Snopes doesn't have to be the automatic choice.
Breaking filter-bubbles & ensuring truth would be my goal on an idea like this. Not strictly "promoting liberal media"
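As a sketch of how such an auto-flagging feature might work: normalize each shared URL and look it up in a database of known debunked stories. Everything here (the domain, the path, the fact-check source) is invented for illustration; a real system would ingest fact-checkers' feeds rather than a hard-coded dict.

```python
from urllib.parse import urlparse

# Hypothetical debunk database mapping a normalized (domain, path) key
# to a verdict and a link to the debunking article.
DEBUNKED = {
    ("example-fake-news.com", "/pope-endorses-candidate"): {
        "verdict": "FALSE",
        "source": "https://factchecker.example/pope-endorsement",
    },
}

def normalize(url):
    """Reduce a shared URL to a lookup key, ignoring scheme, "www.",
    query strings, and trailing slashes so trivial variants still match."""
    parts = urlparse(url.lower())
    host = parts.netloc.removeprefix("www.")
    path = parts.path.rstrip("/")
    return (host, path)

def check_share(url):
    """Return a warning label to attach to the post, or None if unknown."""
    hit = DEBUNKED.get(normalize(url))
    if hit is None:
        return None
    return f"{hit['verdict']}: this has been debunked by {hit['source']}"

print(check_share("https://www.example-fake-news.com/pope-endorses-candidate/?utm=fb"))
# FALSE: this has been debunked by https://factchecker.example/pope-endorsement
```

The hard parts this glosses over are keeping the database current and handling mirrors that repost the same story under new URLs, which is where the matching problem gets genuinely difficult.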
I have personally asked an individual why they don't trust snopes.
Their claim is that it's a Democrat rag towel and shill.
I then ask, then what fact-checker would you trust?
Their response: None of them.
Congratulations. We've reached a point, for this individual, where "facts" as decided in the common forum are suspect and the only thing they trust is themselves (and whatever non-Mainstream Media they listen to). We have reached a point where the only truth is what they decide is the truth.
That is a non-trivial problem to resolve. If we can't even agree what basic facts are, there is no way to even have a discussion.
On the days leading up to the election, my feed was full of fake news stories. Every single one that looked fishy to me, Snopes had debunked. I posted a few replies, but I felt like it was a Sisyphean task of shoveling shit against the tide of misinformation.
Is there any hope that Facebook will do the right thing and put information ahead of profits? As a libertarian at heart, it pains me to say this, but I think the only way to fix this is government regulation. Of course it will never happen under Trump, but how else can we force them to be good curators with all the power they wield?
The problem is that the deeper fundamental philosophical conversations about truth, the meaning of truth, relativism, biases, etc. aren't being discussed in the tech community, let alone at FB/Twitter HQ.
Having a lil fact checker isn't as easy as adding a button to snopes.
There's bias in language itself and how even a sentence is structured. But again a deeply complex conversation nobody seems to be having in this community.
My hope is for a similar thing to exist in real-time in future political debates, where live fact-checking spits back at anyone who speaks a mis-truth, nipping B.S. in the bud.
Unfortunately, sites like snopes and politifact also succumb to the same type of bias as the right-wing sites. See this article where politifact was handed propaganda material from the Clinton foundation and parroted it without doing any actual checking about AIDS drugs the Clinton foundation was funding:
Everyone has an agenda, follow the money, and trust no one. Whether it's right-wing like Alex Jones or Left-wing like the Tampa Bay Times a.k.a politifact, you need to be suspicious and do your own research if you want the truth.
If Facebook wants to be a "source of truth" they're going to need to hire moderators outside of Silicon Valley that represent a wide array of values, traditions, and backgrounds, in a ratio that represents the actual population, and have them peer review each other without fear of repercussion from their employer. I think it's hard for the Silicon Valley types to surrender this type of control... For instance, would this article even be on HN if Hillary had won the election? If we're talking about neutrality, that's an important point to consider.
Actually, the lack of multiculturalism is true for quite a few Silicon Valley companies. Try to find a station that has 'Today's Metal' on Google Music; good luck. ...There are a couple hundred different indie channels to listen to that are very meticulously organized into genre, sub-genre, and sub-sub-genres. I'm not complaining, I simply use different products, but I think it points back to the source of where the lack of multiculturalism stems from.
You think the users are too stupid to fact check. But what makes you think the users care about the truth anyway?
More often than not they double down with "well it's emblematic of the greater problem" or something to that effect. They then look for new evidence to support their views. Like it or not, some sites are designed to build echo chambers and are not for general discourse.
A lot of people don't care. They believe an image on facebook that reaffirms their belief.
You try to counter with statistics from the FBI, DOJ, BLS, DOL, or any other organization, and the numbers are 'made up'. And studies have shown that presenting people with counterarguments can actually strengthen their original misperceptions.
Corrections also don't get much traction. I've seen many conspiracy theories pop up for a week, then die down, while the correction gets no traction at all.
>As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases.
When playing identity politics, agreeing with a blatant lie is expected when you need to affirm the narrative of your tribe. If you don't, you're a heretic and not a true believer.
If you do a root cause analysis of this election's result, much of it points back to Silicon Valley in one way or another.
Workforce automation created millions of unemployed people in places far away from economic hubs, with no hope of employment in their hometowns or re-skilling for the new roles, leaving them desperate. Proliferation of mobile apps lead to an explosion in use of social media. The wild popularity of this democratic social media encouraged a culture that allowed misinformation to flourish without a lot of consequences.
Bad actors took advantage of this, and reinforced distorted false narratives. Desperate people latched onto the messages presented to them, that gave them solutions repeatedly reinforced by social media, and many took an enormous risk - they voted against their best interests in order to solve their problems with the information they had available. And that is how we got here today.
>Workforce automation created millions of unemployed people in places far away from economic hubs, with no hope of employment in their hometowns or re-skilling for the new roles, leaving them desperate.
So did the Industrial Revolution.
We may mock the Luddites for breaking the Jacquard looms. But do you know what the punishment for frame-breaking was? Execution.
The Industrial Revolution made way for the tertiary-sector economy. When the tertiary-sector economy becomes redundant, what is next?
This is seriously hilarious. Now that he's won, against seemingly all odds, we have to find some other reason for why EVERYONE ELSE was completely WRONG. Every media outlet, everyone in politics, all the polls, completely shit on Trump. Now look who is the most powerful person in the entire world.
The Democrats, and the rest, did it to themselves. Overconfidence, smear campaigns, and all of the morons doing the polls.
We are all sick of politicians, and the people have spoken, let's just see what happens. Get over it already.
The polls weren't that far off, actually. They missed the mark overall by about 2%, which is pretty similar to the error in past elections. That's not very surprising, because without being able to sample the actual distribution you need to make some assumptions for the statistics to work.
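For intuition, the textbook margin of error for a simple random sample shows why a roughly 2-point miss is within normal range for a ~1,000-person poll. A minimal sketch (this assumes unbiased random sampling, which is exactly the assumption that breaks down when pollsters can't reach a representative slice of voters):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an observed proportion p from a simple
    random sample of size n, using the normal approximation."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of ~1,000 respondents with a candidate at 48%:
moe = margin_of_error(0.48, 1000)
print(f"+/- {moe * 100:.1f} points")  # +/- 3.1 points
```

So an aggregate error of about 2 points sits comfortably inside the uncertainty of any individual poll, before weighting and nonresponse effects are even considered.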
Another place where Facebook's failure to detect fakeness has proved costly is in their social graph.
A lot of sites moved away from comments platforms like Disqus to Facebook in the hope that the quality of discourse would improve and trolling would decrease. Instead, clicking on some of the most vehement commentators' names would invariably lead to suspiciously bare accounts (and often suspiciously fake names) with a handful of friends themselves, all with similar characteristics. Unfortunately the people who are influenced by this sort of thing are not tech savvy enough to do even this basic level of checking.
There is a sort of "uncanny valley" that a technically savvy and experienced person can detect when looking at a fake profile, and that I daresay Facebook's algorithms just can't.
Then there is also the problem that Facebook really doesn't care about blatantly fake accounts until they are reported. The sorts of people who are trapped in some filter bubbles are unlikely to be savvy enough to know how to report these profiles. (I've reported many dozens at least, ranging from community noticeboards to cupcake businesses, all pretending to be people and missed by FB's much-vaunted ML.)
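To illustrate the kind of signals involved, here is a toy heuristic combining the tells mentioned above (a handful of friends, a bare profile, inhuman posting volume). Every field name, weight, and threshold is invented; a real detector would learn these from labeled examples rather than hand-tuning:

```python
def fakeness_score(profile):
    """Combine simple profile signals into a 0..1 suspicion score."""
    score = 0.0
    if profile.get("friend_count", 0) < 10:
        score += 0.4   # only a handful of friends
    if profile.get("photo_count", 0) == 0:
        score += 0.3   # suspiciously bare account
    if profile.get("account_age_days", 0) < 30:
        score += 0.2   # freshly created
    if profile.get("posts_per_day", 0) > 50:
        score += 0.3   # posting at inhuman volume
    return min(score, 1.0)

suspect = {"friend_count": 3, "photo_count": 0,
           "account_age_days": 12, "posts_per_day": 80}
print(fakeness_score(suspect))  # 1.0 -> queue for human review
```

The catch, as the comment notes, is that accounts sitting in the "uncanny valley" score ambiguously on each individual signal, which is why hand-tuned rules like these are easy for a determined troll farm to game.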
Of course, there is a certain irony in using a throwaway account to discuss fake accounts, but without a patina of "realness", it triggers a greater level of skepticism, which is a good thing.
When the history of 2016 is written, a large part will be filter bubbles and trolls expertly manipulating huge swathes of electorates enabled by the hubris and greed of Social Media networks.
Except it isn't just the spread of misinformation. Even more important (IMHO) is the insulating effect of showing people only like-minded opinions, effectively trapping everyone in a bubble. I can't think of any way Facebook can solve this problem without drastically decreasing their reliance on ad revenue. What are they going to do, force people to view opinions they disagree with?
> Last week Buzzfeed reported on an entire cottage industry of web users in Macedonia generating fake news stories related to Trump vs Clinton in order to inject them into Facebook’s Newsfeed as a way to drive viral views and generate ad revenue from lucrative US eyeballs.
Does anyone else find this incredibly ironic? Buzzfeed ratting out others doing clickbait?
Not long ago I saw a post about the oil pipeline protests. It was a photo of a huge crowd, and stated that the media was not covering the protests properly and that people are taking a stand. The photo was of tens of thousands of people, and had over 230,000 shares. 230k....
A quick image lookup showed that it was in fact a photo from Woodstock 1969, but since the comments on the photo were restricted, nobody had been able to point it out.
I went to report it, but Facebook seems to have removed the "misinformation" option when reporting content (though there was one before?).
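The "quick image lookup" that caught the Woodstock photo typically rests on perceptual hashing: fingerprints that survive resizing and recompression. Below is a minimal average-hash ("aHash") sketch; a tiny grayscale grid stands in for a downscaled image, whereas real implementations (e.g. the imagehash library) operate on actual image files:

```python
def average_hash(pixels):
    """pixels: 2D list of grayscale values (a stand-in for a downscaled
    image). Bit i of the hash is set if pixel i is brighter than the mean."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, v in enumerate(flat) if v > mean)

def hamming(a, b):
    """Number of differing bits; a small distance means likely the same image."""
    return bin(a ^ b).count("1")

original = [[10, 200], [220, 30]]
recompressed = [[12, 198], [219, 28]]   # slightly altered copy
unrelated = [[200, 10], [35, 210]]

print(hamming(average_hash(original), average_hash(recompressed)))  # 0 -> match
print(hamming(average_hash(original), average_hash(unrelated)))     # 4 -> different
```

Because the hash depends only on brightness relative to the mean, a recompressed or lightly edited copy of the 1969 photo still lands at distance zero, which is what makes this kind of lookup cheap enough to run on every shared image.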
Trump's strategy worked brilliantly on social media. Here's the strategy in a nutshell:
1) Say something outrageous that news orgs will grab for a quick, clickbait story that will generate tons of ad revenue.
2) Let outraged people share via social media.
3) Benefit as some of the people seeing the content will not disagree and will take the candidacy seriously.
4) Win.
It doesn't matter what Facebook does with its trending section; the real value of Trump's strategy came from the way it exploits the basic sharing/newsfeed mechanism's intended behavior. Even today, many Trump opponents think they were helping by posting their outrage at every rude comment Trump made.
Let's be frank here: free speech and Facebook never go together. They want to present themselves as a respectable forum where prominent people like politicians can have their platform (same with Twitter). That needs some kind of filtering. And, as we saw with pretty much the entire American press, as soon as you start filtering and selecting, you start to induce gross biases and distortions of the truth. The truth only comes out in a clutter of contradicting opinions, engaging discussions, pieces of evidence, and leaked materials. Of course, that is an ugly mess not many people want to put up with.
I am not a Trump supporter, as far as his anti-Muslim stance is concerned. But I do support some of his economic stances, which are labeled "protectionist" by mainstream media. I may be wrong, as I am not an economics expert; also, his take on immigration is certainly a thing of significance.
A related note regarding Facebook here: AFAIK, Facebook's curators were very biased and were always removing anything that "mainstream vocal leftists" find objectionable, e.g. the shameless suppression of news related to people like Geert Wilders or Pamella Geller, who are not by any means right-wing fanatics. Geert Wilders is a staunch supporter of homosexuals. Just because he criticizes the barbaric ideology of Islam he is labeled a "right-wing nut" by leftists with covert/overt Islam-apologetic stances.
It is well known that the Saudi kingdom has large investments in mainstream US media; that's what Trump drew people's attention to. Who knows how much of Facebook is controlled by the Saudis and the likes. [1]
It's no surprise that mainstream media got the prediction about Trump wrong. I guess their predictions and poll results were in fact propaganda against Trump. I felt it that way, many people I know felt it, and I am sure many more people must also have felt it.
This is a tough position for FB. No matter what they do, half of the world will think the actions they take are wrong. Most humans are intelligent enough to know that stories like Hillary Clinton running a child sex ring (or whatever) are false. People choose to share these because they want to believe it or think it's funny. As they say in their mission statement, "Facebook's mission is to give people the power to share and make the world more open and connected." It is a commendable mission, but it's also a messy one. I think inevitably they will have to do something. Maybe some kind of "truth barometer" on stories that attain a certain volume of engagement. At least then they can say "look! we're trying" without taking away the people's right to troll. In this particular election, though, I doubt anything would have helped. The winner has been openly trolling for years; his direct actions set the standard for truthfulness much lower than ever before.
It seems abundantly clear that the problem isn't low-quality content in the feed. That's unavoidable. It's the very small trickle of high-quality content in the feed – content from actual people who want to contribute to Facebook. There isn't much of it. And Facebook wants to show a fresh feed every refresh.
The timing of this announcement does not seem to help assuage fears that facebook was using its platform to push an agenda. It was this accusation that initially forced them to switch from human editors to an algorithm for trending news. I suspect that we as a country are going to have a conversation regarding the nature of privately held nearly public spaces on the internet and what their obligation to us is, if any.
In corporate speak, "misinformation" is defined as "any information that doesn't support our narrative". Certainly it is the goal of Facebook, the government, and every other large media corporation to control the flow of information. You need look no further than Obama's recent statement that news needs to be "curated" by government-appointed gatekeepers, to filter out all that pesky information that doesn't jibe with government propaganda.
"As the Americans learned so painfully in Earth's final century, free flow of information is the only safeguard against tyranny. The once-chained people whose leaders at last lose their grip on information flow will soon burst with freedom and vitality, but the free nation gradually constricting its grip on public discourse has begun its rapid slide into despotism. Beware of he who would deny you access to information, for in his heart he dreams himself your master."
[+] [-] btilly|9 years ago|reply
See, for example, http://thehill.com/policy/technology/279361-top-republican-d....
Now they are in a situation where they are damned if they do, damned if they don't. And people immersed in echo chambers will accuse them of bias no matter what.
But the entire system is fundamentally broken. Pay per ad incentives lead to rewarding viral content. And content that induces outrage is far more likely to go viral than pretty much anything else. Plus it goes viral before people do pesky things like fact checks. And the more of this that you have been exposed to, the more reasonable you find outrageous claims. Even if you know that the ones that you have seen were all wrong.
For an in depth treatment of the underlying issues, I highly recommend Trust Me, I'm Lying.
[+] [-] colllectorof|9 years ago|reply
If you read between the lines, this is what the article condenses to.
The discussion here is mostly creepy groupthink shit.
Social networks fact-checking their content? What's next? Should AT&T stop the spread of misinformation over its phone lines? Should USPS fact-check your mail?
Facebook is not a real news source and never going to be one. At best it's a communication medium. At worst it's a giant propaganda machine. Any moves to get it further away from the former and closer to the latter are just machinations to change who benefits from the propaganda and nothing else.
We don't need an "improved" Facebook. We need a working replacement for old-school newspapers, TV stations and radio channels. The "new media" eroded all of those, but failed (so far, at least) to provide anything of equal utility and value. Hence all the issues involved in the coverage of these elections.
[+] [-] B-Con|9 years ago|reply
The problem is the users, not the platform. If users propagate misinformation, the users propagate misinformation.
The easy solution people often jump to is fighting negatives with negatives, assuming it yields a positive. But often a positive approach is more effective. Offering incentives for good acts, not just disincentives for bad acts, is a fairly popular recent trend backed by research.
In this case I can't help but think that the best solution is to focus on education and the propagation of correct information, not censorship (or at least, something that smells like censorship) of bad information. If "new media" is a problem we should be fighting it closer to the source.
I don't know what Facebook's role in that would be, but ideally, as a platform, it would be minimal.
But ultimately, it's worth remembering that it's hard to build a good system with bad raw materials. If people are interested in falsehoods and echo chambers, their social media will reflect that.
[+] [-] jklinger410|9 years ago|reply
The left is constantly selecting it's own set of "facts" and tactfully leaves out whatever doesn't agree with their agenda.
Simply look at all the disingenuous pro-Clinton fact checking throughout the election. Just look at the number of rape, assault, race-baity bullshit that is circulated by liberals every day. There articles may not fly in the face of science, but they are usually equally misleading.
Is this a veiled attempt at trying to silence wikileaks? Infowars? There is a reason they are being called the regressive left. They want some articles to be banned from social media because they believe they aren't true. They believe, yet again, that people should not be allowed to make their own decisions in life.
[+] [-] samsonasu|9 years ago|reply
As someone with elderly relatives who've fallen for scams over email (which major email providers already do block, automatically), I would celebrate both of those outcomes.
[+] [-] briholt|9 years ago|reply
[+] [-] erdevs|9 years ago|reply
This is a ridiculous oversimplification of a complex and important set of issues.
The issue is the spread of facts vs misinformation, not liberalism vs conservativism, nor Democrats vs Republicans, nor Trump vs anti-Trump. Facts can work on both sides, as can misinformation. It's kind of fucked up to assume facts somehow only go one way.
> Social networks fact-checking their content? What's next? Should AT&T stop the spread of misinformation over its phone lines? Should USPS fact-check your mail?
Well, gee... Let's think carefully. Are your private phone calls on AT&T a public discourse? Is your mail? Do either AT&T or USPS signal boost some of your discussions during public transmittal? Is your analogy even logical?
Facebook is a communication medium, as you said. Moreover, it has a set of rules and policies governing what can be shared, which shared content gets shown to a given user, and how the content is further propagated and boosted. The entire discussion, which your post is completely sidestepping, is what content should or shouldn't be propagated and boosted.
One can make arguments for more rules or fewer on this, for different rules or keeping the same rules. But asserting that it is a non-issue or drawing false equivalences are non-arguments irrelevant to the discussion at hand.
[+] [-] rtpg|9 years ago|reply
Imagine if USPS sent you things that were sent to you, but also random packages they thought were good for you.
Facebook, even just through its algorithm, is exercising some editorial control and distribution.
[+] [-] unknown|9 years ago|reply
[deleted]
[+] [-] MichaelMoser123|9 years ago|reply
trust the market, there is plenty of competition that puts things into their proper place. Facebook doesn't own their users either.
[+] [-] rocho|9 years ago|reply
There is no actual problem: social networks are simply not the right media to be a news source, and they were never meant to be.
Moreover, to expect to be fed "correct" information all the time, without no effort on the consumer's part, is flat out delusional. What we need, and what we have always needed, is to apply critical thinking.
Do you believe everything someone says? Well, then you have a problem.
[+] [-] linkregister|9 years ago|reply
You are not at all addressing what is actually in the article: the Newsfeed feature that surfaces the most shared stories. Facebook isn't discussing deleting homeopathy, Clinton body count, or Trump kompromat articles from our various Facebook walls.
[+] [-] lerpa|9 years ago|reply
The problem is people want to 1) trust blindly and 2) not be taken advantage of that. You can only have one of those.
[+] [-] lingben|9 years ago|reply
https://en.wikipedia.org/wiki/The_medium_is_the_message
https://www.youtube.com/watch?v=Ko6J9v1C9zE
https://www.youtube.com/watch?v=UoCrx0scCkM
[+] [-] erdevs|9 years ago|reply
[deleted]
[+] [-] unknown|9 years ago|reply
[deleted]
[+] [-] japhyr|9 years ago|reply
My cousin shared a post this morning asking why people on the left aren't celebrating the fact that a female campaign manager helped put someone in the white house for the first time. I wasn't sure what to say as his friends piled on to say things like "Yeah, I thought they were for women's rights?!" Here's the response I finally came up with:
I'm not celebrating her "success" because I imagine my Facebook and Twitter feeds look a lot different than yours. My feeds are filled with first-hand stories from women around the country who are being more openly harassed than they were last week. It's happening often enough that it's really dismissive to say "Oh, those are just a few assholes." People are openly harassing women in the name of Trump.
I think when we act as consumers of social media we need to stop building our own bubbles, and reach out into other's bubbles. And when we help build social networks, we need to intentionally structure them in a way that maintains connections, rather than isolating individuals and groups.
[+] [-] gdulli|9 years ago|reply
A person's willingness to support Trump tells me a great deal about them. There are too many things that are good and should be fundamental to a functional, enlightened society that one must reject in order to support Trump. Prejudice, fraud, bullying, and sexual harassment must all be accepted.
People who would accept these things are not welcome in my life. It's not because they're on the other side. It wasn't like this in 2008 or 2012. This time it goes deeper than that.
[+] [-] withdavidli|9 years ago|reply
[1] https://m.facebook.com/?_rdr#!/story.php?story_fbid=10210780...
[+] [-] killwhitey|9 years ago|reply
I made something to replicate that via Twitter Lists[3][4], but it seems like making one for Facebook is much more important.
[1] https://twitter.com/twitter/status/73833309163110400
[2] https://news.ycombinator.com/item?id=12117218
[3] https://otherside.site/
[4] https://gist.github.com/0x263b/7b391a1617fcbbabc57fb1e705884...
[+] [-] unknown|9 years ago|reply
[deleted]
[+] [-] alexc05|9 years ago|reply
For example - I've seen dozens of links to Michael Moore's trumpland speech which strategically ends with the phrase "America will elect Trump and it will feel great"
Sample: http://www.zerohedge.com/news/2016-10-25/michael-moore-trump...
but the truth is that is not what he CONCLUDED. He continues here: https://www.youtube.com/watch?v=sVLTQIUMq18&t=30
EDIT - I feel like I should clarify I mean "auto-snopes" figuratively. Auto fair-balance might be a better descriptor. Something that links to an opposing opinion automatically, or, in cases of absolute falsehood, to the debunking, or even a CSS class wrapping the post in a bright red "FALSE" banner. Snopes doesn't have to be the automatic choice.
Breaking filter-bubbles & ensuring truth would be my goal on an idea like this. Not strictly "promoting liberal media"
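To make the "auto fair-balance" idea concrete, here's a rough sketch of how an annotation layer might work. Everything below is hypothetical placeholder data, the lookup table, verdict labels, and URLs are invented for illustration, not a real fact-check API:

```python
# Hypothetical "auto fair-balance" sketch: given a shared link, look up known
# debunkings and return a banner (CSS class, label, opposing link) for the
# share widget. The FACT_CHECKS table below is invented sample data.

FACT_CHECKS = {
    "example.com/out-of-context-clip": {
        "verdict": "FALSE",
        "link": "https://example.com/full-context",  # the debunking / full context
    },
}

def annotate(url):
    """Return a banner dict if the URL matches a known check, else None."""
    for fragment, check in FACT_CHECKS.items():
        if fragment in url:
            return {
                "css_class": f"factcheck-{check['verdict'].lower()}",
                "label": check["verdict"],
                "link": check["link"],
            }
    return None

print(annotate("http://example.com/out-of-context-clip?share=1"))
```

The hard part, of course, isn't the lookup; it's who maintains the table and on what authority, which is the whole debate in this thread.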
[+] [-] Spellman|9 years ago|reply
Their claim is that it's a Democrat rag and a shill.
I then ask, then what fact-checker would you trust?
Their response: None of them.
Congratulations. We've reached a point, for this individual, where "facts" as decided in the common forum are suspect and the only thing they trust is themselves (and whatever non-Mainstream Media they listen to). We have reached a point where the only truth is what they decide is the truth.
That is a non-trivial problem to resolve. If we can't even agree what basic facts are, there is no way to even have a discussion.
[+] [-] keithnz|9 years ago|reply
so you might get a panel with
SNOPES : FALSE NYT: FALSE HUFF: MIXED DAVID AVOCADO WOLFE: QUANTUM CRYSTAL HARMONIZED
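A panel like that is trivial to render once you have the verdicts; the hard part is sourcing them. A minimal sketch, using the (joke) sample verdicts above as hypothetical input:

```python
# Toy rendering of a multi-checker verdict panel. The checker names and
# verdicts are the sample data from the comment above, not real API results.

verdicts = {
    "SNOPES": "FALSE",
    "NYT": "FALSE",
    "HUFF": "MIXED",
    "DAVID AVOCADO WOLFE": "QUANTUM CRYSTAL HARMONIZED",
}

def render_panel(verdicts):
    """One aligned line per checker, so readers can compare sources at a glance."""
    width = max(len(name) for name in verdicts)
    return "\n".join(f"{name.ljust(width)} : {v}" for name, v in verdicts.items())

print(render_panel(verdicts))
```

Showing several checkers side by side at least sidesteps the "which single fact-checker do you trust?" objection raised elsewhere in this thread.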
[+] [-] lbenes|9 years ago|reply
Is there any hope that Facebook will do the right thing and put information ahead of profits? As a libertarian at heart, it pains me to say this, but I think the only way to fix this is government regulation. Of course it will never happen under Trump, but how else can we force them to be good curators with all the power they wield?
[+] [-] debt|9 years ago|reply
Having a lil fact checker isn't as easy as adding a button to snopes.
There's bias in language itself and in how even a sentence is structured. But again, that's a deeply complex conversation nobody seems to be having in this community.
[+] [-] RankingMember|9 years ago|reply
My hope is for a similar thing to exist in real-time in future political debates, where live fact-checking spits back at anyone who speaks a mis-truth, nipping B.S. in the bud.
[+] [-] 234dd57d2c8db|9 years ago|reply
http://www.politifactbias.com/2016/11/the-daily-caller-polit...
The problem is a lack of education by the consumer. I suggest everyone read this book as an intro on the topic:
https://www.amazon.com/How-Lie-Statistics-Darrell-Huff/dp/03...
Everyone has an agenda, follow the money, and trust no one. Whether it's right-wing like Alex Jones or Left-wing like the Tampa Bay Times a.k.a politifact, you need to be suspicious and do your own research if you want the truth.
[+] [-] exabrial|9 years ago|reply
Actually, the lack of multiculturalism is true for quite a few Silicon Valley companies. Try to find a station that has 'Today's Metal' on Google Music; good luck. Meanwhile there are a couple hundred different indie channels to listen to, meticulously organized into genre, sub-genre, and sub-sub-genres. I'm not complaining, I simply use different products, but I think it points back to the source of where the lack of multiculturalism stems from.
[+] [-] trynumber9|9 years ago|reply
More often than not they double down with "well it's emblematic of the greater problem" or something to that effect. They then look for new evidence to support their views. Like it or not, some sites are designed to build echo chambers and are not for general discourse.
[+] [-] faet|9 years ago|reply
You try to counter with statistics from the FBI, DOJ, BLS, DOL, or any other organization, and the numbers are 'made up'. And studies have shown that presenting counter-arguments can actually strengthen the original misperceptions.
Corrections also don't get much traction. I've seen many conspiracy theories pop up for a week, then die down, while the correction gets no traction at all.
>As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases.
http://www.dartmouth.edu/~nyhan/nyhan-reifler.pdf
People don't want news, they want to hear what they already 'know'.
[+] [-] travmatt|9 years ago|reply
[+] [-] rwhitman|9 years ago|reply
Workforce automation created millions of unemployed people in places far away from economic hubs, with no hope of employment in their hometowns or of re-skilling for the new roles, leaving them desperate. The proliferation of mobile apps led to an explosion in the use of social media. The wild popularity of this democratic social media encouraged a culture that allowed misinformation to flourish without many consequences.
Bad actors took advantage of this, and reinforced distorted false narratives. Desperate people latched onto the messages presented to them, that gave them solutions repeatedly reinforced by social media, and many took an enormous risk - they voted against their best interests in order to solve their problems with the information they had available. And that is how we got here today.
[+] [-] razakel|9 years ago|reply
So did the Industrial Revolution.
We may mock the Luddites for breaking the Jacquard looms. But do you know what the punishment for frame-breaking was? Execution.
The Industrial Revolution made way for the tertiary-sector economy. When the tertiary-sector economy becomes redundant, what is next?
There are no politicians talking about this.
[+] [-] cjjuice|9 years ago|reply
[+] [-] overcast|9 years ago|reply
The Democrats, and the rest, did it to themselves. Overconfidence, smear campaigns, and all of the morons doing the polls.
We are all sick of politicians, and the people have spoken, let's just see what happens. Get over it already.
[+] [-] tdb7893|9 years ago|reply
[+] [-] lightbyte|9 years ago|reply
>and the people have spoken
Trump lost the popular vote, so this isn't true. He got about 25% of the total voting population overall.
[+] [-] gr_thrwy|9 years ago|reply
A lot of sites moved away from comments platforms like Disqus to Facebook in the hope that the quality of discourse would improve and trolling would decrease. Instead, clicking on some of the most vehement commentators' names would invariably lead to suspiciously bare accounts (and often suspiciously fake names) with a handful of friends themselves, all with similar characteristics. Unfortunately the people who are influenced by this sort of thing are not tech savvy enough to do even this basic level of checking.
There is a sort of "uncanny valley" that a technically savvy and experienced person can detect when looking at a fake profile, that I daresay Facebook's algorithms just can't.
Then there is also the problem that Facebook really doesn't care about blatantly fake accounts until they are reported. The sorts of people who are trapped in some filter bubbles are unlikely to be savvy enough to know how to report these profiles. (I've reported many, dozens at least, ranging from community noticeboards to cupcake businesses, all pretending to be people and missed by FB's much-vaunted ML.)
Of course, there is a certain irony in using a throwaway account to discuss fake accounts, but without a patina of "realness", it triggers a greater level of skepticism, which is a good thing.
When the history of 2016 is written, a large part will be filter bubbles and trolls expertly manipulating huge swathes of electorates enabled by the hubris and greed of Social Media networks.
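The manual "uncanny valley" checks described above (bare friend list, fresh account, suspicious posting volume) can be sketched as a simple heuristic score. The signal names, weights, and thresholds below are invented for illustration; a real detector would need far more than this:

```python
# Rough sketch of the fake-profile checks a savvy reader does by hand:
# sum a few weak signals into a suspicion score. Fields, weights, and
# thresholds are hypothetical, not Facebook's actual heuristics.

def suspicion_score(profile):
    """Higher score means more likely fake. Purely heuristic."""
    score = 0
    if profile.get("friend_count", 0) < 10:
        score += 2   # suspiciously bare friend list
    if profile.get("account_age_days", 0) < 90:
        score += 2   # freshly created account
    if not profile.get("original_photos"):
        score += 1   # no candid photos, only shared content
    if profile.get("posts_per_day", 0) > 50:
        score += 3   # posting volume beyond a normal user
    return score

troll = {"friend_count": 3, "account_age_days": 20,
         "original_photos": False, "posts_per_day": 80}
print(suspicion_score(troll))  # 8
```

The catch, as the comment notes, is that each signal alone is weak and adversaries adapt, which is exactly why hand-tuned rules like these keep losing to profile farms.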
[+] [-] stevendhansen|9 years ago|reply
[+] [-] masmullin|9 years ago|reply
Does anyone else find this incredibly ironic? Buzzfeed ratting out others doing clickbait?
[+] [-] reustle|9 years ago|reply
A quick image lookup showed that it was in fact a photo from Woodstock 1969, but since the comments on the photo were restricted, nobody had been able to point it out.
I went to report it, but Facebook seems to have removed the "misinformation" option when reporting content (I thought there was one before?).
[+] [-] grandalf|9 years ago|reply
1) Say something outrageous that news orgs will grab for a quick, clickbait story that will generate tons of ad revenue.
2) Let outraged people share via social media.
3) Benefit as some of the people seeing the content will not disagree and will take the candidacy seriously.
4) Win.
It doesn't matter what Facebook does with its trending section; the real value of Trump's strategy came from the way it exploited the intended behavior of the basic sharing/newsfeed mechanism. Even today, many Trump opponents think they were helping by posting their outrage at every rude comment Trump made.
[+] [-] Kenji|9 years ago|reply
[+] [-] tmptmp|9 years ago|reply
A related note regarding Facebook here: AFAIK, Facebook's curators were very biased and were always removing anything that "mainstream vocal leftists" find objectionable, e.g. the shameless suppression of news related to people like Geert Wilders or Pamella Geller, who are not by any means right-wing fanatics. Geert Wilders is a staunch supporter of homosexuals. Just because he criticizes the barbaric ideology of Islam, he is labeled a "right-wing nut" by leftists with covert/overt Islam-apologetic stances.
It is well known that the Saudi kingdom has large investments in mainstream US media; that's what Trump drew people's attention to. Who knows how much of Facebook is controlled by the Saudis and the like. [1]
It's no surprise that mainstream media got the prediction about Trump wrong. I guess their predictions and poll results were in fact propaganda against Trump. I felt that way, many people I know felt that way, and I am sure many more must have as well.
[1] https://www.youtube.com/watch?v=Ex9ldUHSgjs
[+] [-] brentm|9 years ago|reply
[+] [-] malchow|9 years ago|reply
[+] [-] thrden|9 years ago|reply
[+] [-] StanislavPetrov|9 years ago|reply
[+] [-] guycook|9 years ago|reply