Given the history of propaganda and how we got into the Iraq war, this is an area of real concern for me.
On two different occasions I've noticed similar behavior with a "trending" news topic that was not trending anywhere else.
Both cases that caught my eye, surprisingly, had to do with vilification of Iran. The first time, I assumed it was some organized PR effort, or perhaps it was trending naturally.
However, the second time (May 9th) [0] it coincided with Gizmodo's article coming out.
That prompted me to search all the major news sites, reasoning that if the story was on some major news site, it would then circulate within FB, which would explain why it was trending.
So I decided to document this. I checked CNN, Huffington Post, NYTimes, Washington Post... no mention of an "Iranian missile test," but there was a small blurb at the bottom of the Fox News site [1]. Not enough exposure to generate the kind of volume needed for a top-3 trending story.
Facebook says: "The list of Trending Topics is then personalized for each user via an algorithm that relies on a number of factors, including the importance of the topic, Pages a person has liked, location (e.g., home state sports news), feedback provided by the user about previous Trending Topics and what’s trending across Facebook overall. Not everyone sees the same topics at the same time." (My italics.)
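Taken literally, that description is just a weighted ranking over per-user and global signals. Here is a minimal sketch of what such a scorer could look like; every weight, field name, and function below is invented for illustration and is not Facebook's actual code:

```python
# Hypothetical sketch of the personalized ranking Facebook describes.
# All weights and field names are invented for illustration.

def trending_score(topic, user):
    """Combine global and per-user signals into one personalization score."""
    liked_overlap = len(set(topic["pages"]) & set(user["liked_pages"]))
    local_bonus = 1.0 if topic["region"] == user["region"] else 0.0
    # feedback in [-1, 1] from past "hide"/"follow" actions on trending topics
    feedback = user["topic_feedback"].get(topic["name"], 0.0)
    return (2.0 * topic["importance"]   # editorial importance of the topic
            + 1.5 * topic["volume"]     # what's trending across Facebook overall
            + 1.0 * liked_overlap       # Pages the person has liked
            + 0.5 * local_bonus         # e.g. home state sports news
            + 1.0 * feedback)           # feedback about previous Trending Topics

def personalized_trending(topics, user, k=3):
    """Rank candidates for one user; different users see different lists."""
    return sorted(topics, key=lambda t: trending_score(t, user), reverse=True)[:k]
```

Under this kind of scheme, "not everyone sees the same topics" follows directly: the liked-Pages, location, and feedback terms differ per user even when the global terms are identical.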
If you're interested in Iran, you will probably see stuff about Iran that hardly anyone else sees.
I have zero interest in Iran and have never seen anything about it.
Whether a story trends isn't contingent on it being placed on the front page of major news sites. Most people don't get their news that way. A story is more likely to trend because someone notable talked about it or linked to it.
Are you saying news about Iran's military actions is not important to the US and its allies?
Perhaps users on FB find it more interesting and they surface it. Why does news on FB have to mimic the mainstream media?
Many times we have to read the stories about the US as reported from other countries because the MSM misses them or decides they aren't important enough.
> Given the history of propaganda and how we got into the Iraq war, this is an area of real concern for me.
This is how the news media has operated in the US for almost all of its existence. I wouldn't be too concerned about it; just take news you read with a grain of salt.
I've said much about this topic, as you can see in my comments. I've since thought about what should be done and I've come to a simple conclusion: Facebook is Mark Zuckerberg's business and enterprise. He can do with it what he wants, just as any business or enterprise has the right to conduct their own business. If he wants to use it to push a globalist and left agenda, by all means he has the right to do that.
It's up to us, the consumer, to vote with our feet. I haven't used Facebook in years, and I'm certainly glad not to be using it now that this has come to light, as I disagree with its agenda.
Does anyone have an alternative argument as to why Facebook DOES NOT have the RIGHT to suppress and promote information based on its own agenda?
Sure, FB can push any agenda they want. The issue is the deception. The product is marketed as organic and representative, and a number of people are indicating this is not actually the case. I'm not really familiar with current law around this type of behavior, so I have no idea about its legality.
What I would compare it to though are other deceptive business marketing practices like photoshopping before and after pics for weight-loss products, or marketing your dog food as premium, all natural meat based when it is in fact the same junk, bought from the same supplier, as another dog food brand that costs half as much (and is specifically called out as inferior in commercials).
At the least, even if not criminal, it would seem to open the door to civil lawsuits.
Facebook dropped algorithmic news curation "after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users’ feeds".
Its current "agenda" is based on reputable news sources, plus Fox News (1):
"We measure this by checking if it is leading at least 5 of the following 10 news websites: BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, Washington Post, Yahoo News or Yahoo."
I'd say that was a fair reflection of mainstream media.
If you object to those, what's your personal agenda based on, and how mainstream is it?
(1) Fox News has specifically said in court that it is not required to be truthful.
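The 5-of-10 rule quoted above is mechanical enough to sketch in code. This is a hypothetical illustration only; how Facebook actually determines which story is "leading" on a given site isn't public, so here it is mocked as a simple lookup of each site's current lead story:

```python
# Sketch of the quoted importance check: a topic counts as major news if it
# is leading on at least 5 of the 10 listed sites. The lookup structure is
# invented; "leading" is mocked as each site's current top story.

SITES = ["BBC News", "CNN", "Fox News", "The Guardian", "NBC News",
         "The New York Times", "USA Today", "The Wall Street Journal",
         "Washington Post", "Yahoo News"]

def is_major_story(topic, leading_story_by_site, threshold=5):
    """leading_story_by_site maps a site name to its current lead topic."""
    hits = sum(1 for site in SITES if leading_story_by_site.get(site) == topic)
    return hits >= threshold
```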
You know, I really doubt that Mark wakes up in the morning, meets with his editing team, and says "guys, we need to push the globalist left agenda". The company is massive, so to say that he has some kind of finger on a "left agenda" button is oversimplifying things.
>> He can do with it what he wants, just as any business or enterprise has the right to conduct their own business.
Unless you're a publicly traded company, which Facebook is. Then you are beholden to the people who are invested in your company, some of whom I would assume are probably conservative and right leaning.
If FB was still a private company, then yeah, your sentiment holds true. All bets are off when you're a publicly traded and funded company. At that point, it's the investors who tell you what the direction of the company is. Don't like it? Then buy out all the shareholders and return your company to the private realm.
>Facebook is Mark Zuckerburg's business and enterprise. He can do with it what he wants, just as any business or enterprise has the right to conduct their own business.
It may be a little nit-picky to point it out, but in the general sense this isn't true. As soon as the company went public it stopped being Mark Zuckerberg's business and enterprise, becoming instead the property of the shareholders who employ Zuckerberg to look after their company. Zuckerberg owns less than 30%.
> He can do with it what he wants, just as any business or enterprise has the right to conduct their own business.
We all have legal rights to do things that are morally wrong. He has a responsibility to the world to be a responsible citizen, just as we all do.
Prior and current generations of responsible citizens are largely responsible for Facebook: without generations of people doing the responsible thing, even dying, for democracy, liberty, science and technology, education, infrastructure, law, etc., he'd be living in a country and world with no technology, legal rights, literacy, or many of the other things Facebook depends on.
If you read the article, nothing was being "suppressed". The sources mentioned are extremely tabloid-esque in nature.
This is only biased if we can provide concrete evidence that similar tabloid-esque publications that happened to align "left" of the political spectrum in the U.S. were allowed to pass while their political counterparts were passed up.
If the standard for journalism remained constant, but the sources that were "suppressed" largely happen to be conservative that doesn't seem like a problem with curation, it seems like a problem for the conservative media ecosystem.
I fully expect, despite adhering to basic scientific principles, to be downvoted yet again in this thread, because I already see a lot of the same names from the last time this article was posted and politically charged rhetoric (i.e. "leftist agenda").
This thread makes it seem like the general opinion is that an automatically generated news feed would be better.
It wouldn't. Not for end users anyways. It would be a marketer's dream.
If you feel like that's the way things should be, write one. Make it popular. Sell it to Facebook. (And use the money to buy stock in your roommate's new online marketing firm)
That's not the point, most people have the impression that the trending news is automated and generated by some sort of algorithm. It's highly misleading. The trends should really be called "Facebook's Top Picks" instead.
Google understood this long ago. You need human editors/curators who can manually flag spam. Otherwise your index will be filled with spammers who gamed the algorithm correctly. Those who argue differently were never responsible for such a large-scale search/media operation.
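That division of labor can be sketched minimally: the algorithm ranks by raw engagement, and human editors maintain a flag list that removes spam or gamed entries before anything is shown. All names and the scoring here are invented for illustration:

```python
# Hybrid curation sketch: algorithmic ranking plus a human-maintained flag
# list. The data shapes are invented; the point is that the editors' set acts
# as a filter on top of the algorithm's ordering.

def curated_trending(candidates, engagement, flagged, k=10):
    """candidates: topic names; engagement: topic -> score;
    flagged: set of topics human editors removed as spam/gamed."""
    clean = [t for t in candidates if t not in flagged]
    return sorted(clean, key=lambda t: engagement.get(t, 0), reverse=True)[:k]
```

Note that even this tiny sketch embeds human judgement twice: in the editors' flag set, and in whoever decided that raw engagement is the right thing to sort by.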
That's true. The issue here isn't that Facebook tried and failed to implement an automatically generated news feed. The issue is they were called out for bias in the news feed, and then tried to deflect by blaming it on a computer.
You can't get away from human judgement. If you stop relying on editors' judgement, you'll end up with spam and people gaming the system. To fix the spam, the process of generating the feed will have to be tweaked. The nature of these tweaks (not to mention the design of the original algorithm) reflects the judgement of the human programmers.
Doesn't even have to be commercial spam. With classic hits like Santorum and mission accomplished, I'd think they'd be grateful for a little auditable curation.
The big problem with this Facebook controversy is the number of people getting their news from Facebook. There are thousands of other places to consume news, and they do a better job. I almost never click on the Trending topic on Facebook, because I always assume it has been curated/customized based on my profile. I actually use NYT / Twitter / Washington Post, etc., to get news and trending topics.
NYT and Washington Post are subject to the same kinds of biases - in fact, more so, since what comes up is ultimately in the hands of the executive editor.
Twitter also likely uses manual curation.
If Facebook improves an algorithmic system augmented with human consensus (that diffuses the editorial control among a group of curators), it will be the least biased of all the mentioned systems.
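One hypothetical way to diffuse editorial control among curators: no single curator can suppress a story; it takes a supermajority of independent votes, and with no votes at all the algorithm's choice stands. The quorum rule below is invented, just a toy sketch of the idea:

```python
# Toy sketch of curator consensus: a story the algorithm surfaces is only
# kept if a supermajority of independent curators agrees, so no single
# editor's bias decides. Quorum threshold is an invented parameter.

def consensus_decision(votes, quorum=2/3):
    """votes: list of True (keep) / False (suppress) from independent curators.
    Returns whether the story stays in the trending list."""
    if not votes:
        return True  # no curator weighed in: defer to the algorithm
    keep_share = sum(votes) / len(votes)
    return keep_share >= quorum
```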
Ironically, Facebook is getting criticized for being more like the NYT and the Washington Post by holding back stories which aren't credible or newsworthy.
I agree, but all of this hubbub is about the Trending News section. Nobody seems to be saying that Facebook put their finger on the scales of what shows up in news feeds after being shared by friends. I suspect that when people say "60% of millennials get their news primarily from Facebook", they are talking about shared articles and the like, not the "trending now" section.
Wake me up when any of the people shitting the bed about this have ever complained about the power wielded by Rupert Murdoch. I might think they were doing something other than having a tantrum about someone outside their own echo chamber having any kind of power over the news.
I always wondered - I live in the UK and "trending" topics seemed to be a mix of things that were on the BBC news 3 days ago and obviously promoted content (large company X releases Y).
Completely useless in any case so I tend never to click in that box.
There is absolutely no evidence that you can pay to get into trending.
If there were, it would be a much bigger story than this flimsy article which is rooted in the editorial judgements of a few contractors.
It really annoys me when these drive-by insinuations about "promoted content" are made. Digital media companies—including news outlets, Facebook, and Google—all actually are careful to label anything which is paid placement as such. All these baseless accusations of unlabeled paid placement do is undermine the moral standards for labeling paid placement.
If you are going to make extraordinary claims, please bring extraordinary evidence.
If you look at the totality of content many news organizations put out, there are a great number of slightly edited corporate press releases of the sort you are mentioning, and people read such things.
It is also worth noting that FB trending topics can be tuned.
Maybe they don't frame it very well, but we should probably assume people are making decisions, whether it's the people building the algorithms or the people hand-picking the content... someone's bias is going to work its way in there.
I suspect this doesn't only happen in the trending news feed.
From my own experience: I shared an article by Douglas Rushkoff (https://www.theguardian.com/technology/2016/feb/12/digital-c...) and several of my friends weren't able to see it, even after they opened my timeline.
> The company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users’ feeds.
This strikes me as interesting, if only because it says something about the audience of Facebook users.
Sense of community aside, the reason I keep coming back to HN is because I trust that the audience will upvote stories that are interesting to me.
I guess because Facebook has such wide community demographics, it can't rely on its users to do the policing.
I don't see a problem: No software is good enough to do it on its own (AFAIK); why wouldn't they improve the quality with human editors? Every other news source does it, AFAIK.
Will there be hearings about Fox News and whether they are biased?
I do have a concern: Facebook isn't a journalism organization, run by professional journalists with their priorities, values and expertise. They could be very manipulative; many News Corp / Rupert Murdoch publications already do this.
Lawyers and journalists (or humanities students in general) tend to be liberal, in my opinion, while self-employed people, farmers, etc. tend to be more conservative. You can't employ journalists and expect them to be neutral. I had always seen Facebook's Trending section as "The Politically Liberal Outrage" section right from the start. I thought it was intended to be liberal outrage to begin with.
Algorithmic trending based on what is popular with human filters to weed out only NSFW stuff appears to be the way to go forward.
I can't blame Facebook. We live in a world where, if a company's experimental AI classified black people as monkeys, we somehow think that is racist and make the company stop those AI efforts.
In reality, a truly neutral platform would trend whatever is really trending; if it's a racist slur, so be it, because then the media is truly holding a mirror to society. That is how we might get better. Otherwise everything becomes one large safe space, which is the death of intellectual speech.
"Self-hosted" social networks using centralized authentication and a public API allowing for custom web and mobile clients to be built, sold, licensed etc. Moddable ranking algorithm. Monetize as a PaaS vendor for consumer and enterprise networks. I'll hold your beer if anyone wants to do it.
Its raison d'être is to expose people to stories beyond the narrow bubble of their news feed. But apparently people don't like the idea of using human curation to get well-balanced stories in front of readers, so it would be much better if people only spent time watching cat videos in the news feed.
Also, people need to make a clearer distinction between the trending topics view and news feed. They're separate products with different algorithms entirely.
PageRank was a huge leap in knowing which web pages were valuable.
I'm disappointed we don't have something similar for detecting authority, and measuring the opinions of those with authority.
For instance, if there were an article about chess, I would trust Bobby Fischer and Garry Kasparov to provide the most valuable commentary. If they say it's great, I'm more likely to read it.
But if there's an article about Jewish business owners, I don't want to see Bobby Fischer's opinion.
If there's an article about the government providing welfare and social services, I don't want to see Kasparov's opinion.
How is this not a thing for science, technology, news...?
"This technology article should scare you!" If I see that from Linus Torvalds, it's going to get my attention. From John McAfee? Not so much.
"Nvidia's latest drivers are terrible." If that's from AnimeFan2004, I don't care. If it's from John Carmack, holy crap does it deserve my attention.
More importantly, if John Carmack trusts someone about computer graphics, I'll probably trust them, too. If they trust someone else about computer graphics, there's a good chance I will, too.
We've gotten used to seeking out the opinion of Rotten Tomatoes, or the old Siskel and Ebert thumb-based-metric... It's a shame we don't have a browser extension that brings those metrics to everything we see on the web.
I'm watching the trailer for the new Marvel Civil War movie... and I see a Pop-Up-Video style bubble informing me that Kevin Smith really liked it.
I'm reading an article about how Facebook is bringing internet, but not really internet, to India, and I see EFF crapping all over it. Right there - right on that same web page, because my browser extension brings that content in for me.
I'm reading an arxiv about gene transfer in plants, and I see experts in the field saying they question the methodology.
It just sucks to me that we don't have a PageRank for authority on topics... And we don't have a way to show those scores and opinions, THAT WE TRUST, stuck to the content.
For instance, I trust Al Gore on Climate Change. Other people probably trust Donald Trump on Climate Change. It needs to be per-user to determine their authority-trust links.
Or helping determine our news feed. (To finally relate my comments back to the topic at hand.)
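A toy sketch of what such a per-user, topic-scoped authority score could look like: direct trust edges ("I trust Carmack on graphics") propagate one hop with damping, so someone a trusted expert vouches for on a topic inherits part of that trust, and only on that topic. The data layout and damping factor are invented; a real system would iterate to a fixed point the way PageRank does:

```python
# Invented sketch of topic-scoped trust propagation. Trust is per (person,
# topic) pair, so authority on chess says nothing about authority on
# economics. One hop of damped propagation stands in for a full fixed-point
# computation.

def authority(user_trust, endorsements, person, topic, damping=0.5):
    """user_trust: (person, topic) -> direct trust in [0, 1].
    endorsements: (endorser, topic) -> list of people they vouch for on it."""
    score = user_trust.get((person, topic), 0.0)
    # inherit damped trust from anyone this user already trusts on the topic
    for (endorser, t), vouched in endorsements.items():
        if t == topic and person in vouched:
            score = max(score, damping * user_trust.get((endorser, topic), 0.0))
    return score
```

Because `user_trust` is the individual user's own edge list, the scores are per-user by construction, which is exactly the property the comment above asks for.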
This is probably the worst fucking argument I've ever heard. If you're dismissing Fox News as being biased, then you need to eject all the rest of the MSM coverage too, because they're equally biased if not worse. Fuck, MSNBC doesn't even try to hide the fact they're on the Democratic team; it's practically their own propaganda tool for God's sake.
I'll gladly take a little news from a source that's not obviously pitching me the Democratic talking points night in and night out.
There is a difference in that Fox News isn't the only global news network, whereas Facebook, a platform which delivers news among other things and through which a good percentage of people receive their news, is in essence the only global platform.
So in some ways they are a news-feed monopoly: they have little competition in a space that, for a significant proportion of the population, is their main news source.
On the other hand, I can see the attraction to curate/censor the news.
Let's say there is an event in some volatile country, and there is news which, if widely exposed, would essentially pour fuel on a fire. The event that would stoke the fire is not incidental, but you know it could be used as an excuse for violence and result in numerous deaths. I can see the desire to keep people alive and protect property by moderating the news, at least temporarily, until things cool down.
The issue is the power Facebook has. Power needs to be controlled. As a lefty-ish person I'm not up in arms about this, but I see the problem.