
Google and Facebook Have Failed Us

292 points| DLay | 8 years ago |theatlantic.com | reply

179 comments

[+] llamataboot|8 years ago|reply
Zeynep Tufekci had a couple of great columns about this recently regarding FB and the US elections this year. While I recognize that there's always a concern about who the gatekeepers get to be in the marketplace of ideas, I think this is a considerably more complex issue than "let all the information go wherever it wants and let the people sort it out".

Putting 4chan in a top news slot is not "allowing unedited and unfiltered access to information"; it is algorithmically promoting a cesspool of disinformation into a spot that many users believe is fairly authoritative. There is no way to "not make a choice" here and let information be free. There is only figuring out ways to filter and sort the firehose of information that is now at all of our fingertips. (A situation, I should add, that human brains are not necessarily prepared for.)

https://www.nytimes.com/2017/09/29/opinion/mark-zuckerberg-f...

https://www.nytimes.com/2017/09/23/opinion/sunday/facebook-a...

[+] Spivak|8 years ago|reply
> into a spot that many users believe is fairly authoritative

I mean that's the real issue. You can't be unbiased while trying to be a source of truth. You can be impartial and return results based on relevancy or you can intervene in an attempt to return the truth. Google has ventured into dangerous territory by mixing the two -- especially with their automatic snippets that answer questions.

> There is only figuring out ways to filter and sort the firehose

But how to put control of this filtering into the users' hands is an unsolved problem. The best we have right now is to just return everything and let the users choose what to pay attention to.

[+] kashprime|8 years ago|reply
It would be trivial for FB and Google to allow tailoring of feeds. Even keyword exclusions or subject filters would be a huge step up from the gamified, engagement-chasing monster that we have now.
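
As a rough illustration of how little machinery that would take (everything below is hypothetical: invented field names and a toy data model, not any real Facebook/Google API), a user-controlled feed filter could be as simple as:

    # Hypothetical sketch of user-controlled feed tailoring: keyword exclusions
    # plus subject filters applied after ranking. Field names are made up for
    # illustration, not any real Facebook/Google API.
    from dataclasses import dataclass, field

    @dataclass
    class FeedItem:
        title: str
        topics: set = field(default_factory=set)

    def tailor_feed(ranked_items, excluded_keywords, excluded_topics):
        """Keep the engagement ranking as-is, but drop what the user opted out of."""
        kept = []
        for item in ranked_items:
            title = item.title.lower()
            if any(kw in title for kw in excluded_keywords):
                continue
            if item.topics & excluded_topics:
                continue
            kept.append(item)
        return kept

    feed = [
        FeedItem("Shooter identified, sources say", {"breaking", "crime"}),
        FeedItem("New JavaScript framework released", {"tech"}),
    ]
    print([i.title for i in tailor_feed(feed, {"shooter"}, {"crime"})])
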
[+] yosito|8 years ago|reply
This is a very good point! I would also add that those with the resources to control filtering (Google, Facebook, China, Russia, the US, etc) will be the ones who ultimately control it.
[+] UnpossibleJim|8 years ago|reply
The problem comes down to our choice of who should be the "gatekeepers of information". Certainly not The Atlantic, The New Yorker (as seen cited in this discussion) or any other traditional news outlet. They have shown time and time again that the people running these outlets not only have very biased stances when presenting the news (i.e. from MSNBC to Fox News and the rainbow of bias in between), they have also shown, for a very long time, that these same traditional news sources are easily bought and do not handle a 24-hour news cycle well at all (look at the reporting of something as simple as Tom Petty's death yesterday, announced a dozen hours or so before he actually died). While I certainly don't trust Google or Facebook to report the news, I have trouble relying on the "gatekeepers" (blech) of the news, as well.
[+] comatose|8 years ago|reply
Asking the rich and powerful gatekeepers of the public's major news sources to filter content for us, something they have been relatively reluctant to do, and to do it out of pure altruism, is asinine. I have no love for Facebook or Google, but the world wouldn't be a better place if they curated news for us, at least not for long.
[+] trgv|8 years ago|reply
I think there's already too much filtering.

People need to make up their own minds regarding what is true and what isn't, what's worth their time and what's bullshit, which links to click on and which to avoid. If people can't handle this then we need to spend more money on education, or just accept that some people disagree with us.

I don't think aggressively filtering "untrustworthy" content will lead to a net benefit.

[+] phrh8|8 years ago|reply
These platforms have hyper-optimized their products to show as much stuff as possible that people will engage with (share, click), so that they also engage with ads and stay active.

The problem is that there are often stretches when there is no news. No news isn't interesting, so it doesn't get shared as much as "news". Take the moments immediately following a disaster. Two types of news articles will show up:

"There was a shooting. We don't know anything yet, but we will keep you posted". Boring.

"We know who the killer is. Bobby Bobertson done it". Woah!

Now imagine you are a heavy social media user. You don't want to stay silent on such a big news story, so you want to share something showing you are engaged. Which story do you share?

(edit: removed duplicate "the problem is")

[+] pjc50|8 years ago|reply
I think at this stage it's like insisting that people filter their own water when there are organisations out there actively dumping toxic waste into it.

It's not just the Internet though, it's organisations like Infowars who are out there claiming that the Vegas shooting was some kind of "false flag", after their previous disinformation about the Sandy Hook shooting.

[+] unabst|8 years ago|reply
You've summed up the age of facebook brilliantly.

> People need to make up their own minds regarding what is true and what isn't, what's worth their time and what's bullshit, which links to click on and which to avoid.

This is exactly what everyone is doing, and exactly the problem.

> we need to spend more money on education

And this is what many agree we need.

> or just accept that some people disagree with us.

And this is exactly what many have settled for since Trump took office.

> aggressively filtering "untrustworthy" content will lead to a net benefit.

It will lead to the Nightly News. But the people have spoken already. They prefer Facebook. They prefer making up their own minds regarding what is true and what isn't.

[+] jimmytidey|8 years ago|reply
It's not cognitively possible for a user to simultaneously view every piece of content. Someone has to make a decision about what goes at the top.

Of course, users aren't passive, they could get their news elsewhere or change the filtering. But we know that defaults matter a lot. For many people, the order in which platforms present news is going to be profoundly influential.

The inescapability of choosing default settings, and the empirical fact that default settings do influence people a lot, taken together, imply an ethical responsibility on the part of platforms.

Making people smarter would be great - but you've got to be realistic about how effective that would be, and how long it would take.

What other usability problem would you solve by changing the people rather than the technology?

[+] altcognito|8 years ago|reply
I don't know about spending more on education, but Americans were astoundingly unprepared to make even a basic differentiation between the parties, let alone sort through "bullshit".
[+] gldalmaso|8 years ago|reply
If we simply leave it up to people, we already know the outcome. Some people, maybe even a lot, will take fabricated lies as facts and proceed to engage other people with these "facts", sometimes violently.

By doing that we are empowering the people who fabricate lies to move an agenda. It's ridiculously cheap to make fake news and the value gained is immense because people are having a hard time filtering it themselves. It is the social media equivalent of the Mirai Botnet.

Do we want to live in a world where the best liar wins?

[+] mc32|8 years ago|reply
As I posted elsewhere on this thread, journalism isn't exactly about the straight facts. It's always more about story and slant related to facts, sometimes more loosely than others.

For example, it will take a lot of time for investigators to get close to the truth of what happened, but no one is going to wait for official reports before publishing their takes on the incident --and I'm not asking for that, nor would I expect it. But we also can't very well expect only objective truth to be present in news stories/articles.

Should Google and FB do a better job of filtering obviously bogus information? Sure. Should they only source from a "pool" of sources in the era of citizen journalism? I don't think I want to cede that to news orgs who want to perpetuate their aura as the only legitimate purveyors of news.

[+] losteverything|8 years ago|reply
I don't know if I could disagree with the author more.

They didn't fail me. I don't want or need to know about shootings, crashes, floods, etc. Not instantly, anyway. I did not seek news about Las Vegas: I avoided it.

If anything, the previous news gatekeepers got us used to believing that if we don't stop everything we are somehow uncaring. So plane crashes, shootings, etc. hold an "insta-pass" to the news. An all-hands-on-deck approach.

To me, the author's premise of FB and G as bad gatekeepers is wrong. Information is hardly ever right right away. Nobody I know expects that. Even with that, I believe people know the difference between a photo-sharing site, an email and search/map site, and a news organization.

[+] dasil003|8 years ago|reply
Well, Google has a top-level subdomain, news.google.com, that serves a similar function to a collection of newspaper section front pages. I don't think it's unreasonable to suggest they have some responsibility for what they post. As engineers we are too quick to give them a pass because it's algorithmic and human intervention wouldn't scale. However, it's worth remembering that Google is not a startup; it's one of the biggest and most powerful companies in the world, and we shouldn't treat it like an engineering org on a shoestring budget. You might not care personally or have any expectations, but that doesn't mean we are out of line demanding solutions to difficult problems from companies that make such profit off of our information.
[+] abrahamepton|8 years ago|reply
I worked at Google News for 5 years, and completely agree w/Alexis. Google promoted 4Chan as a top result for people who wanted to know about Las Vegas; that's a fail, period. It is a fail. It should not have happened; it should not happen again; it should never happen.

Great, maybe you personally weren't fooled. That's entirely and totally irrelevant since there are 7 billion people on the planet and Google (and FB and everyone else) wants to serve all of them. Not all of whom are exactly like you.

Google fucked up, admitted it (kinda) and needs to do better. Period. Not hard to understand.

[+] dkarl|8 years ago|reply
> To me, the author's premise of FB and G as bad gatekeepers is wrong. Information is hardly ever right right away. Nobody I know expects that. Even with that, I believe people know the difference between a photo-sharing site, an email and search/map site, and a news organization.

I think Facebook does a decent job of presenting a news feed driven by your "friends." Not that it's a good thing (it's certainly not in most cases) but by and large people are seeing the good and bad information that their friends are promoting, which is what they want, because they want to maintain their social awareness and connectedness more than they want to be informed. Facebook also does a pretty good job of indicating the social source of the news, so you can evaluate it in the light of your "friend"'s credibility. Facebook just lets people be people, with all the horrible implications of that.

Google straight-up presents articles as "news" and then disclaims any responsibility for it, on the implicit argument that a company with tens of thousands of employees, billions in market cap, and one of the most visible platforms in the world shouldn't be held to the same standard as a small city newspaper because they're using algorithms. They direct blame to their algorithms and then argue that algorithms are inherently blameless, a trick that we keep letting them get away with. Their excuse of "it's just algorithms" is as cheap and meaningless as "it's just for the lulz." They don't say "here's the news chosen by an algorithm which of course often makes horrific mistakes a human never would." They present "Google News" with layout and sections just like a professionally produced news site, until they screw up, and then they say, "Well, of course it's not journalism, don't make that mistake. It's just algorithms. Didn't you notice the Google colors and subdued graphic design? We're not trying to fool anybody."

Their take this time:

> Within hours, the 4chan story was algorithmically replaced by relevant results.

Ah, so everything worked as designed. No need for humans to take responsibility. Just stand back and let the system work. Again, a small city newspaper that left something like that up for hours would be condemned for callousness and suspected of intentionally promoting disinformation. Google seems to be on its way to arguing that its systems either cannot or should not be overridden by humans, even to correct horrific mistakes, if it's not there already.

[+] chourobin|8 years ago|reply
You overestimate the intelligence of the general American public.
[+] jasode|8 years ago|reply
I agree, and the author, Alexis Madrigal, seems to be ignorant of the history of the dissemination of misinformation. He commits the same error as other journalists complaining about Google/Facebook algorithms by not mentioning that the "human judgement" within news orgs causes similar problems.

Let's look at how the news publications handled the "truthiness" of Iraq WMD:

- No skepticism that WMD exists: September 2002 article (WMD fever 6 months before the March 2003 invasion of Iraq)[1].

- WMD not found: January 2004 article (10 months after the invasion when soldiers found nothing.)[2]

Before March 2003, all the influential news orgs in USA such as NYTimes, Washington Post, and all 4 major tv networks pandered to the Bush administration's "slam dunk" narrative for Iraq having WMD.

You can't make excuses for those USA news orgs and say they were tricked. That's because the news outlets in Europe (especially the journalists in France and Sweden) were more skeptical about WMD and wrote that attacking Iraq would be a big mistake.

It's fascinating how journalists write as if they're on a perch of higher judgement over the mass of helpless readers when they themselves fall victim to the same scams of misinformation and spread that misinformation to their audiences, just like bad Google algorithms do.

(See documentaries "War Made Easy"[3] and "Buying the War"[4] that show how journalists lack independent critical judgement and fall in lockstep to spread misinformation.)

>There’s no hiding behind algorithms anymore. The problems cannot be minimized. The machines have shown they are not up to the task of dealing with rare, breaking news events, and it is unlikely that they will be in the near future. More humans must be added to the decision-making process, and the sooner the better.

But humans and their judgement are not even up to the task of slow-moving news like the buildup to war. For the Iraq fiasco, maybe the better solution in 2003 was "more algorithms", such that USA readers would end up in a 50/50 split with half believing Iraq had WMD and the other half remaining skeptical. Instead, the polls showed that the majority of Americans believed Iraq had WMD.[5] How did they get that wrong belief?!? The mainstream news, that's how. (Facebook "fake news" didn't exist in 2003.)

Yes, Google needs better algorithms; but I'm not convinced that people like Alexis or government officials should be the ones regulating the algorithms. They have a proven track record of being as flawed as the readers they're trying to inform with the "truth". (Thought experiment: if Google search results in Feb 2003 had returned more results saying "no proof Iraq has WMD", American journalists and the government would have criticized Google's algorithm for spreading "fake news"!)

[1] http://www.nytimes.com/2002/09/08/world/threats-responses-ir...

[2] https://www.theatlantic.com/magazine/archive/2004/01/spies-l...

[3] https://www.youtube.com/watch?v=R9DjSg6l9Vs

[4] https://www.youtube.com/watch?v=0KzYL6e3sV0

[5] http://news.gallup.com/poll/8623/americans-still-think-iraq-...

[+] azangru|8 years ago|reply
Came here to say pretty much this :-)
[+] folksinger|8 years ago|reply
The reason why those of us on this forum can tell the difference between good and bad information is because we received a good education and learned critical analysis skills.

Facebook, Google and the internet in general are an objectively bad source of knowledge. They are subjectively good sources of knowledge.

The vast majority of people do not have the skills to use the internet as a source for good knowledge.

The contents of the average library contain better knowledge than the internet. The contents of a library at a world-class university, even more so.

The internet is the most overrated institution of contemporary society when it comes to obtaining knowledge.

It is somewhat good for entertainment but probably not better than your average game of Dungeons and Dragons.

[+] jordigh|8 years ago|reply
> The reason why those of us on this forum can tell the difference between good and bad information is because we received a good education and learned critical analysis skills.

We really need to stop congratulating ourselves on how smart we are and how we would never fall for such blatant lies because we are such highly educated critical thinkers. There are a bunch of silly beliefs that we witness even amongst ourselves, like Steve Jobs trying to cure his cancer with unorthodox dietary changes, the idea that having no gender parity in tech is both good and natural, or this medical doctor with a respected career who believes the earth is flat:

https://www.gizmodo.com.au/2017/03/the-men-who-believe-the-e...

We're all vulnerable to believing and defending the silliest things and then calling ourselves critical thinkers while we do so. We can all be tricked. There are well-documented, effective methods to make us perceive and believe any number of things. Illusionists and magicians make a living out of doing this overtly; journalists and politicians do so less transparently.

Quite frankly, half the time I'm walking around slightly horrified that perhaps I'm believing and defending the stupidest lies and I think everyone else should feel the same way at least part of the time.

[+] lhuser123|8 years ago|reply
> The vast majority of people do not have the skills to use the internet as a source for good knowledge.

I’m no expert, but I see this too often. People who truly believe fake news and bot comments on Facebook. Others clicking on ads because they can't differentiate them from real Google search results. Many who don't have the patience to even think about verifying the information. Humans are going to do human things, and these companies know that very well.

[+] mythrwy|8 years ago|reply
"The vast majority of people do not have the skills to use the internet as a source for good knowledge."

And this is known how?

The aristocracy considers the people ignorant and prefers it that way, but history shows this is not always objectively so.

The ideas of freedom of information and freedom of expression are repugnant to the aristocrat (who always considers they know best), but these are founding principles of the most innovative and productive societies history has known so far. As opposed to aristocracies, which mean feudalism and stagnation.

“Scratch a conservative and you find someone who prefers the past over any future. Scratch a liberal and find a closet aristocrat. It’s true! Liberal governments always develop into aristocracies. The bureaucracies betray the true intent of people who form such governments. Right from the first, the little people who formed the governments which promised to equalize the social burdens found themselves suddenly in the hands of bureaucratic aristocracies. Of course, all bureaucracies follow this pattern, but what a hypocrisy to find this even under a communized banner. Ahhh, well, if patterns teach me anything it’s that patterns are repeated.” ― Frank Herbert, God Emperor of Dune

[+] xvilka|8 years ago|reply
It was like this in the era of morning, midday, and evening newspapers as well. That's still not a reason to introduce, much less actively encourage, censorship. While this particular case may be a real failure, those in power will just use it as a precedent, more and more. It is like a tiny hole in a dam: at first it is small, then it grows larger as people want to justify more and more "filtering", until the water destroys the barrier entirely and full-featured censorship arrives. This may already be the case, but it can probably still be repaired; we will see.
[+] ominous|8 years ago|reply
> The vast majority of people do not have the skills to use the internet as a source for good knowledge.

We (and others, this is not about us) are wary enough that its consequences are dampened. Shielded because we know, or know enough, what Google and Facebook are: algorithms on top of data, maximizing popularity metrics. And popularity is taken so seriously as to be seen as a measure of truth (see: time spent watching the 4th rerun of any entertainment series). Be it mind share, clicks, network reach or whatever other criteria can be deemed to convey content value, "significance and truth" is not among them. There is no analytic measure of significance and truth. Even less so one valid across 3.7 billion internet users (value from a quick search, not to be taken too seriously, as not all networks are connected and not all content will reach everywhere).

Facebook is not a network of friends. It is a graph of connections whose content acts as bait to keep you navigating it. It preys on you, displaying notoriety as the utmost standard and equating interaction with success. Reality pales against the interface of social media, which openly displays numbers of likes, comments, shares, friends: stats we only catch a glimpse of outside. Your social media past is an absolute, like your current weight or your current number of kidneys; there's nothing more to it. There is no wiggle room. It is a record of your path that allows no perspective. There is only that one photo, which you carefully picked. On that day you shared that link, and that was your day. That is what you are for the network, and nothing more.

As for Google, you are not navigating truth. At most, you are navigating utility, as measured by ranking. There is no truth in search order, nor should you expect to find it there. You still need to think. Why was 4chan first? Something to do with being first, being active, being inflammatory, being accessed often?

From the article:

> 4chan results, they said, had not shown up for general searches about Las Vegas, but only for the name of the misidentified shooter. The reason the 4chan forum post showed up was that it was “fresh” and there were relatively few searches for the falsely accused man. Basically, the algorithms controlling what to show didn’t have a lot to go on, and when something new popped up as searches for the name were ramping up, it was happy to slot it as the first result.

There. If you search for something that 4chan talked about, you get 4chan results. Never mind that you don't know what 4chan is; it does, and it creates content! This, the article says, is a problem. The internet is seen as a stream of facts, and when the common man discovers they were fed crap (in their deeply biased opinion, and if they ever notice at all), they complain to the stream provider! Instead of admitting "damn, I am stupid for falling for that, I should know better by now. What can I learn? Well, let me add 4chan to a filter, or learn what that place is so I can avoid it". Google hiked for weeks to a remote location, found the densest of jungles and entered a small opening, with a small swamp in the middle. It swam in the putrid water and found something akin to culture. There, in one of their rituals, it found your search term. Took a picture, and brought it to you. And you expected it to conform to your ideals of truth. There, as in other places, fringe thoughts are expressed, both real and fake[1]. But the article writer wanted Google, the tap he left open in the comfort of his living room, right in the middle of his truth of existence, to tell him HIS truth. But instead he got doxxing gone bad, users posting in epic threads and probably someone from /r/4chan asking to be included in the screenshot, as others scrambled to archive the thread, or derail it with spiderman pics, or even pretend-child-pron so the mods would delete the thread so as not to attract the attention of "normies" once the outside world learned about the latest ritual they found: blaming Geary Danley for what happened in Vegas. Most of them aren't even in Vegas, nor do they care enough about shootings to see the bigger picture. But they found lulz to be had.

And sure enough, here it is, in the article's conclusion:

> There’s no hiding behind algorithms anymore. The problems cannot be minimized. The machines have shown they are not up to the task of dealing with rare, breaking news events, and it is unlikely that they will be in the near future. More humans must be added to the decision-making process, and the sooner the better.

"The problems". But what are the problems? "The machines (...) are not up to the task (...)" Ok. Now we know that. What next? "More humans must be added to the decision-making process".

No! A thousand times no! Don't touch information, don't shape the stream! 4chan exists, and is made of people! And you want to have PEOPLE (not the same ones, but also prone to their own rituals and biases) to moderate the stream to the outside world! There are a lot of voices in every controversial (and controversial varies a lot) source, 4chan is one of them! And the solution is to make the stream as one sided as possible, neutered, comfortable only to some?

We hold two things in our hands:

1) the mind can believe and say pretty much anything

2) the internet connects everything: basements to penthouses to marketing departments to psyops operatives to deep swampy lands full of dark terrors to your children

Let's ignore that, pick a select few, and filter the stream. And guess what, there's a bit of swamp in all of us. [2]

After what seems like a troll comment here[0], here's a comeback from a concerned citizen:

> you're aware it's unwise to make those kinds of statements on the Web? Especially when you "Work" for a university?

This is how seriously people take the internet. And the solution is to keep out the weirdos. Or ask them to keep the weirdness inside. Good thing we (the writer of the article) are in charge. Let's just clean up google and social media. Because that is the full extent of the internet. The swamp will disappear as soon as we clean up google, and the children will be safe again.

Google and facebook are doing just fine. We are failing to build defenses for this globalism of the mind.

[0]: https://techcrunch.com/2017/10/02/how-reports-from-4chan-on-...

[1]: https://www.reddit.com/r/OutOfTheLoop/comments/5xdacv/what_i...

[2]: https://www.goodreads.com/quotes/548087-no-tree-it-is-said-c...

[+] JustSomeNobody|8 years ago|reply
Intelligence does NOT prevent one from falling for junk news. I know way too many intelligent people who believe ridiculous stuff for this to be in any way true.
[+] rayalez|8 years ago|reply
Yes, it takes skill to obtain information from the internet, just like it takes skill to obtain information from reading books in the library (you have to know how to read).

If you lack these skills, you are limited to the comic book section in the library, and to cat videos on the internet.

The internet is a far superior tool for obtaining information, obviously. Just because it takes skill to use doesn't mean that it's worse.

[+] mc32|8 years ago|reply
Google is more a conduit of news rather than a gatekeeper. FB is both a gatekeeper and a conduit to news as well as a news producer (since it pays or underwrites some orgs to produce news for it).

That said, I think The Atlantic is wrong in placing blame on Google when they themselves engage in questionable journalism (clickbait as well as often biased news-source).

There is no way we're going to come up with something "objectively truthful" because news as it exists today is not about facts. It's journalism, and journalism has always been about putting a slant on things --quoting unauthoritative sources ("a man named Joe claims," etc.), loose factchecking, and more. All to sell more news (packaged with ads or for subscriptions, where you don't want to alienate your subscriber "base").

This is an attempted landgrab by establishment news orgs to corner what is news. They want to be the only sanctioned sources of news and punish alternative sources --using an obvious outlier to promote their pov.

[+] hsod|8 years ago|reply
Google is all about filtering. It doesn't just show you a random selection of every page on the internet containing your search term. Instead it applies subjective criteria to present you with what Google thinks would be the "best" results for you.

This is why people use Google.

So I don't understand when people make the generic argument against filtering, that filtering is categorically bad and Google shouldn't be in the business of deciding what we see and don't see. That's the whole point of Google!

[+] holmak|8 years ago|reply
It's rather bold for The Atlantic to complain about low-quality news when they have clickbait garbage at the bottom of every article.
[+] reissbaker|8 years ago|reply
Not 100% surprising that journalists would view the machines that have wrested control of the news from them with hostility. Facebook definitely could use improvements, and it sounds like Google's search ranking in these rare kinds of situations could too, but the vitriol here — and the suggestion to replace machine ranking with ... surprise, journalists! — makes it seem like the "Us" who've been failed by ranking algorithms might specifically refer to The Atlantic.
[+] mythrwy|8 years ago|reply
Google and Facebook have failed the Atlantic which is seizing on tragedy to imply only sanctioned media (them of course) should be available through Internet searches.

However, most of the rest of us haven't been failed, as we already know results from 4chan probably aren't accurate.

Please stop failing all of us with this desperate bid to get back in control of information dispersal, The Atlantic. I'd much rather see a smattering of nonsense and be able to make that determination for myself than have you deciding what should be available for viewing.

[+] partisan|8 years ago|reply
The article is placing the blame on Google and Facebook when it should be placed on us, the "consumers" of media. We need to take a look in the mirror and understand who we are and what we want.

Ask yourself. Really ask yourself:

- What kind of information am I looking for and why? Am I just a digital rubbernecker? Do I want to break this story at the water cooler? Or do I need to know if a loved one was hurt?

- Do I need to know shortly after an event occurs or am I able to tolerate a little bit of delay in my gratification?

Shortly after an event, there are many stories and they haven't yet been synthesized into one narrative. There are people who want to capitalize on tragedy, and their voices are part of the mix. Over time, the noise is stripped away and what we get is the official narrative, and that is one we can "count" on. If you are willing and able to wait until the official narrative comes out, then a) congratulations, you have willpower unheard of in this day and age, and b) you get the benefit of skipping all of the intermediary chaos states.

If you understand why you need to know and when you need to know it, then you can control the quality of the news you expose yourself to.

[+] pdonis|8 years ago|reply
tl/dr: We don't think people are capable of exercising judgment, so we want the computers at Google and Facebook to do it for them. Good luck with that.
[+] RestlessMind|8 years ago|reply
> Worse, when I asked Google about this, and indicated why I thought it was a severe problem, they sent back boilerplate.

> Unfortunately, early this morning we were briefly surfacing an inaccurate 4chan website in our Search results for a small number of queries. Within hours, the 4chan story was algorithmically replaced by relevant results. This should not have appeared for any queries, and we’ll continue to make algorithmic improvements to prevent this from happening in the future.

I believe Google can be lax about the quality of results because it has no real competitor. These are the moments when I wish there were healthy competition among search engines, so that all of them would be on their toes to deliver the best results.

Having said that, did anyone try DDG or Bing during the initial period after the event? How was the quality of the news results on those sites?

[+] timmytwotime|8 years ago|reply
The real crime is the assumption that companies with their own self-interests at the forefront are somehow benevolent arbiters of information on the Internet. This says more about the naive crassness of the media than anything else.
[+] crag|8 years ago|reply
I agree with the article. I think more humans do need to be added to the "decision-making" process.

I mean, c'mon, promoting 4chan to the top spot in a news channel? Too many people in this country don't bother to check; they figure if it's the top article it must be true.

[+] arkh|8 years ago|reply
> I agree with the article. I think more humans do need to be added to the "decision-making" process.

Not really useful when most journalists already demonstrate how even professionals can't check their sources.

[+] aaron695|8 years ago|reply
Reddit and 4chan had more informative information, quicker, than all the news outlets.

They did not mislead me on anything about the incident. (They did have misleading comments, but those were easy to ignore.)

Sorry, but just because there are dumb people in the world, why should my results on Google be censored?

It's also bullshit: 4chan is very well known as a poor source and looks nothing like a newspaper, so the idea that it would be mistaken for a fact-based news outlet says more about how dumb the writer and people with this belief are than anything else.

This writer is so superior they have to save us dummies from facebook groups and 4chan? I find it insulting.

[+] saimiam|8 years ago|reply
If we're all such smart people/techies, can we spitball a better breaking news feed algorithm into existence?

I'll kick it off -

1. Focus all news on Who? What? When? Where? Omit the "Why?".

2. When in doubt, cite the primary news sources (police feeds/courtrooms/local authorities/opposition leaders/pictures/geotagged and timestamped audio & video feeds)

3. Secondary news sources like news orgs and journalist videos are second in importance

4. Quote tertiary news outlets like Reddit/HN/4chan only when they can be cross verified with a primary source

More suggestions?
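
To make points 2-4 concrete, here's a minimal sketch of the tiering idea (the source lists, scores, and the crude "cross-verification" check are invented for illustration; a real system would obviously need far more signals):

    # Minimal sketch of the tiering above: primary sources outrank secondary
    # outlets, and tertiary forums only surface when a primary source exists to
    # cross-check against. Tier lists and scores are invented for illustration.
    PRIMARY = {"police_feed", "courthouse", "local_authority", "official_statement"}
    SECONDARY = {"news_org", "journalist_video"}
    TERTIARY = {"reddit", "hn", "4chan"}

    def score(report, primary_available):
        src = report["source_type"]
        if src in PRIMARY:
            return 3.0
        if src in SECONDARY:
            return 2.0
        if src in TERTIARY and primary_available:  # rule 4: must be cross-verifiable
            return 1.0
        return 0.0

    def rank_reports(reports):
        # crude stand-in for real cross-verification: is any primary source present?
        primary_available = any(r["source_type"] in PRIMARY for r in reports)
        scored = sorted(reports, key=lambda r: score(r, primary_available), reverse=True)
        return [r for r in scored if score(r, primary_available) > 0]

    reports = [
        {"source_type": "4chan", "claim": "Bobby Bobertson done it"},
        {"source_type": "police_feed", "claim": "suspect not yet identified"},
    ]
    for r in rank_reports(reports):
        print(r["source_type"], "-", r["claim"])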

[+] bo1024|8 years ago|reply
Just a comment:

Reputation is key.

Having studied game theory for systems of agents producing results and content, I feel fairly confident in this conclusion: the only reliable long-term design to incentivize beneficent, truthful behavior involves assigning good reputation in return for good behavior, and assigning future rewards/attention based on past reputation.

Based on this, I postulate that any good news feed algorithm must use reputation scores based on past history to choose what to believe and what to show when a new story comes out. Humans of course do this instinctively, which many modern news organizations will have discovered too late as they sell out their integrity, accuracy, and priorities.

Remember what von Neumann said about arithmetic attempts to generate random numbers? Similarly, anyone who attempts to algorithmically determine "truth" based on instantaneous metrics is living in a state of sin.
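
As a toy illustration of that postulate (the decay constant, starting scores, and source names below are arbitrary assumptions, not anything from a real system):

    # Toy version: each source carries a reputation built from verified past
    # outcomes, and new claims are weighted by that reputation rather than by
    # instantaneous engagement. The decay constant and scores are arbitrary.
    class SourceReputation:
        def __init__(self, decay=0.9):
            self.decay = decay      # how quickly old behavior fades
            self.scores = {}        # source -> running accuracy estimate

        def record_outcome(self, source, was_accurate):
            prior = self.scores.get(source, 0.5)  # unknown sources start neutral
            outcome = 1.0 if was_accurate else 0.0
            self.scores[source] = self.decay * prior + (1 - self.decay) * outcome

        def weight(self, source):
            return self.scores.get(source, 0.5)

    def rank_claims(claims, rep):
        # claims are (source, text) pairs; order by reputation, not engagement
        return sorted(claims, key=lambda c: rep.weight(c[0]), reverse=True)

    rep = SourceReputation()
    rep.record_outcome("wire_service", True)
    rep.record_outcome("anonymous_forum", False)
    print(rank_claims([("anonymous_forum", "killer identified"),
                       ("wire_service", "details still unconfirmed")], rep))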

[+] have_faith|8 years ago|reply
I've been toying with the idea of building a meta-news site of sorts for a while. Only one page per story (as opposed to the scatter-shot approach of normal news sites), stick just to facts, don't report death tolls while the bodies are still warm, anti-sensationalist and so on. Just curate a list of primary and secondary sources for a given story and present an overview of the story with opposing viewpoints. Very dry from a presentation perspective.
[+] throw2016|8 years ago|reply
Since Google and co. were never given the responsibility of telling anyone the truth, this seems to be another naked power grab to spread FUD and scare stories so someone can have a 'monopoly on truth'.

Diversity of opinion and widespread rumour mongering is a natural state of affairs in human society. Those who are unsettled by this and seek some sort of uniformity and control betray a troubling megalomania. Elevating yourself over others to decide somehow only 'you' are able to discern the facts confirms it.

That leads not to Google or Facebook curating news but to 'one source of truth' controlled by the government and authoritarian entities, using the exact same logic. Some people may be experts at C and Go but betray an astonishing illiteracy when it comes to history, freedom and the evolution of human society.

Notions of truth have been debated for eons, so there is already a large body of knowledge. You need an educated, literate population and must accept that people will have wildly differing views; trying to protect the 'ignorant' not only turns your society into a tightly controlled cage but reflects a streak of authoritarianism and megalomania in the 'protectors', now prevalent among many technical folks.

[+] eighthnate|8 years ago|reply
"In the crucial early hours after the Las Vegas mass shooting, it happened again: Hoaxes, completely unverified rumors, failed witch hunts, and blatant falsehoods spread across the internet."

It's the nature of the internet and social media. People rant, rave and gossip. It's not google or facebook's job to curate what people say, think or do.

I wish google and facebook and the social media companies would unite to fight back against traditional media. They've been attacked for the past couple of years relentlessly.

> their active role in damaging the quality of information reaching the public.

With all due respect. If google/fb/social media wanted to filter out quality of information, they would ban theatlantic and the rest of the traditional media.

At the end of the day, this guy is just complaining that people are being people. Eventually things get sorted out.

Looking at the guy's stories list, it seems like all he does is whine about facebook and social media.

https://www.theatlantic.com/author/alexis-madrigal/

Who is Alexis Madrigal that we should even pay attention to him?

[+] HillaryBriss|8 years ago|reply
> Gabe Rivera, who runs a tech-news service called Techmeme that uses humans and algorithms to identify important stories ...

wait a second, you mean a 3rd party site can do a better job at filtering out fake news than Google/Facebook? and people can instead just go to a 3rd party site for high quality news?

but The Atlantic thinks Google/Facebook should have all the power and ace out the little 3rd party companies who offer a better product?

waaaaah?!