Why just conspiracies? Why not cults? Political speeches? Nutrition? Science?
And who decides what's a conspiracy?
After all, we were regarded as paranoid lunatics wearing tinfoil hats about the NSA surveillance, until PRISM came out and it became obvious we weren't.
Before Bamford, there was little public information about the NSA. ECHELON was treated as pure conspiracy theory.
So is the Israeli attack on USS Liberty during the Six-Day War still a conspiracy theory? I guess not, because there's a Wikipedia page. But it was denied for many years.
I don't think you would have been able to find any information on Wikipedia proving that "the NSA spies on everyone" was false. Anti-vax conspiracies, though, to pick an example, have plenty of public info on why they're wrong.
According to the article, it's a first step to see if it's feasible:
> Wojcicki did not specify how many conspiracies were on the company’s initial list, but suggested it would expand over time. “What I like about this unit is that it’s actually pretty extensible,” she said. The company could show additional alternate sources of information under controversial elements in the future, she said.
I think you're a bit optimistic when you use the word "obvious". Some people still look at me as if I'm wearing a tinfoil hat whenever I bring up NSA surveillance.
YouTube should add Wikipedia citations next to every video. It should be an information resource, not a source of disinformation.
YouTube also needs to fix search so that results are not full of computer-generated spam results. It's very surprising how poorly YouTube has been developed. The search is still embarrassingly bad after over a decade.
The Wikipedia admins with the most patience and energy to participate in edit wars. Wikipedia admins also decide who becomes a new admin and who gets banned, so once a group of people with the same opinions gains a majority among the admins, their opinion becomes fixed as the official "truth".
Automatically adding Wikipedia information to YouTube videos will only increase the importance of the edit wars.
There are some absolute lunatics posting their content out there, but that will always be the case. I don't think it's a problem that really needs solving. Even if it was, I don't think that this is a solution. If you are too quick to counter a statement, people may suspect that you have ulterior motives. If videos come along with these "corrections" many viewers will see that as proof that they're right and that the "mainstream" is trying to cover something up.
A video is a creative production. The creator makes it in a certain way for a certain reason. When the publisher interjects, tries to mold and reshape the content, and provides their own commentary with it, that restricts the creator's artistic freedom over their content.
I don't expect that this pseudo-censorship (that's what it is - they're saying "we don't think your content is right, so we'll be giving your viewers what we think is right") will be applied neutrally, either. There are nutcases in all ideological directions, but I have a solid suspicion that these annotations will only be applied to those that are not in line with the "politically correct" zeitgeist.
Of course, YouTube is well within their rights to do whatever they want with their platform. This isn't at all an uncharacteristic development for them, either. They've consistently demonstrated that their only interest is money; they don't care for the small creators - or really even the viewers - all that much. If this is what keeps the advertisers happy then it's only the logical step for them.
I just don't like it.
1. Definition of what is a conspiracy and what is not. There are hot topics out there all the time (e.g. Trump and Russia collusion) - pointing out or not pointing out "conspiracies" is a statement in itself. You only have to think about Twitter and its "verified accounts" hot water not so long ago [1].
2. Adding fuel to the fire - for some conspiracies the idea of covering them up makes them more real for many people. There are many posts on various social media platforms with the tag line "share this before <X> takes it down" (e.g. Facebook videos [2]).
3. Videos that debunk theories could also get thrown into this automated mess they'll undoubtedly make and will probably incur demonetization. The result may be fewer debunking videos, actually poisoning the well further.
4. Generally squashing freedom of speech means that needed discussion on a particular topic can't happen.
What I think they should do:
1. Have this as an opt-out feature which is mostly hidden, or an opt-in feature which is obvious.
2. Pick controversial topics of any kind and find for and against cases for them, allowing the user to investigate them both. Picking a side is dangerous.
3. Have smart humans go through and carefully select for and against cases - don't try to automate this as it will be gamed.
4. Don't use this information for classifying whether videos should be monetized.
[1] https://www.businessinsider.com.au/twitter-clamps-down-on-ve...
[2] https://www.facebook.com/TaxationIsTheft2/videos/51317619238...
The most glaring problem is that it simply doesn't work.
The only people who take conspiracy videos seriously are people who have already decided they don't like the truth, and either want someone to agree with them, or want someone to invent the details/logic so they don't have to.
You're not dealing with people who aren't aware of the moon landings. You're dealing with people who have already chosen, for whatever reason, not to believe it.
Have you ever won an argument against a moon hoaxer with "yes, they did", or against a flat-earther with "no, it's round"? Embedding wiki summaries into YouTube is simply taking the argument that never, ever works and automating it.
"You cannot reason someone out of a position they did not reason themselves into."
I don't agree with comments which assert that proper, well-sourced, sober information can't change minds among people seeking videos about fringe topics.
I also think that Wikipedia is a great resource for exactly this use case, especially where a small number of easily observable facts can overturn the thrust of the disinformation (think flat earth or anti-vax).
However, in some cases, it might not be as good a resource for this purpose. For example, here's the opening sentence in the article about Lee Harvey Oswald:
> Lee Harvey Oswald (October 18, 1939 – November 24, 1963) was an American former Marine and Marxist who assassinated United States President John F. Kennedy on November 22, 1963.
Of course Oswald enjoyed no trial, and the evidence against him is very flimsy. I (and I think many Americans who've studied the topic) don't think there's enough evidence against him to make such a brash declaration.
So, in this case (and I suspect some others), Wikipedia serves to add bias rather than subtract it. I think that a sober, reasonable article on Oswald would be more nuanced and more factually driven.
I think it's especially important in these politically acrimonious times to investigate the Kennedy assassination, and the obvious conclusion from the evidence is that actors within our government executed a chief executive. Let's not misdirect people of a younger generation who try to study this topic.
Other articles on Wikipedia concerning the assassination go quite heavily into the conspiracy theory side (and there are dozens of them). I'd find it odd for someone to only read the Lee Harvey Oswald article lead and make their conclusions off of that.
Google will clearly be able to influence bias depending on what article they link to.
Adding a Wikipedia box won't make Alex Jones viewers change their mind about gay frogs, and could play into his victim complex. If they want to get serious, change the algorithm so it won't reward people like Jones. That's the problem. You watch one dumb video, let it autoplay more, and the next thing you know you're in a rabbit hole of crazy. One night I fell asleep watching a vegetarian cooking show; I woke up and YouTube was recommending some raw-food nutcase telling me why cooked food is poison and how vaccines cause autism (true story).
If there is never any reason to not believe the official narrative on every event then why not just censor any evidence to the contrary? This seems to be the direction we are going.
There's a bit of doublethink here though. Does propaganda work? Well of course it doesn't, because the truth will always come out, because the news wants viewers, and thus the official story is never propaganda! But wait... these conspiracy theory videos made by some random dude in his bedroom are convincing some people of things that are apparently not true! How did they do that, if propaganda doesn't work? Explanations about Russian troll bots with magical mystery deception tactics and various handwaving ensue...
So does propaganda work? If not then we don't have to worry about fake news. If it does, then how do we know the real news isn't propaganda? What empowers propaganda more than anything? Lack of criticism.
There is a serious blind spot most people have, or at least publicly act like they have to avoid ostracism, when it comes to challenging the mainstream narrative. Thank you for pointing it out directly. Everyone else is dancing around the gorilla in the room.
Won't this just take the battleground about what is "fact" or not to Wikipedia?
If someone was trying to discredit, say, Obama, and saw "Barack Obama was born in the United States - according to Wikipedia", surely they'd rush there and try to sneak conspiracy theories past the moderators.
So what happens then, Wiki has to lock down more and more pages, or effectively only allow approved edits to topics people are trying to shill over.
That happens, and it's no longer the crowd-sourced/accepted truth, it's Wikipedia moderators truth.
When pages are locked, they're not limited to only admins (who I'm assuming you are referring to by "moderators"; otherwise, any user can undo an edit, and it doesn't require an account). There are several levels of page protection: [0]
* Pending changes protection: Anyone can edit, but edits must first be reviewed before becoming visible. Autoconfirmed [1] users have their edits go through right away, while other edits have to be approved by a pending changes reviewer. The review criteria basically boil down to approving anything that is not vandalism; once approved, the edit is treated like a normal edit that can be contested by other users. In effect, this reduces the incentive to vandalize or make disruptive edits, because they won't even be visible.
* Semi-protection: Only autoconfirmed users can edit. See [1]. This is used for pages where a large proportion of the edits are vandalism rather than helpful edits. 99% of vandalism stops here because most vandals are too lazy to get autoconfirmed or don't know how, since their goal isn't really to be productive.
* Extended confirmed protection: This is a fairly new option and requires an account 30 days old and with at least 500 edits. This is mainly used for pages under sanctions (namely the Israeli-Arab conflict), and in certain cases where autoconfirmed users are causing abuse.
* Full protection: Very rarely used. Mainly used in instances where multiple experienced editors are editing disruptively on a page, such as by edit warring. The protection is always very short because it otherwise blocks editing from all users. In most cases, though, the editors themselves are usually blocked or warned instead of the page necessitating full protection.
[0]: https://en.wikipedia.org/wiki/Wikipedia:Protection_policy [1]: Account must be four days old and have made at least 10 edits, so not hard at all to obtain
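For what it's worth, these levels are also visible programmatically: the MediaWiki API reports a page's protection records via `action=query&prop=info&inprop=protection`. A minimal sketch of mapping those records onto the level names above follows; the helper function and sample data are illustrative assumptions, though the level identifiers ("autoconfirmed", "extendedconfirmed", "sysop") are MediaWiki's own.

```python
# Sketch: mapping the "protection" records returned by the MediaWiki API
# (action=query&prop=info&inprop=protection) onto the level names described
# above. The helper and sample record are illustrative, not from any library.

def protection_level(protection_records):
    """Return the protection level name for a page's edit restrictions."""
    edit_levels = {r["level"] for r in protection_records if r["type"] == "edit"}
    if "sysop" in edit_levels:
        return "full"             # only admins may edit
    if "extendedconfirmed" in edit_levels:
        return "extended confirmed"
    if "autoconfirmed" in edit_levels:
        return "semi"
    return "none"                 # pending-changes protection is reported separately

# A record shaped like the API's response for a semi-protected page:
sample = [{"type": "edit", "level": "autoconfirmed", "expiry": "infinity"}]
print(protection_level(sample))  # semi
```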
> Won't this just take the battleground about what is "fact" or not to Wikipedia?
Wikipedia isn't ad-driven. That reduces its incentive to be addictive by any means necessary. It's also been arbitrating truth fairly effectively for many years.
If you try to squash conspiracy theories like this there will be those free-speech hardliners who see it as crossing a line.
If you allow the crazy conspiracy stuff then there will be people who complain about the platform being a toxic influence and promoting hate speech etc.
Personally I don't see anything wrong with attaching additional information like this as opposed to outright censorship or something. But my guess is that plenty of other people will complain.
Side note: I can't help but notice how polarizing issues involving free speech have become in recent years. Any topics involving perceived censorship by any host simply ignite.
You don't get "free speech" on YouTube. It's not a government censoring you. It's a private company doing whatever they want on their private website. Same with Facebook, Twitter, etc. The only public spaces online are decentralized ones, and even then you are subject to nodes actually accepting your peerage. If every Mastodon instance deems yours toxic and blocks it, you can't cry foul about censorship to anyone.
Now is as good a time as any to mention that I’ve been running a channel that I am moving from YouTube onto my sovereign origin RasPi with some fancy scripts, and I’ve been doing daily 720p livestreams using it as the origin with CloudFlare CDN and syndicating out to 1-4 other services (YT, Twitch, FB, Periscope dep. on message). GitHub: my username, tinydatacenter project.
Filling a void that I saw growing: we need open source systems on the level of the big guys; this is that. I am a disgruntled American being pushed out of jobs in SF, feel I am being censored on at least one platform, and feel my service family member rolling in their grave recently.
Youtube already takes down anything they dislike.
Definitions such as "hate speech" include literally all possible speech that is critical of anything, to any degree.
Their preferred weapon though is de-ranking. This is the American model of censorship, and it really does work. Why bother going after people who speak up, when instead you can make sure nobody can hear them? You think the video is up, you can share it with your close friends, but nobody else will discover it. It's also super convenient for them because they move the majority of content in coldline storage almost right away. It's not just Youtube either, it's Google, Facebook, Twitter and all their subsidiaries. We need a massive change in the way the internet is structured, both technically and legally, if we want the open web back.
> Youtube already takes down anything they dislike.
These are the funniest comments on HN. There seems to be a huge group of people that are more willing to believe that YouTube will push a narrative vs believing YouTube just wants to earn more profit.
If you make a video that people want to watch the algorithm will love you. It determines this by watch time and session time. If you make a video that people click away from, especially if it ends their watch session, the algorithm hates you. Remember, you have to compete against over 300 hours of content being uploaded every minute.
This is how every social platform works. It’s a function of content volume.
If they took down every video they didn’t like PewDiePie wouldn’t be the top channel, literally making fun of the platform almost every weekday. But most people on HN haven’t actually bothered watching him, I’d hazard to guess!
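The watch-time/session-time signal described above can be sketched as a toy score; the formula and weights here are invented purely for illustration, since the real ranking model is not public.

```python
# Toy model of the ranking signal described above: completion rate weighted
# by whether the viewer's session continued. The formula and weights are
# invented for illustration; YouTube's actual model is not public.

def toy_score(watched_minutes, video_minutes, session_continued):
    completion = min(watched_minutes / video_minutes, 1.0)
    # A video that ends the viewer's session is penalized; one that keeps
    # the session going is boosted.
    session_weight = 1.2 if session_continued else 0.8
    return completion * session_weight

# An engaging video beats one viewers click away from:
print(toy_score(8, 10, True) > toy_score(1, 10, False))  # True
```

Under a model like this, "the algorithm hates you" isn't editorial judgment at all; it's just a low product of completion and session retention.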
This is just stacking one bias on top of another. Wikipedia is not a credible source of information, especially when it comes to topics like politics or conspiracies, and referring to it selectively is a clear statement on its own.
IMHO it would be much more productive to tell people to be extremely sceptical towards everything they see on the internet. YouTube is very capable of launching a successful campaign to do that. I'm sure a lot of creators would jump on board, and, seeing what's most likely to go viral these days, it seems necessary. This is obviously in direct conflict with the business model of platforms like YouTube, Twitter, and Facebook, since they rely on ad revenue, so we're stuck with Orwellian solutions like the one presented.
I'm not sure how I feel about this. Correcting obvious falsehoods seems fine, but something about a platform moving from being (_vaguely_) neutral to directly critiquing content feels off. Further, where do you draw the line? Correcting things that are unequivocally proven wrong casts a small net, but adding information to loose speculation, or to discussion of events that are denied but not disproven, risks making a stronger statement than the evidence allows and shutting down legitimate discussion.
A major global platform announces it will link out-of-platform to a community-edited resource. That should be seen as a major win for the open web and pretty much everyone else. But somehow YouTube manages to frame it in a way that leaves no one comfortable with it.
I can safely estimate that my daily life and my work have me deeper in the YouTube community than the average HN poster, and I say this is GOOD.
YouTube isn’t censoring here, just making sure that the most despicable users cannot lead astray the most vulnerable minds without at least a warning label.
If you think that warning label is too influential you should consider the influence of well paid, highly watched liars on YouTube.
I've thought about doing something similar before, but Wikipedia is oriented toward summary, not toward convincing people who are dubious of some "mainstream" truth.
If you want to, say, convince someone who is watching Holocaust denial videos that it did happen, I'd actually dig into some past reddit or forum discussions. Some people have gotten rather experienced at refuting this kind of thing succinctly, and you may as well borrow from them.
From the linked opinion piece in the story:
https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-po...
"YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales."
So the big question is, why not demonetize them? Removing all ads probably has a pretty significant impact on the algorithm already.
Why doesn't YouTube do the right thing and stop making money on extremists? It seems like doing that would auto-correct the problem.
Well, Google. It's their website. They're going to do whatever they want with it.
People in general didn't care much more than they care about PRISM, but they didn't think it was unlikely.
Being only interested in the money would mean just removing the questionable content that advertisers have no interest in being associated with from YouTube completely.
Sounds like the opposite of only being interested in the money?
Can you explain?
https://newslines.org/blog/google-and-wikipedia-best-friends...
Is that honestly the measure of how perfect a source of information is?
Of course people can't source Wikipedia; they are meant to source the SOURCE articles! That's like the entire point.
It's all a conspiracy. https://en.wikipedia.org/wiki/Cats_and_the_Internet