Frankly, I hope YouTube (and Google in general) dramatically overdo it with the censorship, "curation", and deplatforming of "irresponsible" content, purely so that it forces an independent, non-corporate model to be developed.
Having one company be the sole gatekeeper for the overwhelming majority of the web's video content is not a solution.
> Frankly, I hope YouTube (and Google in general) dramatically overdo it with the censorship, "curation", and deplatforming of "irresponsible" content, purely so that it forces an independent, non-corporate model to be developed.
Just like people got what they wished for when Voat was spawned from Reddit?
I don't think any new platform is going to be able to compete with YouTube at all under the current regulations. How will any new platform comply with new EU copyright laws, or remove unlawful videos as quickly as YouTube does? (e.g. the unfortunate New Zealand terrorist attack)
>Having one company be the sole gatekeeper for the overwhelming majority of the web's video content is not a solution.
I struggle with this line of thinking because it's not specific enough. YouTube doesn't act as a gatekeeper for most video content. You can upload almost anything you want as long as it's not grotesque or copyrighted. What most people mean when they say YouTube acts as a gatekeeper is "will YouTube pay me for the content I upload". And the problem with that is the advertisers, not YouTube - a handful of large advertisers decided they didn't like a lot of the content on YouTube and refused to pay for it. The beef is with the advertisers, not YouTube.
I do have a problem with the videos YouTube decides to promote and surface, but when it comes to the topic of censorship and deplatforming, the argument almost always boils down to "Procter & Gamble need to pay me to upload content whether they want to or not", which I find asinine. YouTube is not obligated to provide you with a sales team.
It's starting to look like we're getting more of a regulatory moat. (In the Warren Buffett sense of "moat", as a barrier to competitors.)
An additional concern, at least in US tradition, is when once-again-centralized monopoly venues are being mandated (or given permission?) by politicians to censor or "curate" everyone's speech. I'm pretty sure I was taught in school that we consider that a bad idea here.
I generally favor a healthy amount of decentralization, that the remaining big guys be more common carrier (and stop claiming rights to people's communications), and a renewed individual sense of responsibility towards society (not gaming social media, playing to metrics, manipulating our lesser selves, etc.).
> Having one company be the sole gatekeeper for the overwhelming majority of the web's video content is not a solution.
Anecdotal experience: Most people discover most videos they watch through hyperlinks/word of mouth, not YT recommendations, and nothing is stopping you from self-hosting a video.
YT is not like a social network (which becomes quadratically more valuable, and more exclusive, with its number of users). There's no walled garden, it's consumed passively, and its competition is, in fact, a click away.
This is the accelerationist model of web centralisation. Whether it works depends on how much of the populace actually cares, and on whether YouTube draws most of its revenue from casual viewers watching trending videos that are unlikely to fall foul of these practices.
A site like YouTube requires so much CPU/GPU power, storage capacity, and bandwidth that it's highly unlikely that a new competitor* will appear, and completely impossible to decentralise.
*If we don't count the likes of Facebook, Microsoft, etc. as competitors, since they would manage the site in the same way Google does.
The Trending page has been listing the same people over and over and over again for the last 3 years. It's shameful, pathetic, and makes the platform feel extremely biased.
Seriously though. This is unacceptable and overly controlled. Not to mention that those people over the last 3 years post borderline brainwash content without any connection to a real audience.
It's pretty sad to see YouTube turn into something like this.
These companies focus so much on algorithms, when the best solution is simply to use actual humans.
I've done a fair amount of work on trying to detect and deal with spammers, fraudulent activity and such. Nothing beats a human eye.
My most successful approach has been to filter out the super obvious stuff, then send the rest to a queue that is kept an eye on by humans. A small staff can easily fight back against bullshit. Yes, YouTube is orders of magnitude above what I've dealt with, but I'm sure their smart engineers can filter out the majority of stuff.
It's painfully obvious that YouTube lacks human eyes on things. That, or they lack direction and empowerment of their employees.
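A minimal sketch of that triage pipeline (the thresholds and names here are hypothetical illustrations, not anything YouTube actually uses): auto-handle the obvious cases, and push everything ambiguous onto a priority queue for human reviewers.

```python
import heapq

# Hypothetical model-score thresholds for a "filter the obvious, queue the rest" pipeline.
AUTO_REMOVE = 0.95   # score above this: clearly bad, remove automatically
AUTO_ALLOW = 0.20    # score below this: clearly fine, publish immediately

def triage(scored_uploads):
    """scored_uploads: iterable of (risk_score, video_id) pairs.

    Returns (removed, published, review_queue); the queue is a heap keyed
    on negated risk, so reviewers pop the most suspicious items first."""
    removed, published, review_queue = [], [], []
    for risk, video_id in scored_uploads:
        if risk >= AUTO_REMOVE:
            removed.append(video_id)       # obviously bad
        elif risk <= AUTO_ALLOW:
            published.append(video_id)     # obviously fine
        else:
            # Ambiguous: a human should look. Negate risk for riskiest-first order.
            heapq.heappush(review_queue, (-risk, video_id))
    return removed, published, review_queue
```

With this shape, a small staff spends its time only on the ambiguous middle band, which is exactly the "filter the super obvious stuff, then send the rest to a queue" approach described above.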
I like the intention, and I hope to see more of this... but it's kind of hard to imagine that the end goal here is "Responsibility", when the business goal is "how many ads can we sell?"
Valid concerns. What does Google gain from putting genuine effort into stopping the spread of these kinds of video? Perhaps more "responsible" content and an eventual turn-around of the platform... But that's a big gamble - may take years, may never happen.
In the meantime, they're bleeding man-hours, money, ad impressions, and perhaps advertisers targeting the audiences which are being culled.
Now examine the question: What does Google have to gain by appearing to put a genuine effort into this while not actually addressing the problem?
They gain a whole lot more than in the first scenario - new users hopeful of an increase in quality, support from privacy and attention-span advocates, insight into how they can influence, suppress, or promote certain themes going forward, more metrics to become more entrenched in the data harvesting machine learning game...
I wonder if they could look at groups of users who upvote "bad" videos (as marked by their own internal reviewers), see what else they upvote, cluster those videos, and down-rank them in their algorithm.
IOW use the users who like "bad" videos to help find other "bad" videos via an algorithm.
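A toy sketch of that approach, assuming we have the set of upvoters per video and a small human-labeled seed set (all names here are hypothetical):

```python
def bad_affinity(upvotes, seed_bad):
    """upvotes: dict mapping video_id -> set of user_ids who upvoted it.
    seed_bad: set of video_ids that human reviewers marked as "bad".

    Scores each unlabeled video by the fraction of its upvoters who also
    upvoted at least one seed-bad video; high scores are demotion candidates."""
    # Everyone who upvoted any human-labeled bad video.
    bad_voters = set().union(*(upvotes[v] for v in seed_bad))
    scores = {}
    for video_id, voters in upvotes.items():
        if video_id in seed_bad or not voters:
            continue
        scores[video_id] = len(voters & bad_voters) / len(voters)
    return scores
```

In practice the overlap threshold matters a lot: set it too low and, as noted elsewhere in the thread, legitimate users who merely share tastes with the "bad" cluster get their favorite videos demoted too.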
This really only works in a low-volume situation. There's about 500,000 hours of video uploaded to YouTube every day, so employees would only be able to watch an insignificant fraction of that. Additionally, they're unlikely to be qualified to judge whether a video is actually good or bad. They can judge the sound quality, or the videography, but unless you're well versed in the field the video is about, it's pretty much impossible to determine if a video is actually offering novel or good information.
Additionally, the operations upvoting content are going to have several thousand, possibly hundreds of thousands, of bots. So detecting the circle is going to be very difficult. Even more so if you consider that a lot of legitimate users will have the same tastes and watch a lot of the same videos.
Quality content takes time to create; many of the past changes in YouTube algorithms explicitly pushed for churning out low-quality content frequently and rewarded it, while seriously disadvantaging irregular quality content (that moved elsewhere in the meantime). You get what you reward.
This can be seen easily if you go to YouTube and search "fortnite": every single top video is just over 6 minutes long, full of vapid nothingness related to the title, with perhaps 20-30 seconds of content related to the title that originally baited you into watching. The majority of what can be found through searching/filtering on YouTube is also hot steamy garbage, or large corporate channels. The recommendation engine sometimes throws in some gems, but rarely. The only video length filters are "short <4 mins" and "long >20 mins". Why on earth? It's like they randomly tried those values in 2008 and never looked at it again.
I noticed this a long time ago. Content creators were being rewarded for "daily" videos. So many of the channels I loved had videos where they came out and said "I'm going daily and, yes, the videos are going to suck until I daily upload good stuff." A few channels I watched just disappeared because of the daily upload reward/promote requirement. Algos don't work with creative works and human tastes.
Not just YouTube algorithms either. Almost all social media site algorithms and search engine ones (like say, Google's) seem to put a heavy emphasis on regular content rather than quality content.
And it's the main cause behind both journalism going through a massive decline in quality and so much low effort content showing up everywhere online.
AFAIK you can do that? Hit the menu on a video recommendation, "Not Interested" -> "Tell Us Why" -> "I'm not interested in this channel".
IMO, though, blacklisting has been mostly proven to be insufficient. If you block, say, noted chud Sargon of Akkad, that doesn't mean that another half-dozen "Rationalist" "Thinkers" Who Just Happen To Be Chuds won't be recommended to you as well.
The Video Blocker extension for Chromium and Firefox keeps me sane on YouTube. It is surprisingly quick to block the main offenders and actually have decent recommendations in the sidebar.
Honestly, it's pretty ridiculous that YouTube is being pressured to police the content on its platform. What worries me far more than extremist videos on YouTube, is the idea of a mega-corporation exerting outsized influence over the opinions we are exposed to. Is our democracy really so fragile that it cannot survive an open marketplace of ideas?
I think it's fairly reasonable that anybody who runs a recommendation service is held responsible for the content they recommend. YouTube isn't just a video hosting platform, it's a homepage providing recommendations, a service that automatically queues up a new video after the one you're currently watching, and an advertising network that directly funds the creation of the videos they host. YouTube is already putting their thumb on the scale - the least they can do is be responsible about what they recommend.
It's been pretty clear over the last couple years that our democracy really is so fragile that it can't survive one of the dominant media companies of the day actively promoting extremist and hateful content.
It's not an open marketplace, and each side is fighting a different fight:
>Never believe that anti-Semites are completely unaware of the absurdity of their replies. They know that their remarks are frivolous, open to challenge. But they are amusing themselves, for it is their adversary who is obliged to use words responsibly, since he believes in words. The anti-Semites have the right to play. They even like to play with discourse for, by giving ridiculous reasons, they discredit the seriousness of their interlocutors. They delight in acting in bad faith, since they seek not to persuade by sound argument but to intimidate and disconcert.
whatever gets them to stop recommending me BEN SHAPIRO [EVISCERATES / DEMOLISHES / LEAVES SPEECHLESS] LIBERAL [PROFESSOR / SMART PERSON / SPEAKER] OVER [SOMETHING BEN IS CLEARLY WRONG / TROLLING ABOUT] videos.
I think when critics accuse YouTube of "stepping on the scale" here, they under-consider how warped that scale is already. The existing YouTube algorithm isn't neutral or fair, and is laughably easy to game. It's already biased—it's just biased towards extremes.
YouTube waded into this moral quagmire a long time ago. Being biased towards "engagement" is just as troubling as having a more human-relatable bias.
Now they're starting to take responsibility for the recommendations this algorithm spits out. I think that's a good thing.
This seems like a re-implementation of television ratings and survey (Nielsen) methods. In this way, it's interesting how strategies on the commercial internet gravitate toward business models with a century of momentum (and student loans) behind them. Business chips away at the revolution.
> The video service generates most of its revenue through advertising and the business works best when as many people as possible are spending as much time as possible on YouTube. Hence executives’ obsession with engagement stats. Adding murkier metrics to the mix could crimp ad revenue growth.
This is the core point of it.
I've been critical of YouTube for some time, but if they are willing to forgo some revenue growth in order to deal with the current problems, I applaud them.
> The YouTube channel Red Ice TV broadcasts political videos with a “pro-European perspective,” which critics label as promoting white supremacy.
Bloomberg, you can call them Nazis. This is not a racist uncle that everyone needs to deal with, who can sometimes be reasoned with. These people are literally Nazis.
> YouTube declined to share details on how it uses metrics to rank and recommend videos. In a January blog post, the company said it was hiring human reviewers who would train its software based on guidelines that Google’s search business has used for years.
It's going to be tricky to get this to work, and I'm sure Google will make many mistakes by automating decision-making to a degree where it will prove to be meaningless, but finally there's going to be some human moderation. Thank you.
Well, the reality is that you don't generally get quality content when you allow people to publish whatever they want. Everyone has an opinion and viewpoint. That doesn't make them qualified opinions or viewpoints, and definitely doesn't make them worth considering. People have significantly mistaken and waaay overestimated the benefits of mass media in the hands of your everyday person.
It's becoming pretty clear the cost is not worth any perceived benefits (because I don't think you can really point to any benefits such models have given us).
And no, moderating and editorializing content is not dystopian. Whenever someone trots out that talking point, it's really a case of someone who read 1984 (and sometimes didn't actually read it) and decided to apply it to everything with absolutely no discretion and absolutely no respect for what the book actually said.
>Creating the right metric for success could help marginalize videos that are inappropriate, or popular among small but active communities with extreme views.
So, deplatform what's not mainstream, accepted by the establishment, favorable to Google...
I don't think that deplatform is the right idea here. Presumably, the videos will still be present and watchable. However, the algorithms used to sort and present videos must be seen as playing an editorial role, and that brings a different set of questions.
I would actually be glad if some of the more extreme and irresponsible opinions were deplatformed. It can't be argued that YouTube is expected to platform the full range of opinions.
There is however a strong likelihood of this being exploited by Youtube for its own purposes.
What a dystopian nightmare the internet is turning into. Anything that isn't in line with "right thought" isn't Responsible and is therefore hidden away? How did we end up here? How did Silicon Valley go from a place focused on ideas and building the future to generating systematic censorship and endless puritanical moral panic? We badly need some new blood guiding tech development.
At this point, they should just be regulated by the various world governments.
Though I don't like a world where Google can override a given country's laws, or where Google tries to solve the issue for the union of all the world's governments' laws.
I also don't like the possibility of hidden metrics driving things that have such huge social implications.
Exactly. If things keep going at this rate Google and Facebook will be shaping and distorting reality for a large portion of the population.
If 95% of your internet traffic goes through a search box or algorithm... And you spend 6 hours a day on the internet... And you're not savvy to these issues (as our readers are)... That's almost 50% of waking life spent consuming manufactured data which your brain (on some level) interprets as "real" without question. Terrifying.
I think 2019 is when this attitude started to get more mainstream within tech. I’m not opposed to it but I hope people understand that following local laws means that companies may end up enforcing laws from other cultures.
On a side note, the creators of YouTube ought to [at least informally] unionize. There's no reason Adam Neely should be getting copyright "strikes" for explaining music to the masses, or Cody's Lab should be forced to take down videos on mining with explosives [on his family's ranch]. YouTube needs to have a hotline for problem solving for their most profitable content providers, and until the creators collectively hit the pocket book, they'll face these disruptive yet easily solvable problems.
Google and YouTube want their AI or algorithms pointed AT you, not working for you.
It's not "ridiculous", it's table stakes. If you ever run an open community, you'll find that bad actors are inevitable.
1. Lower rank for clickbait titles
2. Lower rank for listicle / compilation / clip videos
3. Delist listicle / compilation videos with stolen content (most of them)
4. Lower rank for clickbait thumbnails
5. Lower rank for YT drama videos
6. Increased rank for 100% original videos
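A sketch of how adjustments like these might compose into a single ranking multiplier (the weights are made-up illustrations, not anything YouTube uses; a zero weight stands in for delisting):

```python
# Hypothetical penalty weights for the ranking adjustments listed above.
PENALTIES = {
    "clickbait_title": 0.8,
    "listicle": 0.7,
    "stolen_content": 0.0,   # delist entirely
    "clickbait_thumbnail": 0.8,
    "drama": 0.6,
}
ORIGINAL_BOOST = 1.3         # reward 100% original videos

def adjusted_rank(base_score, flags, is_original):
    """Multiply a video's base ranking score by each applicable penalty.

    flags: iterable of keys into PENALTIES detected for this video.
    A zero-weight flag (e.g. stolen content) zeroes the score outright."""
    score = base_score
    for flag in flags:
        score *= PENALTIES.get(flag, 1.0)  # unknown flags are ignored
    if is_original:
        score *= ORIGINAL_BOOST
    return score
```

Multiplicative penalties stack naturally: a clickbait-titled drama listicle is punished harder than any single offense, while original content gets a flat boost.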
If I get recommendations from a channel producing shit, I should be able to block the entire channel.
Sufficient good-faith blocks dock that channel's recommendation weighting.
The fact this isn't available even when logged in removes all incentive I have to log in to YT.
(Which is already near nil.)