https://www.penny-arcade.com/comic/2023/12/01/algo-rhythms
> Gurb turned me on to a kick-ass book called "The Mysterious Case Of Rudolph Diesel," and I think you should read it if you're interested at all in the world, but you should buy it with cash in a town you don't live in and read it in a dimly lit cavern. Because if you don't, if The System finds out you read a book about a fascinating historical character and his mysterious disappearance, you'll be clocked immediately by their tendrils as… whoever this is.
I enjoy the comic and just about everything else everyone at PA develops, but it’s Holkins’s thrice-weekly displays of absolute word sorcery that have kept me reading since 1998.
This is not at all surprising. Make a brand new account and watch a few welding or machining videos. You'll be getting PragerU, Daily Wire and Tucker clips in no time. It goes downhill quickly from there. The targeting is pretty explicit.
"Targeting" isn't the right word, "statistics" is. These are statistics based, data driven, mechanisms that do their best to find things that give the best chance of you positive interacting with them, with positive meaning money going towards the people behind the algorithm.
Daily Wire isn't particularly 'extreme' or 'conspiracy theorist'. I am less familiar with Tucker Carlson, but while I find his takes on Russia distasteful, I haven't seen any extremism.
In some ways I think this is a tricky problem, since you want users to get deeper into some topics but not into ones considered "problematic", and defining those is inherently political.
It seems like you could define some idea of "depth" into a topic (based on how far outside a typical viewer's patterns it is) and only generate recommendations for items that aren't far outside the norm, but this would lead to a lack of depth for recommendations in niches.
Maybe a middle ground would be to treat sensitive topics differently in terms of "vertical" recommendations: e.g., explicitly mark some categories as safe and let recommendations go deeper within them, allow only "horizontal" recommendations for unknown topics, and maybe prevent recommendations "into" a sensitive topic from the outside.
So... if you're watching train videos you might get to see even more niche ones, but welding won't get Fox News recommended to you, and watching Fox News won't surface Alex Jones recommendations (a rough sketch of this idea, in code, follows below).
I pick on the right here since it's the topic at hand (and I'm left-leaning myself), but I think radicalization is an issue on the left as well (though frankly my political opinions make me think it is less impactful there, mostly because the way people radicalize on the left tends, I believe, to affect less marginalized people, or to play out in policy terms rather than hitting people who are already beaten down).
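A rough sketch of the "vertical vs. horizontal" idea described above, under the assumption that every video carries a topic label and a "depth" score (how far it sits outside a typical viewer's patterns); the category names, the safe-list, and the depth threshold are all invented for illustration.

```python
# Rough sketch of the "vertical vs. horizontal" recommendation idea above.
# Topic labels, the safe-list, and the depth threshold are invented for illustration.
SAFE_FOR_DEPTH = {"trains", "welding"}   # niches where going deeper is considered fine
SENSITIVE = {"politics"}                 # never recommend *into* these from outside

def allowed(current, candidate):
    """Decide whether `candidate` may be recommended after `current`.

    Each argument is a (topic, depth) pair, where depth measures how far the
    video sits outside a typical viewer's patterns (0 = mainstream).
    """
    cur_topic, cur_depth = current
    cand_topic, cand_depth = candidate

    if cand_topic != cur_topic:
        # "Horizontal" move between topics: block hops into sensitive topics.
        return cand_topic not in SENSITIVE
    if cur_topic in SAFE_FOR_DEPTH:
        return True                      # "vertical" moves allowed in safe niches
    return cand_depth <= cur_depth + 1   # elsewhere, only small steps deeper

# Niche train videos stay recommendable; welding never leads into political content.
print(allowed(("trains", 2), ("trains", 5)))     # True
print(allowed(("welding", 1), ("politics", 0)))  # False
```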
While left-leaning users are only presented with a healthy selection of diverse, well-argued videos expressing a range of perfectly reasonable viewpoints.
This seems a bit suspicious because there is a trend of defining right-wing content as extremist. And I'm not interested in whether something is classified as a conspiracy so much as whether it is true.
I'll pick on Ivermectin during COVID as an interesting case. Ivermectin is an antiparasitic, so, obviously, if you have two groups and one has parasites but the other doesn't, the parasite-free group will get better COVID results. So, as expected, people treated with Ivermectin got better COVID outcomes.
It took a long time to get the message out explaining that effect, because in the spheres I listened to, everyone who pointed out the statistically significant result got shut down with logical fallacies. "Conspiracy theorist" was definitely one of them.
I'd rather be completely correct, but I'm happy to fall for the occasional conspiracy that is backed by statistically significant evidence. People who fall for that sort of mistake are going to get better results long term than people who ignore evidence. But this study would classify that sort of evidence-based reasoning as a right-winger being led into extremist conspiracy content. I mean, I dunno. A branch of the right wing believes in looking at primary evidence. That means they get things wrong, and sometimes right, in ways out of sync with the mainstream conversation.
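The confounder described above is easy to reproduce in a toy simulation. In the sketch below the drug is assumed to have no direct effect on COVID outcomes at all; it only clears parasites, and parasites are assumed to worsen outcomes. Every probability is made up purely to illustrate the statistical point.

```python
# Toy simulation of the parasite confounder: the drug has zero direct effect on
# COVID outcomes, yet the treated group still does measurably better.
# All probabilities below are invented for illustration.
import random

random.seed(0)

def bad_outcome_rate(treated, n=100_000, parasite_prevalence=0.4):
    bad = 0
    for _ in range(n):
        has_parasites = random.random() < parasite_prevalence
        if treated:
            has_parasites = False        # the antiparasitic clears the infection
        p_bad = 0.10 + (0.10 if has_parasites else 0.0)  # no direct COVID effect
        if random.random() < p_bad:
            bad += 1
    return bad / n

print("treated:  ", bad_outcome_rate(True))    # roughly 0.10
print("untreated:", bad_outcome_rate(False))   # roughly 0.14
```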
The whole takeaway of the study is that the effect is significantly more pronounced on the right. More categorical details are in the supplementary material.
The journey to the dark side starts with some interesting Joe Rogan video, which takes you to Jordan Peterson, then Matt Walsh, and ends with Stefan Molyneux, Lauren Southern, and Alex Jones. That is the bottom of the YouTube iceberg. Below that point videos get mass-reported and taken down.
I'd love to know what the "left wing" equivalents to the above are. Looooots of comments in here claiming the "same thing" happens with left-wing content, but I'm not aware of extremely popular left wing outlets that lie about literally everything they report on the way Alex Jones, PragerU, and the like do.
This is true for most algorithms on sites with user-created content. It also isn't exclusive to right-leaning content; it's the same for left-leaning and other types of content. It's just how algorithms work.
The real question should be: should we prevent this type of content from getting recommended, and where are the lines?
As a side note, I'd love to see a Twitter-style Community Notes feature implemented on YT. It's the one good feature Twitter has implemented in a long while. And yes, YT has notes, but they're done by YT themselves (the COVID ones, for example).
I'd genuinely like to see what moderate right leaning content is even available for consumption. The only thing anyone seems to talk about anymore is the grifters and lunatics.
I suspect that the overall premise of the paper is correct, but it's interesting that they repeatedly reference lists of what they call "problematic" right-wing categories such as "IDW", "Alt-right", "Alt-lite", "AntiSJW", "Conspiracy", "MRA", "ReligiousConservative", "QAnon", and "WhiteIdentitarian", while they seem to recognize only a single category as extremist left content: the "Socialist" category.
If you're specifically looking out for a long list of right-wing extremist categories but only one category of left-wing extremist content, is it any wonder you'd find that YouTube pushes people toward extremist right-wing material to a greater extent than toward the extremely limited left-wing extremist content being considered?
The study considers the following "Very Left" [1]:
- MSNBC
- Senator Bernie Sanders
- Elizabeth Warren
- Vox
I mean, I suppose it is understandable if your political experience is solely American. But I do wonder: if one considers these "very left", what will happen when they come across political concepts such as Anarchism? If they read Malatesta's writings, for example, would their minds just explode?
[1]: https://www.pnas.org/doi/10.1073/pnas.2213020120#supplementa...
> In this study, the research team defined problematic channels as those that shared videos promoting extremist ideas, such as white nationalism, the alt-right, QAnon and other conspiracy theories. More than 36% of all sock puppet users in the experiment received video recommendations from problematic channels. For centrist and left-leaning users, that number was 32%. For the most right-leaning accounts, that number was 40%.
They defined problematic channels as anything specifically espousing far-right ideas, and found that right-wing users were only slightly more likely to be recommended content from them.
It's kind of disappointing they couldn't find something problematic or conspiratorial from the left, even just for the sake of comparison.
> For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content. Recommendations for left-leaning users on YouTube were markedly fewer, researchers said.
This depends on the researchers' definitions of 'extremism' and 'conspiracy theories'.
- Recently we've seen many left wing people state that disassembling people in front of their families - surely an 'extreme' act - is a 'beautiful act of resistance', and that calls for genocide against Jewish people (surely also 'extreme') may not constitute hate speech in some contexts.
- For the last 7 years we've had many people believe in the Russiagate conspiracy theory.
- I'm not sure "problematic" has any real meaning.