dentemple | 5 months ago
I (40m) don't think I've ever seen literal flashing or literal porn on TikTok, and my algorithm does like to throw in thirst content between my usual hobby stuff.
Are they making the claim that showing porn is normal behavior for TikTok's algorithm overall, or are they saying that this is something that is specifically pervasive with child accounts?
mothballed|5 months ago
netruk44|5 months ago
Approximate location, age, mobile OS/browser, your contacts, which TikTok links you open, who generated the links you open, TikTok search history, how long it takes you to swipe to the next video on the For You page, etc.
I don’t think it's really possible to say what TikTok’s algorithm does “naturally”. There are so many influencing factors to it (beyond the promoted posts and ads which people pay TikTok to put in your face).
If you sign up for TikTok on an Android and tell it you’re 16, you’re gonna get recommended what the other 16-year-olds with Androids in your nearby area (based on IP address) are watching.
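To make that "similar users nearby" idea concrete, here's a minimal, purely illustrative sketch of a cohort-based recommender. The `User` fields and the `cohort_recommendations` function are invented for this example; TikTok's actual system is not public and certainly works nothing like this toy:

```python
# Hypothetical cohort-based recommender: score candidate videos by how
# often users in the same rough cohort (age bracket, region, device OS)
# have watched them. Purely illustrative, not TikTok's real logic.
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class User:
    age: int
    region: str                      # e.g. inferred from IP address
    os: str                          # "android" or "ios"
    watched: list = field(default_factory=list)

def cohort_recommendations(target: User, users: list[User], top_n: int = 5) -> list[str]:
    """Rank videos by popularity among users similar to `target`."""
    counts = Counter()
    for u in users:
        same_cohort = (
            abs(u.age - target.age) <= 1
            and u.region == target.region
            and u.os == target.os
        )
        if same_cohort:
            # Only count videos the target hasn't already seen.
            counts.update(v for v in u.watched if v not in target.watched)
    return [video for video, _ in counts.most_common(top_n)]

# A brand-new 16-year-old Android account inherits whatever the nearby
# 16-year-old Android users are already watching.
peers = [
    User(16, "us-east", "android", ["dance_clip", "thirst_trap", "gaming"]),
    User(16, "us-east", "android", ["thirst_trap", "prank"]),
    User(40, "us-east", "ios", ["woodworking"]),
]
new_user = User(16, "us-east", "android")
print(cohort_recommendations(new_user, peers))
# ['thirst_trap', 'dance_clip', 'gaming', 'prank']
```

The point of the toy example is just that a fresh account with zero watch history still gets strong recommendations, because the signals listed above (age, rough location, device) already place it in a cohort.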
gadders|5 months ago
It might be because I always block anyone with an OF link in their bio, but then that policy doesn't work on Insta.
ivape|4 months ago
We’re a derelict society that has become numb: “it’s just a thirst trap.”
We’re in the later innings of a hyper-sexualized society.
Why it’s bad:
1) You shift male puberty into overdrive
2) You continue warping young female concepts of lewdness and body image, effectively “undefining” it (lewdness? What is lewdness?).
3) You also continue warping male concepts of body image
mvieira38|5 months ago
causal|5 months ago
saurik|4 months ago
fsckboy|4 months ago
The latter is what they tested, but they didn't claim it's specifically pervasive.
You quote the article, so it seems like you looked at it, but the questions you're curious/skeptical about are addressed in its opening paragraphs. It's fine to be skeptical, but they explain their methodology, and it's different from the experience you're relying on:
>Global Witness set up fake accounts using a 13-year-old’s birth date and turned on the video app’s “restricted mode”, which limits exposure to “sexually suggestive” content.
>Researchers found TikTok suggested sexualised and explicit search terms to seven test accounts that were created on clean phones with no search history.
>The terms suggested under the “you may like” feature included “very very rude skimpy outfits” and “very rude babes” – and then escalated to terms such as “hardcore pawn [sic] clips”. For three of the accounts the sexualised searches were suggested immediately.
>After a “small number of clicks” the researchers encountered pornographic content ranging from women flashing to penetrative sex. Global Witness said the content attempted to evade moderation, usually by showing the clip within an innocuous picture or video. For one account the process took two clicks after logging on: one click on the search bar and then one on the suggested search.
NoGravitas|4 months ago
Hizonner|4 months ago
They are in the business of whipping up outrage, and should not be given any oxygen.
lupusreal|4 months ago
Clicking on thirst trap videos?
thehodge|5 months ago
elevation|4 months ago
I assume that the offending content was popular but hadn’t been flagged yet and that the algorithm was just measuring her interest in a trending theme; it seems like it would be bad for business to intentionally run off mainstream users like that.
yapyap|5 months ago
ChromaticPanic|4 months ago
IanCal|5 months ago
InitialLastName|4 months ago
Kenji|5 months ago
[deleted]