
Instagram Recommends Sexual Videos to Accounts for 13-Year-Olds, Tests Show

96 points | Umofomia | 1 year ago | wsj.com

98 comments


Zelphyr|1 year ago

I know this sounds defeatist, but I don't know why anyone even bothers to report things like this anymore. Meta doing something they're not supposed to do, even after promising they'd never do it again? This isn't news. This is a Thursday.

Our governments aren't going to do anything about it. We all know it. In part because not enough constituents are complaining loudly enough (or, you know, at all) about these companies.

Speaking of which, there is a very quick solution to these companies misbehaving: us. We don't even need to cancel our accounts (wouldn't matter anyway, they won't really delete them). Just stop using the services and let the companies know why. Let them know that when they truly take responsibility for their actions and behave, maybe we'll come back. Until then, we're refusing to be good products for them to sell.

But, let's face it, there are too many people out there who foolishly think they can't have a social life without social media, despite the overwhelming evidence that that's not true. Too many people think they can't do without Amazon, Apple, Netflix, Google, or Microsoft, among many others. So nobody will do or say anything, and these companies will continue abusing us.

I mean, damn. If we can't get fired up enough to do something about one of them blatantly showing sexual material to minors then we haven't hit rock bottom from this particular drug.

ksaun|1 year ago

Besides just not using these services, wouldn't it also be necessary to divest any investments in any of these companies (even if through mutual/index funds, etc.)?

I agree it is hard to not feel defeatist.

crowcroft|1 year ago

If this is the innovation we get from Section 230, then we need carve outs.

photochemsyn|1 year ago

The hijacking of basic human emotions and drives in the name of some ulterior goal - turning a profit, establishing authority, etc. - is a notion that encompasses and also predates social media, the internet, television and radio advertising, the invention of the printing press, and the tactics of priests and kings throughout recorded history. It's the most fundamental feature of all such propaganda.

If we wanted to really undermine such efforts, we'd be teaching children about the history of such propaganda tactics and methods from quite a young age. This has been tried before in the USA, but it got shut down quickly at the beginning of WWII:

https://en.wikipedia.org/wiki/Institute_for_Propaganda_Analy...

game_the0ry|1 year ago

I am disappointed that parents allow 13-year-olds to have iPhones + social media accounts.

The downsides have been well-documented. [1] Can we all collectively get together and say "no" to that stuff until like 18?

[1] https://www.youtube.com/watch?v=CI6rX96oYnY

jchung|1 year ago

I have kids who will soon be teens. They don't have phones yet, but I can see the decision looming: my ability to communicate / coordinate with them plus the fact that a huge portion of their social life will migrate online on one hand vs. all of the dangers on the other.

This is why a lot of parents start with smart watches or restricted phones. They try to get the communication / coordination benefits without the online social risks. But that only lasts so long.

I'm not sure how I'll navigate it. Probably not by saying "no to that stuff until 18".

stawik22|1 year ago

I totally understand your point and the others commenting here. IMHO the main problem is other parents giving their children their "old" phone because they want the latest one.

There is a finite amount of control that you can have; there's always a friend with another phone where they can watch everything.

Communication is the key. We talk a lot with our 11-year-old about the dangers and pitfalls of social media, electronic games, etc. Yes, he has a simple smartwatch (bare minimum: calls, location), yet I think we've managed to develop a healthy digital hygiene. I wish you the best of luck!

nomel|1 year ago

I'm a strong believer that smart watches cover plenty: music, phone calls, texts, schedule/reminders, and even light reading.

2OEH8eoCRo0|1 year ago

Platforms should be liable for recommendations

When are we gonna hold these companies accountable and stop accepting excuses?

gumby|1 year ago

I am always surprised by articles like this. Instagram never recommends sexual videos to me.

tennisflyi|1 year ago

It’s very easy to train it.

1. Search “TikTok dance”

2. Scroll until one is at the beach (and thus features bikinis)

3. Let it loop a couple times

4. Exit the app and come back - that's your new feed

laweijfmvo|1 year ago

The issue with even saying you don't like something seems to be that _everything_ contains some sort of sexualized being. Into fitness? here's a mostly naked person doing fitness. Hiking? Same. Gardening? Yup! Etc... How can the algorithm possibly disambiguate?

mandibeet|1 year ago

It's striking how different people's experiences with social media platforms can be. Same for me on Instagram, but YouTube Shorts sometimes does.

xyst|1 year ago

search for "transparent clothing influencers" on a fresh account and you will get a flood of them

rdtsc|1 year ago

> In one clip that Instagram recommended to a test account identified as 13 years old, an adult performer promised to send a picture of her “chest bags” via direct message to anyone who commented on her video. Another flashed her genitalia at the camera.

When it comes to encryption and privacy, the legislators just can't wait to jump in and "save the children." Let's see how vigorously they're going to investigate and prosecute Meta for showing inappropriate things to children.

> On TikTok [...] new teen test accounts that behaved identically virtually never saw such material—even when a test minor account actively searched for, followed and liked videos of adult sex-content creators.

Well, isn't that embarrassing? The evil TikTok they are trying hard to ban, and for good reasons I think, is doing a better job "protecting our children" than Meta.

pizzathyme|1 year ago

The reason for the US government banning TikTok is not that TikTok is a worse product (it is superior).

The reason is that it is a massive geopolitical risk.

People often conflate these two.

burningChrome|1 year ago

I would be interested to know how many people are coming from other social media platforms TO TikTok for this kind of content. Back in the day, almost every subreddit featuring porn or women posting pics would also link to the posters' TikTok accounts, and many of the videos were cross-posted there. So a woman could post a few pics or videos, link to her TikTok account, and then get people to subscribe to her account there.

There's also an assumption that users are 100% honest about their age. Simply confirming you are 18 gives you an easy end-run around the content filtering. Even at 8-10, I had friends who were quite ambitious about getting their hands on porn and other material we weren't supposed to have. If the bar is simply lying about your age, that's not a very good way to try to filter content from underage users.

infecto|1 year ago

Even more damning, imo, is that Instagram is rife with those "young model" accounts.

Makes sense though. When you have a product produced in an authoritarian state, they probably spend a lot more time on censoring for better and worse.

atestu|1 year ago

Same for Snap, which is surprising given its reputation:

> Despite their systems’ similar mechanics, neither TikTok nor Snapchat recommended the sex-heavy video feeds to freshly created teen accounts that Meta did, tests by the Journal and Edelson found.

This imo proves that Meta isn't even trying:

> In some instances, Instagram recommended that teen accounts watch videos that the platform had already labeled as “disturbing.”

This could be a very simple toggle, it's disingenuous to blame everything on the "black box" of the "algorithm."

strangemonad|1 year ago

It’s almost as if it’s not a single factor issue and insta could stand to improve its under-age content filters AND TikTok can also be a threat because of its ties to China.

CommanderData|1 year ago

The TikTok ban is political, not about saving the children. Perhaps it's to stop them seeing the horrors in the Middle East and tune them into $MindDestroying content instead.

cue_the_strings|1 year ago

I'm genuinely curious, how old were you when you were first exposed to sexual content? I think I was like 7 or 8.

Like, we had a channel (https://en.wikipedia.org/wiki/RTV_Palma) publicly broadcasting porn after midnight (I don't remember watching that) and all the older (~15YO) kids around the neighborhood were collecting porn magazines from who knows where and hiding them around in "caches".

I remember all the older kids being really excited about it, and us younglings being curious but grossed out. There was some pressure to pretend you were interested; kids love pretending they're more grown up than they are. Funnily enough, I remember having the same feeling about the football World Cup: not getting the whole fuss about it, but being expected to be interested.

Couple years later, I naturally found out what they were excited about... For football I never did, though.

arp242|1 year ago

I don't think you can really compare staying up late to sneakily watch (typically soft-core) porn, and things like that, to having this content constantly pushed on you by a device in your pocket every day.

It's really a matter of degree and scale.

dkarl|1 year ago

I was exposed to porn before I was even interested. I think I was seven. My friend's parents had the complete cable package that carried all the channels the local provider offered (which at the time I think was around thirty channels) and one of them showed porn. I only saw it once when I slept over, so I don't know if it was a dedicated porn channel or if it was just late night. The few minutes I saw was soft core, but looking back in retrospect, I'm pretty sure it was leading up to a hard core scene. My friend was also seven, but he seemed pretty obsessed with it. I left him to it and played video games instead, and I never slept over again.

I didn't get exposed to porn after that until a couple of high school trips where we got to stay in hotel rooms and, of course, figured out how to access the pay-per-view porn. But that was it. The "inspiration" for my teenaged solo sexual activities came from catalogs with models in lingerie, relatively tame TV, the Sports Illustrated swimsuit issues, tennis magazines, and my imagination. The closest thing I had to porn when I lived with my parents was pausing a rented VHS tape of an R-rated movie, on one of the rare occasions when I had the house to myself.

throwaway22032|1 year ago

When I was younger porn was a properly seedy thing that people would consume but almost no-one knew anyone that 'did it'.

Nowadays you have OnlyFans models or similar basically all over every social media platform, businesses trying to sell it as a legit lifestyle, etc. It's completely different, because it's becoming much more bidirectional and open now.

ajsnigrutin|1 year ago

But it's different... you actually had to seek out porn to find it back then (like staying up past your bedtime, turning on a relatively bad TV station, or sneaking magazines from somewhere).

Now it's seeking you... open instagram, and you get porn ads, even if you're 13.

On the other hand, boobs in shower gel ads were pretty normal back then, and no one really got excited by them. Also asses in thongs in suntan lotion ads ( https://svetkapitala.delo.si/media/images/20200217/322547.wi... )

cmrdporcupine|1 year ago

Culture around this stuff is completely different in the Anglo-American world, for one. Europeans are accustomed to nudity and the like everywhere, and it's not even particularly ... anything.

But also, 1980s porn magazines or whatever are very different from today's porn content. A brief visit to any of the free "tube" sites exposes you immediately to a level of intensity that you wouldn't get from Playboy or Penthouse in the 80s. Right on the front page you'll find choking, incest ("stepdaughter"), BDSM, etc.

I don't particularly want my teenage son traipsing into that before the context and patterns of healthy mature sexual relationships have been established.

But more than anything, I don't want it pushed on him.

gigatexal|1 year ago

Maybe social media was a mistake. Oh well… Pandora’s box and all that

Congress won’t do anything; it’s too mired in infighting and lobbyists. And these companies’s better angels won’t do anything about this.

throwway120385|1 year ago

Maybe using opaque algorithms to score and assign prominence to specific content based on ill-defined metrics like "engagement" was a mistake.

lalos|1 year ago

I wonder if whoever ran this test prefers this type of content themselves, and whether the account was tied to a Wi-Fi network, device, linked account, IP, or location. There's no detail about the test setup. Too many factors are in play to draw conclusions, and there's not much transparency for reproducibility or for spotting flaws in the data collection.

jtbayly|1 year ago

It doesn’t matter. The account was marked with an age. IG pushed sexual content to it.

You can add as much other information to the picture as you want, but that’s the black and white issue.

Besides, how often do 13-year-olds have their own Wi-Fi and IP address?

crowcroft|1 year ago

Ah yes, too complex to really understand what's going on so we will hand wave it away.

It should be a hard rule that people under a certain age CAN NOT have this kind of content recommended, there should be precisely 0 ways for the algorithm to promote sexual material to children.

tennisflyi|1 year ago

Sex is everywhere. The internet has a lot of sex; it has been kind of the default since inception. Wild that people are just now learning of its presence.

octopoc|1 year ago

Sex ed via profit-seeking corporations is going to be pretty different, and arguably much more unnatural, than sex ed via parents and other people who care about you.

iamacyborg|1 year ago

There’s a pretty big difference between going out of your way to look for it vs it being served up to you in a feed.

EgregiousCube|1 year ago

It's true, it just used to be behind the scenes available to those who seek it out. Like that skeezy room at the back of the video store that had curtains blocking it off.

IMO the real "problem" here isn't technological, political, or corporate - it's just a slide of social norms toward hyper-permissibility of immodest or bad behavior. The resolution will likely be social as well, and people realizing that kids are getting exposed to smut will likely hasten that. Laws might get passed, corporations might change their policies, but only after the pendulum swings back socially.

recursive|1 year ago

You might have misunderstood if you believed this article was about people just learning that sex is on the internet.