Instagram Connects Vast Pedophile Network

51 points | stereoradonc | 2 years ago | wsj.com

16 comments

mrguyorama|2 years ago

>In many cases, Instagram has permitted users to search for terms that its own algorithms know may be associated with illegal material. In such cases, a pop-up screen for users warned that “These results may contain images of child sexual abuse,” and noted that production and consumption of such material causes “extreme harm” to children. The screen offered two options for users: “Get resources” and “See results anyway.”

>In response to questions from the Journal, Instagram removed the option for users to view search results for terms likely to produce illegal images. The company declined to say why it had offered the option.

What in the actual Fuck

jstarfish|2 years ago

I don't know what tags were in question but there are a lot of really-ambiguous terms out there, a lot of which get co-opted arbitrarily, so it's not outrageous to have allowed a click-through wherever doubt existed.

Minor examples: I tagged something saying only a #sissy would use it, but the only other content posted under that tag is feminization psyops porn.

#lolita should be explicitly outrageous, but it seems we use it to describe fashion style for both prepubescent children and the creepy adults who dress like them.

#FKK is some German acronym about nudism and families or something. I don't know if that's even allowed on Instagram but it comes up a lot on other platforms, usually adjacent to child porn.

The notorious Children of God cult used to deliberately conflate names of acts of sexual abuse with innocuous terms like "Bible study" so children reporting to outsiders would sound like idiots. "Please help, they make me do Bible study..."

This sort of induced confusion is how you get away with operating in plain sight.

gmerc|2 years ago

Well this opens them to an adversarial attack with no recourse for users.

annexrichmond|2 years ago

So is Apple going to remove Instagram from the App Store like they did with Tumblr?

bacchusracine|2 years ago

I'm just going to say it. Parts of the article practically read like a HOWTO guide. The sites' canned responses, and the lack of any real change after this kind of reporting, make it clear the green light is lit and everyone is being told full speed ahead.

...is this an advertisement for perverts?

SpaceBuddha|2 years ago

If you're actually an active pedophile then I'm sure you're already well aware of any tactics for accessing illicit content by the time they're showing up in the WSJ. For everyone else, the only way to understand the full depth of the problem and how pervasive it is, is to understand and examine the methods these people are using to connect and distribute that illicit content.

smittywerben|2 years ago

It's not some secret when more kids know this happens than parents. It's been full speed ahead for quite some time.

version_five|2 years ago

Unfortunately, the main thing that will come out of this will be calls to monitor and censor online activity even more, not anything that will actually protect children.

ksey3|2 years ago

SIO covers this as if they are fixing software bugs. No rage against the machine. They have become part of the machine. We have such a fit-for-nothing intellectual class.

lilboiluvr69|2 years ago

I'm surprised and confused somebody would use a public-facing website like Instagram to do something so illegal and taboo. Wouldn't they use Tor or at least a private Telegram chat? What, do they just paypal the money?

Seems like a recipe for getting arrested.

EDIT: The article says a lot of the accounts are supposed to be run by kids themselves so maybe that has something to do with it?

BasedAnon|2 years ago

Because the risk of getting caught is still quite low, and using a well known website gives you a bigger customer base

slimebot80|2 years ago

It's very odd that Musk is jumping on this.

Twitter has an even bigger problem. I haven't seen pedo stuff, but there is far more explicit imagery. Unlike Insta, Twitter allows nudity and pornography, and it seems to be a haven for anything and everything. I've seen blue checks posting long-form pornography, and it's not all great.