>In many cases, Instagram has permitted users to search for terms that its own algorithms know may be associated with illegal material. In such cases, a pop-up screen for users warned that “These results may contain images of child sexual abuse,” and noted that production and consumption of such material causes “extreme harm” to children. The screen offered two options for users: “Get resources” and “See results anyway.”
>In response to questions from the Journal, Instagram removed the option for users to view search results for terms likely to produce illegal images. The company declined to say why it had offered the option.
I don't know which tags were in question, but there are plenty of genuinely ambiguous terms out there, many of which get co-opted arbitrarily, so it's not outrageous to have allowed a click-through wherever doubt existed.
Minor examples: I tagged something saying only a #sissy would use it, but the only other content posted under that tag is feminization psyops porn.
#lolita sounds like it should be obviously beyond the pale, but we also use it to describe a fashion style for both prepubescent children and the creepy adults who dress like them.
#FKK is a German acronym (Freikörperkultur, "free body culture") for the family-oriented nudist movement. I don't know if that's even allowed on Instagram, but it comes up a lot on other platforms, usually adjacent to child porn.
The notorious Children of God cult used to deliberately conflate names of acts of sexual abuse with innocuous terms like "Bible study" so children reporting to outsiders would sound like idiots. "Please help, they make me do Bible study..."
This sort of induced confusion is how you get away with operating in plain sight.
I'm just going to say it: parts of the article read practically like a HOWTO guide. The platforms' canned responses, and the lack of real change after being reported on, make it clear the green light is lit and everyone is being told full speed ahead.
If you're actually an active pedophile, I'm sure you're already well aware of any tactics for accessing illicit content by the time they're showing up in the WSJ. For everyone else, the only way to understand how deep and pervasive the problem is, is to examine the methods these people use to connect and distribute that content.
Unfortunately, the main thing that will come out of this will be calls to monitor and censor online activity even more, not anything that will actually protect children.
SIO covers this as if they're fixing software bugs. No rage against the machine; they have become part of the machine. We have such a fit-for-nothing intellectual class.
I'm surprised and confused that somebody would use a public-facing website like Instagram to do something so illegal and taboo. Wouldn't they use Tor, or at least a private Telegram chat? What, do they just PayPal the money?
Seems like a recipe for getting arrested.
EDIT: The article says a lot of the accounts are supposed to be run by kids themselves so maybe that has something to do with it?
Twitter has an even bigger problem. I haven't seen pedo stuff, but there is far more explicit imagery. Unlike Insta, Twitter allows nudity and pornography, and it seems to be a haven for anything and everything. I've seen blue checks posting long-form pornography, and it's not all great.
mrguyorama|2 years ago
What in the actual Fuck
bacchusracine|2 years ago
...is this an advertisement for perverts?