top | item 46223291

pizza|2 months ago

The platform operators have a responsibility to remove garbage from their site. I don’t see how it’s better if adults are the recipients of these alleged harms. And I definitely don’t see how the platform operators are going to clean up their act if — rather than being penalized — they can pretend that the problem has vanished into thin air because a specific category of vulnerable users is now de jure disappeared.

KaiserPro|2 months ago

> rather than being penalized

The problem is that, currently, doing any kind of content filtering (making illegal material hard to find, running a moderated semi-walled garden) plays right into the hands of the noisy fuckers brigade.

If I were to design a TV programme aimed at 11-16 year olds, where I played soft porn every 15 seconds, offered guides on how to run financial scams, and encouraged the children to hide from their parents while they watched, it would be banned instantly, regardless of how much "good" content I put in there.

People would say it's irresponsible to expose kids of that age to such things.

Yet, here we have social media doing just the same.

The reason we make it illegal to beat kids, sell them smokes, drugs, or booze, and generally treat them like shit is that we want well-rounded, functioning kids who are able to live long and illustrious lives as part of society.

Giving them a device that feeds them war, porn, rage bait, and huge lies, all for the profit of a few hundred people in America, seems somewhat misguided.

whimsicalism|2 months ago

I'm glad that when I was a teenager, the adults in my life were less concerned with protecting me from wrongthink. Are modern teenagers more or less credulous consumers of information than adults, I wonder?

roguecoder|2 months ago

In America, we haven't made it illegal to assault children. We should, but we haven't.

anonymous_sorry|2 months ago

In the same way, it's better that adults are the recipients of the harms of smoking, drinking, or gambling. It's still not desirable, but societies have settled upon thresholds beyond which people are assumed to have some capacity to take responsibility for their choices.

I'm not saying those thresholds are always right, or that they should definitely apply in this case, but it surely isn't an alien or non-obvious concept.

ricardobeat|2 months ago

Adults love 'garbage'. How do you define that?

There is also the problem that making platforms responsible for policing user-generated content 1) gives them unwanted political power and 2) creates immense barriers to entry in the field, which is also very undesirable.

pizza|2 months ago

I have no idea how to define it. I also don't know if I'm personally convinced one way or the other about the harms. I just think the platforms would probably be forced to make more substantial changes if that were the case.

AlexandrB|2 months ago

I don't want Mark Zuckerberg, or the government, deciding what's garbage. If they can empower users to filter this stuff out of their own accord, that's great.

The second problem is that the medium itself is garbage: algorithmic feeds strongly encourage clickbait and sensationalism. Removing content does nothing to change that dynamic.

dlisboa|2 months ago

So your plan is to do absolutely nothing?