(no title)
jbrennan | 5 years ago
As far as “worth it” goes, some people have to be exposed to it so long as we have law enforcement (but I’m certainly open to alternatives here). I’m not sure the train operator is a fair comparison, because seeing a suicide is an exceptional circumstance in their job, it’s not the norm. The content moderators, however, are sadly expected to be exposed to traumatizing content as part of their job description — it’s essentially the point of their job.
There are plenty of kinds of work we deem as hazardous to people’s health, and thus are either banned or regulated. I’m not sure if there’s a healthy way to expose people in these moderator jobs to the traumatizing content they face. It just doesn’t seem worth the tradeoff to endanger them like this.
dangus | 5 years ago
> [shut down] just large / public sites that require this sort of moderation
Let’s say I start a restaurant review website that allows comments and photos to be uploaded. It does modest business for a while, and I now have 50 employees. I’m following the law because my site isn’t big enough to violate this “no user content for big prominent websites” law.
Soon, it becomes big — a major competitor to Yelp — and I’ve got 1,000 employees. But suddenly this new law kicks in, saying I have to stop accepting uploads because my site is too high profile. Now I lay everyone off and go out of business.
This just isn’t a workable solution, at least not in the particular way you’re proposing it be constructed.
And really, you’re asking the second largest advertiser on the web (Facebook), a Fortune 50 company, to just pack its bags and shut down.
It’s not like I love Facebook or anything, but I’m sure their 45,000 employees wouldn’t be happy about that.