(no title)
moofight | 6 years ago
Which is why automated image/video moderation solutions (such as Vision, Rekognition, Sightengine.com, Hive) will continue to grow: not only because they are cheaper and faster, but because they become a necessity, or at least a first filter to weed out the "worst" content.
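A minimal sketch of what a "first filter" could look like: an automated moderation API (such as the services above) returns labels with confidence scores, and only the ambiguous middle band is routed to human reviewers. The label format and thresholds here are assumptions for illustration, not any particular vendor's API.

```python
# Triage sketch: auto-remove high-confidence violations, auto-allow
# clearly clean content, and send only the uncertain middle band to
# human moderators. Thresholds and label shape are illustrative.

AUTO_REMOVE = 0.95  # at or above this confidence, remove without review
AUTO_ALLOW = 0.20   # at or below this confidence, publish without review

def triage(labels):
    """Route an item based on its highest-confidence moderation label.

    labels: list of {"name": str, "confidence": float in [0, 1]},
            as a generic moderation API might return.
    Returns "remove", "allow", or "human_review".
    """
    worst = max((l["confidence"] for l in labels), default=0.0)
    if worst >= AUTO_REMOVE:
        return "remove"
    if worst <= AUTO_ALLOW:
        return "allow"
    return "human_review"
```

The point of the middle band is exactly the trade-off discussed here: the automated layer absorbs the bulk of the clear-cut cases, so humans see far fewer of the worst images while still handling the context-dependent ones.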
mandevil | 6 years ago
Now, they felt much more empowered than Facebook's moderators do: they kept going because the goal was to put cuffs on the wrists of the guys who were doing this and get those kids away from them, and they could put up with all of the rest for that goal. The rest of the people they interacted with treated them like rockstars, because they were the ones who got kids away from the predators. They had frequent opportunities to take breaks and could set their own schedules; the only pressure was the guilt that the longer they delayed, the more time passed with the kids in the predators' hands.
Ultimately, feeling empowered to make a difference in the world is key, and if Facebook treated screening as an important job and gave their moderators more power to set their own working conditions I suspect that it would improve their mental health by quite a bit.
hos234 | 6 years ago
I hope they are investing in an army of shrinks/psychologists/sociologists to study, improve, and supervise these centers, because this stuff is not going away just by deleting content.
commandlinefan | 6 years ago
[deleted]
jbattle | 6 years ago
roguecoder | 6 years ago
The problem is that context matters even more in visual content than in textual content, and we still don't have algorithms that can parse context as successfully as humans can.
leetcrew | 6 years ago