u12|3 months ago
> Meta required users to be caught 17 times attempting to traffic people for sex before it would remove them from its platform, which a document described as “a very, very, very high strike threshold."
I don’t get it. Is sex-trafficking-driven user growth really so significant for Meta that they would have such a policy?
FireBeyond|3 months ago
Like Apple's "scanning for CSAM," where people said, "Oh, there's a threshold so it won't false-report; you have to have 25+ images (or whatever) before it will." Okay, that avoids false reports, but that policy is one messy story away from "Apple says it doesn't care about the first 24 CSAM images on your phone."
delis-thumbs-7e|3 months ago
We can speculate. I think they just did not give a fuck. Limiting grooming and abuse of minors usually requires limiting those minors' access to various activities on the platform, which means those kids go somewhere else. Meta specifically wanted to promote its use among children under 13 to stimulate growth; that this often made the platform dangerous for minors was not seen as their problem.
If your company is driven by growth über alles, à la venture capitalism, growth goes before everything else. Including child safety.
FireBeyond|3 months ago
> I think they just did not give a fuck.
It's that people like Zuck and Sandberg were so comfortably ensconced in their little worlds of private jets and Davos that they really could not care less about anything that didn't affect them (and really, the vast majority of issues facing Meta don't affect them, only their bonuses and compensation).
Your actions will lead to active harm? "But not to me, so, so what, if it helps our numbers."