DoesntMatter22|2 years ago
_kbh_|2 years ago
How is Twitter unbanning someone who was banned for posting child pornography "being more aggressive" about getting rid of it?
https://www.washingtonpost.com/technology/2023/07/27/twitter...
>> Experts said that even photos that are partially obscured, such as the image shared by the influencer, typically qualify as illegal child sex abuse material, or CSAM.
>> In fact, the image in question had drawn more than 3 million views and 8,000 retweets, according to Twitter statistics on a cached version of the tweet from Tuesday.
defrost|2 years ago
It would be more accurate to say that they have been more aggressive in preventing third-party observers from tracking CSAM on Twitter:
Twitter Failing To Deal With Child Sexual Abuse Material, Says Stanford Internet Observatory (Jun 6, 2023)
https://www.forbes.com/sites/emmawoollacott/2023/06/06/twitt...
* Researchers at the Stanford Internet Observatory say the company failed to deal with 40 items of child sexual abuse material (CSAM) over a period of two months between March and May this year.
* Research such as this is about to become far harder—or at any rate far more expensive—following Elon Musk’s decision to start charging $42,000 per month for its previously free API.
dude187|2 years ago