
pogue | 27 days ago

Finally, someone is taking action against the CSAM machine that has seemingly been operating without penalty.


tjpnz | 26 days ago

It's also a massive problem on Meta. Hopefully this action isn't just a one-off.

direwolf20 | 26 days ago

Does Meta publish it themselves, or is it user-generated?

chrisjj | 27 days ago

[deleted]

mortarion | 27 days ago

CSAM does not have a universal definition. In Sweden, for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response. If you take a picture of a 14-year-old girl (the age of consent is 15) and use Grok to put her in a bikini, or make her topless, then you are most definitely producing and possessing CSAM.

No abuse of a real minor is needed.

moolcool | 27 days ago

Are you implying that it's not abuse to "undress" a child using AI?

You should realize that children have died by suicide after AI deepfakes of them were spread around their schools. Just because these images are "fake" doesn't mean they're not abuse, or that there aren't real victims.

secretsatan | 27 days ago

It doesn't mention Grok?