item 46870988

moolcool | 26 days ago

Are you implying that it's not abuse to "undress" a child using AI?

You should realize that children have committed suicide before because AI deepfakes of themselves have been spread around schools. Just because these images are "fake" doesn't mean they're not abuse, and that there aren't real victims.


chrisjj | 26 days ago

> Are you implying that it's not abuse to "undress" a child using AI?

Not at all. I am just saying it is not CSAM.

> You should realize that children have committed suicide before because AI deepfakes of themselves have been spread around schools.

It's terrible. And when "AI"s are found spreading deepfakes around schools, do let us know.

mrtksn | 26 days ago

CSAM: Child Sexual Abuse Material.

When you "undress" a child with AI, whether publicly on Twitter or privately through DM, that child is abused using the material the AI generated. Therefore, it is CSAM.

enaaem|26 days ago

[deleted]