I think the ethics here get complicated. For me, the line would be if the AI itself was trained on actual CSAM. As long as no one was sexually violated in the course of creating the final image, I see no problem with it from an ethical perspective; all the better if it keeps potential predators from acting on real children. Whether it does or not is a complex topic that I won't claim to have any kind of qualifications to address.
croes|19 days ago
The big question is whether those pictures could have the opposite effect.
mrighele|19 days ago
This means that a ban caused more harm to real children.
alexgieg|19 days ago
The same has been shown to be the case with depictions of sexual abuse. For some, it leads them to go out and act on it. For the majority of those predisposed to be sexual predators, it "satisfies" them, and they end up causing less harm.
Presumably the same applies to pedophiles. I remember reading a study suggesting this was the case, but the sample size was small, so the statistical significance was weak.