sparky_ | 11 months ago
Image generation models capable of generating this type of content would necessarily need to be trained on the real thing, the possession of which is inarguably illegal and immoral.
So how could the model be legally or ethically trained? And if they _cannot_ be legally or ethically trained, then how can the _use_ of those models be okay?
What will be the implications of this in cases where _real_ CSAM was produced or possessed? Certainly this opens the door to a whole plethora of new "it's AI art, I swear!" defenses. After all, how can one definitively prove whether CSAM is authentic or not, unless the chain of production is verified?
From the article:

> ...If purely private possession of AI-CSAM is constitutionally protected under current caselaw but production is not, then using AI models (even locally-hosted ones) to generate child obscenity in one’s own home is not wholly insulated from criminal prosecution. Subsequently transmitting it to someone else, especially someone underage, is also grounds for liability...
Can of worms, ye be released!
braiamp|11 months ago
wavemode|11 months ago
https://youtu.be/160F8F8mXlo
Diffusion models do possess some capability to synthesize ideas, but that capability does not necessarily generalize to every possible use case. So it's impossible to say for certain that this is what is happening.
delecti|11 months ago
LinuxBender|11 months ago
unknown|11 months ago
[deleted]
anigbrowl|11 months ago
roenxi|11 months ago
I doubt that is so. In practice they might be trained on the real thing, but models generalise pretty well. It is going to be technically possible to train a model on other material (children, nudity, and non-CSAM abuse scenes, or maybe not even that) and have it generate CSAM.
But even if it were true, that would only make training the model illegal and ethically dubious. We use a tonne of technologies whose creation was legally and morally dubious, and it's never been an ongoing issue before. So once the model is created there isn't a good reason to encumber it by how it was created.
davisp|11 months ago
> So once the model is created there isn't a good reason to encumber it by how it was created.
I am trying to be very specific here. I assume no untoward motivations from the parent commenter, and I am not intending to cast aspersions. Whoever wrote this: I feel no ill will toward you, and this is not meant as a personal slight.
And I will be very clear: this statement as written could probably be defended because of the “by how it was created” clause.
However, “So once the model is created there isn’t a good reason to encumber it” is so… fucking I don’t even know, because what the actual fuck?
I apologize for the profanity, I really do. But, really? Are you fucking kidding me?
These models should not exist. Ever. By any means. Do not pass Go. Go directly to jail.
I understand the engineering brain enough to contemplate abstract concepts with detachment. That’s all I think happened here. But holy fuck, please pause and consider things a bit.
RajT88|11 months ago
You are probably right, given what we saw with all the porn popup adware back in the '90s and 2000s. A friend of mine was a malware analyst for the FBI for a while.
In all the CSAM possession cases she heard about, the defense was "malware did it". In nearly all of those cases the jury convicted. In 100% of her cases, for sure.
At some point, using the defense that everyone else uses and fails with is probably going to become a liability. Shit, I am sure people are already trying this defense and failing!
grepfru_it|11 months ago
Similarly, I think using the AI art excuse may be an uphill battle, but not an impossible one.
JKCalhoun|11 months ago
Sick people will still go to jail.
dragonwriter|11 months ago
> Image generation models capable of generating this type of content would necessarily need to be trained on the real thing

This is absolutely not true. Generalization is a key capability of image generation models.
> Certainly this opens the door to a whole plethora of new "it's AI art, I swear!" defenses.
The worst justification for a criminal prohibition that I can think of is that it provides a convenient out for the difficulty of proving another, more clearly warranted, crime.
> After all, how can one definitively prove whether CSAM is authentic or not, unless the chain of production is verified?
"Beyond a reasonable doubt" is not, and never has been, "definitive".