More precisely, she wants the distribution of nudes without consent, real or faked, to be considered an offense. This isn't about the tool, but what's being done with it.
That doesn't seem like an accurate summary. The actual quotes address the generation of those images, not just the distribution:
> "Parliament needs to have the opportunity to debate whether nude and sexually explicit images generated digitally without consent should be outlawed, and I believe if this were to happen the law would change."
And it's about restricting the tools, not just what's done with them:
> "If software providers develop this technology, they are complicit in a very serious crime and should be required to design their products to stop this happening."
This is an important difference, and for me at least it's the difference between disagreeing with the proposal (“ban the tool”) and agreeing with it (“make non-consensual nude distribution illegal”).
It’s a shame the headline communicates the former when it seems the proposal is the latter.
I think outlawing this tool would be counter-productive.
One of the reasons a nude leak is damaging is that it's a rare and noteworthy event. If, because of this tool, everyone has nudes of them floating around, it would become a normal thing and would actually remove most of the damage from real nudes leaking by providing plausible deniability (assuming anyone even cares at that point - if the world is drowning in nudes of everyone, the real thing will probably go unnoticed anyway).
Outlawing the tool wouldn't actually stop malicious use of it, but because only criminals would use it, its (rarer) use would be more damaging than if anyone could legally use such a tool and nudes stopped being a noteworthy event.
In America, the federal child pornography law applies only to depictions of an actual child (and you have to know it, for possession offenses, though that's another matter). But the Justice Department has long taken the position that an image of a clothed child that's altered to make the child look nude (they used to call these "morphed" images) counts. I don't think it's ever been definitively resolved by the Supreme Court, and I don't know what the courts of appeals have said, but tools like DeepSukebe have made that argument way more appealing. I'd bet that this is where regulation will begin: images of children. That has always been a domain where American courts have been extremely reluctant to intervene; for example, any visual depiction of a seventeen-year-old engaged in sex is proscribable without resort to the ordinary inquiry into whether the work as a whole is "obscene," etc.
But under reigning American First Amendment law, it gets a lot harder to explain why a law like the one being proposed here would be acceptable. The Supreme Court has, for example, held that the distribution of animal-cruelty videos cannot be forbidden. And it’s not clear to me how one could proscribe the distribution of an imaginary visual depiction of an adult who was nude. You could call it defamatory, I suppose, but if it’s concededly fictional… I don’t know.
Actually, SCOTUS has ruled that laws banning obscene material are permitted under the First Amendment.
There is a three-part test.
SCOTUS actually heard appeals on the "obscenity" of material on a case-by-case basis for a while, decades back.
More specific to this case is the PROTECT Act [1]. I don't know whether it's ever been ruled against or whether SCOTUS has accepted that all depictions of minors are obscene...
[1] https://en.wikipedia.org/wiki/Legal_status_of_fictional_porn...
(see the US under Grey Area.)
I mean, for decades you couldn't swear on TV (and still can't), so I feel like there's a "technically legal to possess, but illegal to send on the internet" avenue.
It will be interesting to see how this kind of thing plays out. I'm sure it's quite distressing if a tool like this is used on your photo and the result then potentially shared in your friendship group. Hopefully we very quickly get to the point that nobody will ever be able to know if a photo is real or fake, and it's therefore just not considered an issue. Policing it seems like it would be extremely difficult. Maybe we police the intent? In other words, you can produce the images, but if you use them maliciously against a person then there is a crime.
> Hopefully we very quickly get to the point that nobody will ever be able to know if a photo is real or fake, and it's therefore just not considered an issue.
The ongoing "fake news" crisis proves that we are, in general, bad at spotting fakes - that is the purpose of a fake - and that we will also vehemently disagree about what is fake, especially when it can be used for political purposes. Expect the first world leader to be brought down by a deepfake within the next decade.
I'm reminded of https://www.theverge.com/2017/7/12/15961354/pakistan-calibri...
I'm wondering if there are existing laws that would cover the intent part - defamation and harassment laws, etc. Maybe they just need amending. Trying to police the app's exact functionality directly seems counter-productive.
I remember that when deepfakes were first released there was a group who would deepfake coworkers, Facebook friends, etc. for a really low cost (like $100), as long as the target had a few hundred public photos.
This is without consent as well, but it’s also not real. It seems like the equivalent of imagining people nude. Kind of creepy if I know it’s happening but not truly a violation of my privacy.
There is a continuum there, from harmless to deeply offensive. The exact location on that continuum will, at the very least, depend on the person being subjected to this treatment and on the cultural context.
The "AI" aspect will amplify the offense because of how life-like the end result can be.
I'm guessing that won't help even if it becomes a reality. These days even proper paternity tests are useless if one ends up on a birth certificate without his knowledge. You're still liable even if not guilty.
Edit: That being said... this naked-faking app can cause a lot of problems in the workplace and at home.
Depends on whether they care about commercial or non-commercial stuff. Banning commercial use of software is fairly easy (tell the gatekeepers to disallow it, or at the very least to let the police find who did it); banning non-commercial use isn't strictly impossible, but I reckon that to be effective it would need governments to treat international internet connections like any other international border, and they are not prepared for the side-effects of doing that.
Most people don't grok computers, so a commercial ban would probably cover most people, yet the Bush-bin-Laden photoshops I saw back in the early noughties would still get made and shared.
I believe Virginia has already passed legislation so broad and abstract as to make Photoshop and other image-editing tools illegal. Like most laws of this type, it's not about stopping an immoral or illegal activity; it's about controlling the public image of people in power. It will only be enforced if you piss off someone rich or powerful.
https://law.lis.virginia.gov/vacode/title18.2/chapter8/secti...
pjc50 | 4 years ago
> A person (“A”) commits an offense if—
> (a)A discloses, or threatens to disclose, a photograph or film which shows, or *appears to show*, another person (“B”) in an intimate situation,
> (b)by doing so, A intends to cause B fear, alarm or distress or A is reckless as to whether B will be caused fear, alarm or distress, and
> (c)the photograph or film has not previously been disclosed to the public at large, or any section of the public, by B or with B's consent
(my highlighting: "appears to show" would cover realistic fakes).
buro9 | 4 years ago
Especially if "this image is similar to me" is factored in, what degree of similarity makes an image a representation of a real person?
We're not far from this being less about privacy and dignity, and being more about whether the idea of nudity is permitted.
hardlianotion | 4 years ago
It's not quite the equivalent of imagining people nude as there is an artefact that can potentially cause its own pain when distributed.
dannyw | 4 years ago
But something about nudity makes it different.
knipster | 4 years ago
Or: the Streisand effect is intentional, making this so common that it no longer draws attention.
tester34 | 4 years ago
It removes all the overhead of DNA tests.
dannyw | 4 years ago
Reality is, unless most countries ban it, it's gonna be on the internet.
arbitrage | 4 years ago
Where's the outcry over Photoshop?