item 38266167


vasdae | 2 years ago

"Nonconsensual AI porn" is a weasel term because it implies that it should be necessary to get someone's consent to create fake porn using their faces.

bobthepanda | 2 years ago

It definitely should be. I don’t want my face used for porn just because I once happened to pose for a stock photo.

Controlling likenesses in AI was the whole point of the SAG strike.

brucethemoose2 | 2 years ago

What if I hired a human artist to hand draw the fake instead?

What if they used Photoshop instead of drawing from a reference?

dragonwriter | 2 years ago

> Controlling likenesses in AI was the whole point of the SAG strike.

No, it wasn't.

It was one important issue in the strike, but there were others (streaming residuals were a big issue, for instance).

robertlagrant | 2 years ago

> Controlling likenesses in AI was the whole point of the SAG strike.

I think they had 5 or 6 demands.

a_wild_dandan | 2 years ago

Well said. If you fancy using my likeness as a dartboard, or in a meme, or as a Photoshop asset, or painted on a canvas, or drawn by AI, or mistakenly randomly generated, etc, great! Have fun. Not my circus, not my monkeys.

I'm not entitled to categorically own/forbid using a look. That's nonsense and leads to self-inflicted quandaries: How do I know a video of unknown provenance contains me, not a dead ringer that gave consent? How different must a depiction be to not require my consent? 9 pixels? 30%, whatever that means? At least an eye color change?

It's impossible to enforce consistently, presumptive, and effectively amounts to thought-policing a concept. In short: it's absurd.

tshaddox | 2 years ago

> How do I know a video of unknown provenance contains me, not a dead ringer that gave consent?

> It's impossible to consistently enforce, presumptive, and effectively thought-policing a concept. In short: it's absurd.

I mean, come on. It’s fine to disapprove of the law, but this isn’t some uniquely difficult thing that the legal system couldn’t possibly handle. It’s certainly nowhere near the level of complexity and ambiguity of, for instance, criminal fraud law, where things like the intent of the accused and the “reasonable person” are routinely crucial elements.

lesuorac | 2 years ago

dragonwriter | 2 years ago

“In the United States, the right of publicity is a state law–based right, as opposed to federal, and recognition of the right can vary from state to state.”

So, the USA-specific answer depends on the specific US state(s) whose law is relevant to the action in controversy.

There are countries with national rights in this area, but the USA is (and your source highlights this) not one of them.

arrosenberg | 2 years ago

It unequivocally should. Stop abusing people's privacy or you are going to get your toys taken away...

PheonixPharts | 2 years ago

The irony of course is that people are only able to create deepfakes of non-celebrities because social media has already gotten the average user very comfortable with letting go of their privacy.

vasdae | 2 years ago

Taking a pornographic movie and putting someone's face in place of one of the actors' does not violate their privacy in any way, since nothing that was shared (their face) was private to begin with.

nmjohn | 2 years ago

> because it implies that it should be necessary

Who gets to decide what should or should not be necessary? Do you think your opinion on this is the majority view?

tshaddox | 2 years ago

Any term given to a specific criminal action will often be used to refer to that action with the implication that it’s a crime, yes.

bigbillheck | 2 years ago

It should be necessary to get someone's consent to create fake porn using their faces.

badrequest | 2 years ago

It should, obviously, be necessary to do this.