top | item 36903642

The Dark Side of A.I.: Understanding the Dangers of Deepfake Images

23 points | piyushsthr | 2 years ago | blog.theabhishek.dev | reply

56 comments

[+] Keejazz|2 years ago|reply
This article seems biased, misses the rest of the bigger-picture problems with deepfakes, and fails to produce any proof for its claims. The picture comparison is laughable at best (especially if you have tried to make a perfect deepfake; it is actually not THAT easy).

There is absolutely nothing new added to this discussion by this article. It also seems gender-provocative and charged, as the article chooses to exclude male deepfakes as an equally probable problem, without even commenting on it. I suspect deepfakes are just as viable for swaying voters, blackmail, false imprisonment, and the like - and that would include both genders.

This feels like bait/trash journalism to me, on a topic that is worth a real discussion about real-life problems.

[+] psychoslave|2 years ago|reply
> What we can do?
>
> In this situation, the responsibility to combat deepfakes largely lies with the government and big tech companies.

What you should also do, dear, is assume your government is corrupted and that big tech companies are soulless Leviathans motivated only by extracting some easily quantifiable monetary profit from you, abusing you as much as it can within its accounting. At worst, you were wrong and actually your local government is only steered by competent benevolent people that will understand that trust doesn’t come without checks, and big tech organizations are actually all non-profit in disguise which are striving to make the world a better place for every single living being.

Also, the key issue here, as it has always been, is how people behave with each other, not which tools they use to actually implement these interactions.

The real challenge is not that much "how do you regulate tools", but "how do you make people act with mutual respect at societal level".

[+] concordDance|2 years ago|reply
I struggle to see the harm in deepfake pornography as long as people realize it's fake.

Am I missing something?

[+] Version467|2 years ago|reply
Don't you think it feels incredibly degrading and violating to have your face stitched into porn in a way that's supposed to be as realistic as possible? Without your consent?

It doesn't really matter whether the video is real; the people this is done to have not signed up for it. And you know people are jerking off to it. I don't doubt for a second that it's absolutely repulsive to see yourself in deepfake porn.

A couple of popular streamers who have had this happen to them have reported feelings of shame and violation similar to what they felt when they experienced actual sexual assault. That's pretty alarming and should count for something, no?

Also, this gives a completely new dimension to parasocial relationships. There are lots of people struggling with real relationships who find solace in live streams. As they continue to engage in the community, they can begin to feel like they have an actual relationship with those streamers, which is obviously dangerous since it's such a one-sided thing. Deepfake porn elevates this even further and makes it that much more likely for people to develop an unhealthy obsession with people they will likely never meet, instead of working on their real interpersonal relationships.

[+] rawrawrawrr|2 years ago|reply
I think the opposite - once deep fake porn becomes good enough, blackmailing/revenge porn will no longer be a thing. Anyone can just say that those images are fake, and porn of celebrities, exes, politicians will all be worthless.
[+] drsopp|2 years ago|reply
I guess the problem is when people don't realize it's fake. There is a lag between people who are familiar with this technology and those who are not.

Edit: I worked in schools for over a decade, and I can definitely see the huge potential for harm here with digital bullying.

[+] DrJokepu|2 years ago|reply
Quite apart from any moral considerations that other commenters have addressed, in cold business terms such deep fakes directly compete with the works of these models and therefore it’s not in their interest to permit their circulation.
[+] 7speter|2 years ago|reply
It has the power to deceive and leave a lasting impression on those who may see it in passing and not realize it's fake. Also, it can be demeaning, considering the faked subjects would have no say, consent-wise, in the distribution of the faked content.

Think of what happened with Hulk Hogan, who had a sex tape released without his permission. Though it was real, it left a lasting impression on a given audience and tarnished his reputation (further), and he successfully sued because, IIRC, the video was "leaked" without his consent.

[+] OliveMate|2 years ago|reply
It's pretty abhorrent to practically digitally unclothe someone (unless they fully consent to it), and more often than not the content produced by this will be shared online with countless other people.

The thing is, you cannot guarantee that everyone will know it's fake, and when you weaponise it against people who can't distinguish it there'll be consequences.

Currently we see people found to be doing Adult Content online fired from their jobs, but what if a technically-blind boss gets sent a ton of deepfakes of an employee and is falsely told they do content like that on the side? What if a doubting partner suddenly receives deepfakes of their partner with someone else? What if your political rival starts spreading deepfakes with wildly rabid claims about yourself which the media picks up and reports on – bullshit has spread around the world by the time the truth has tied its shoelaces.

Think about the absolute inane content people believe on Facebook. Think about the misinformation that gets spread over Twitter just because the fake information would be funny. Think about the stuff you've seen online which turned out not to be so real.

Even if the images generated aren't real, the subject of them can't convince the world they're not, and there's no way for them to stop the violation of their self-image.

[+] ehnto|2 years ago|reply
Defamation, identity theft, abuse of individuals directly by circulating porn of them in their communities. Fake or not, it's very impactful. It's not an ideological issue but a very practical one, it's already causing harm.

You might be thinking of celebrities, where the idea that they have done porn is so far-fetched that it's obvious it would be fake. But what about minor personalities like youtubers? It could tank a career if it's unclear whether something is real or not. Just like a false assault allegation, it will still cause a lot of harm even if the truth prevails.

Lastly, on an individual level, people are going to get hurt by sociopaths circulating imagery of coworkers or whatever. It's messy human social business, and the pragmatic person might imagine the truth is all that matters, but it definitely isn't. You can ruin someone's social life and mental health with mere rumours, let alone with what will look like proof of disreputable activity.

[+] testtestabcdef|2 years ago|reply
Just a few more months then people will stop talking about it, as usual.
[+] Madmallard|2 years ago|reply
It's already good enough to not be really obviously fake.
[+] Cthulhu_|2 years ago|reply
People won't realize it's fake... and people won't care either. You struggle to see the harm because you haven't been a victim, and I presume you're not a woman either. So yeah, sorry to get personal but you're missing something.
[+] nperez|2 years ago|reply
It seems like a really, really difficult task to enforce any laws that might develop around this. This isn't like paparazzi nudes that get out every few years - it's just math, and thousands could be generated overnight with little compute.

I don't know what the laws should be, but I think there are some serious limitations on what's possible/practical at this point.

[+] Cthulhu_|2 years ago|reply
There ARE laws against this kind of thing, defamation and whatnot. It's the same as editing a face onto someone else's (naked) body. No new laws are needed; the method has changed, not the intent nor the crime / violation.
[+] __loam|2 years ago|reply
There are a lot of technically possible things you can do with a computer that are currently illegal. Just because you are able to do something doesn't mean there won't eventually be legal consequences, even if enforcement is impractical.
[+] unglaublich|2 years ago|reply
Pornographic image editing is "THE danger of deepfakes"? And somehow it's a problem for females (sic)?

Why this focus on pornography?

Pornographic image editing has been around forever. Deepfakes are just another tool in the toolbox. This article fails to see the wider consequences of deepfakes outside of pornography, as described by the Wiki page cited in the article.

I'd say the real danger is the potential to destabilize society by faking authority and spreading misinformation to manipulate people for financial or political goals. Pornography is a tiny part of that.

[+] Der_Einzige|2 years ago|reply
Go on Civit.ai (the hub of custom diffusion models besides huggingface), make an account, and log in. You may notice that well over 60% of all models are porn, especially waifu porn.

Even without logging in, the majority of models market themselves with cute/scantily clad women. That's why they focus on porn, because that's what everyone is doing with AI image generators.

[+] Cthulhu_|2 years ago|reply
But deepfakes are only one means to an end; edited videos / photos and fake news / misinformation have been around for forever.

Although deepfakes, especially video + audio, can be more convincing than the older techniques.

[+] chrisjj|2 years ago|reply
> As evident from the comparison, the above Deepfake image is virtually indistinguishable from the original

We have here a case of the meaning of "indistinguishable" being deeply deepfaked, given the deepfake image shows the subject naked vs. clothed in the original.

[+] ggm|2 years ago|reply
CSAM faked with real adult people's faces. What's a law enforcement agent to do? Ignore it?

Weaponised pornfakes could get people killed. Pizzagate happened.

[+] MrStonedOne|2 years ago|reply
There is nothing inclusive about perpetuating gender stereotypes.

Men are allowed to feel the same sense of privacy and intimacy over one's own body that we afford women, and pretending only women get harmed by the rampant abuse of photoshop on steroids doesn't do anybody any good.

[+] __loam|2 years ago|reply
Neither does pretending the victims of this bullshit aren't mostly women.
[+] WentFullRetard|2 years ago|reply
Understanding the danger of photoshopped images

It's not like this is some new occurrence; it's just made easier by what's out there. Anybody willing to take the time to do these things would have put the time in through another medium. Just more AI fearmongering.

And of course... only mentioning women as victims. Right.