You know, when I first heard about deepfake videos I thought they would be politically weaponized. After thinking about it more, though: we've had Photoshop for 30 years already! We see photoshopped images all the time, and while some people can be fooled, many others remain skeptical of an image and try to verify it hasn't been altered. I don't think photoshopping has really been a big problem yet, which makes me think deepfakes won't be one either, because they're fundamentally the same kind of deception, just in video form.
All of these things make it easier to mass-produce bullshit at low cost.
I'm pretty sure I know people who have been convinced by meme quotes: a headshot of a politician they don't like, with an overlaid quote they never said. People are outraged! And never bother to check the source.
Anything that makes it easier to lie about what someone said or did, or harder to disprove the lie... these things are all politically weaponized, already.
Look at the "drunk pelosi" video.
Yeah, the bigger problem is taking video footage of a politician and going frame by frame to find the most unflattering possible depiction of them (usually right after a cough or a sneeze) so you can use it to "support" your trash clickbait headline. No deepfakes needed: you can make anyone look like a raving lunatic if you take the frame right before a sneeze.
Photoshopping is done by hand and generally has mistakes. Good photoshops are still believed.
Videos are being altered by machines. They’re being optimized for natural looking results. It’s harder to notice small mistakes when frames are going by at 24FPS vs poring over a static image for 30 seconds until you finally notice the one region with mismatched shadows or odd clipping.
Mostly people seem to be sceptical of things they already don't believe. If someone repeats something you believe, how much research are you going to do? So photoshopped images that reinforce your beliefs slip by, and the ones that challenge you, you catch. Or worse, the ones that challenge you get labeled as photoshopped regardless of their provenance.
You don't think that photoshopping has really been a problem and you think people remain skeptical?
I guess you can, if you think the epidemics around male and female body image, body dysmorphia, self-harm, and anxiety, and the use of celebrities to sell products, aren't connected to it. I've found exactly the opposite.
I love photography, and I am utterly unable to convince non-photographers of what happens in the production of most images they see in most forms of commercial media.
It goes something like this:
"Hey ACowAdonis, how much of that photo do you think was retouched?"
looks at photo
"All of it".
"All of it? What do you mean?"
"I mean all of it."
"But that's Reese Witherspoon! (or insert popular celebrity here)"
"Yep, and you can see how her eyes have been adjusted, her skin's been adjusted, they've changed the shape of her arm, taken a few pounds off the midsection, increased the boob size, changed the colour of her hair... and I'm pretty sure that's not her hand."
"Nah, you crazy..."
"You want crazy... pretty much every photo in every fashion magazine and every media item involving that celebrity has been adjusted to a similar extent."
"Nah mate, you're having me on. You're nuts."
The best way to fool someone isn't to do an indistinguishable Photoshop job. It's to do a passable-enough fake of something the person wanted to believe anyway.
I see this technology as no different.
Here's how it will work: someone will make a deepfake of a political opponent, then use a dummy account to publish it on a forum where like-minded people gather.
Other dummy accounts will take the deepfake and start making a narrative around it, sending chain emails to their real world contacts.
Real world contacts will start passing around deepfake chain mail they were sent.
Some of these emails will take the deepfake as true, some will talk of it as being a funny parody, but "funny because it's true" anyway.
Major news organizations can then address the issue as news, because people are passing it around. Maybe it goes something like: "Well Bob, I think the X have a real image problem on their hands, whether the video is true or not..." "You don't mean to say you think it's true!?" "I didn't say that, Bob. I'm frankly not qualified to judge, and I haven't done any research. What I'm worried about is that there's a perception that it's true, or that even if it's not exactly true in this particular instance, it might be, and that's what I mean by a real image problem."
And their fanbases will listen, because it's easier than accepting their idol could be a bad actor.
The REAL issue here will not be the fake videos themselves. They will cause plenty of messes, but the real issue is an acceleration of what we see today: a loss of trust in information, and in the established media in particular. More societal rifts; it becomes easy to dismiss any negative news about your favourite politician/rapper/... as a fake video; court cases become harder even where there's video evidence... Terrifying.
This scepticism is itself a problem. There's a whole branch of philosophy that claims objective truth is impossible to know. With deepfakes that becomes even more true, and it might drive a lot of people into despair and apathy.
I don't think that fakes (photoshop, deepfakes, whatever) need to be absolutely believable to be effective. The long-game purpose is to erode trust in institutions, media, politicians, etc. Fakes accomplish this goal by being just believable and just frequent enough that more and more people start deciding to believe whatever it is they want to believe because "who knows what the real truth is!".
I agree. It was fake news articles on Facebook that spread false information, but it took desperately ignorant people believing them for the consequent chaos to ensue. So while deepfake videos are scary, I think what we really have to worry about is the deep ignorance of the voters.
Next up: the person you're sitting across from at dinner.
Writers have been able to write nonsense for a long time... and photo manipulation we've gotten quite used to. All we do is add video to the category of things that might be lies, and so need independent verification.
Skepticism is good and healthy, and verification in the age of Google isn't that hard.
You can trust that if the NY Times or CBS publishes a video, they verified its authenticity, or else will be publishing a big retraction within a few days that will also make the news because it's so rare.
When your uncle sends you a random photo or video of a politician that seems too exaggerated or weird or unbelievable... you assume it might be manipulated, as you already do now. Making Nancy Pelosi seem drunk didn't take a deepfake, just slowing the video down.
It's not any kind of big change. Just applying the same skepticism we already automatically apply to so many other things.
I remember an oil-company-backed anti-Tesla propaganda image from a while ago showing the "environmental disaster a lithium mine creates," which went viral for a bit. It was, I think, actually a tar sands mine.
There's no way deepfake videos won't make the propaganda situation worse, at least for a while.
It will be interesting to see if such tools are developed for video as well.
People know about photo manipulations and get suspicious because we've had Photoshop for thirty years (and analog photomanipulation even longer) and see them all the time. This wasn't always true. When photography was new, manipulations that wouldn't fool anyone today were taken as proof by many people. See for example https://en.wikipedia.org/wiki/Cottingley_Fairies
Photoshop has been used for years to successfully fool millions of men and women that consume magazines showing people with smooth skin and sexy bodies.
I think the problem is that if we can't trust video, then there is really nothing visual left we can trust. Until now you could at least trust video recordings to some degree. Not sure if that's a good thing or not.
I find it worrying that people are only now starting to be sceptical about visual information. There is a huge difference between the real world and the framed, curated view a photographer or documentarian gives you. If you go to school to learn about this stuff, it's mostly about how to convey your view through these tools, and the ethical implications of doing so.
I think people vastly underestimate how much editing and framing change the perceived truth of what happened. It is more subtle than manipulating the contents of video, but I think it can be in many ways more effective as most of this stuff bypasses your cognition and is not straight up lying.
It feels the same as in written news changing the quote vs. changing text around the quote.
I think we would be better off looking at video like a picture drawn or a text written by someone: an artistic rendition of events.
The danger is not in the false positives but in the false negatives. The very existence of this kind of thing erodes trust and sows paranoia.
A simple morph cut in a John Pilger interview of Assange made a sizeable portion of nutjobs believe Assange has been long dead. Don't think this kind of behaviour can't eventually extend to the mainstream.
The "Ethical concerns" section in the article feels like a punt. The author's quote that "this technology is really about better storytelling" is aspirational; the technology's story will be written by those who use it, and you can bet people will use it maliciously.
Perhaps the malicious use cases are just more obvious, given how trustworthy a video can appear.
Think of the impact of this on dubbing movies between languages. This seems like an incredible tool. Of course, we can't just forget about deepfakes and such, but this particular use case kind of excites me.
I suspect in the not-too-distant future we'll need a way to produce provably authentic videos. I'm thinking something like this: the subject, a politician giving a press conference for example, carries something that emits a signal which the cameras encode into the video in a way that makes any alteration detectable, something like a cryptographic signature. I don't really know enough about cryptography to be sure how, or whether, it would work.
I had this idea that devices which record content like images or video should have an unforgeable key internal to their hardware, like we have with PGP / GPG. Content that comes from the device would be signed, and allow users to validate whether it originated unmodified from the hardware source.
Granted, derived content will fail validation, but it will motivate tracking down the original, until validation can be performed. Maybe you can take pictures of fake imagery printed onto large high-def paper, but at least you eliminate one stage in the process...
Honestly, we should not trust digital content these days.
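A minimal sketch of the sign-at-capture idea described above: the device authenticates the raw frame bytes, so any later alteration fails verification. For brevity this uses a symmetric HMAC key (the key and frame data here are hypothetical stand-ins); a real camera would need an asymmetric keypair, e.g. Ed25519, so that verifiers could check signatures without being able to forge them.

```python
# Sketch: the camera signs raw pixel data with a key baked into its
# hardware; any later change to those bytes makes verification fail.
import hashlib
import hmac

DEVICE_KEY = b"secret-baked-into-camera-hardware"  # hypothetical

def sign_frame(pixels: bytes) -> bytes:
    """Return the authentication tag the camera attaches to a frame."""
    return hmac.new(DEVICE_KEY, pixels, hashlib.sha256).digest()

def verify_frame(pixels: bytes, tag: bytes) -> bool:
    """Check that the frame bytes are exactly what the camera signed."""
    return hmac.compare_digest(sign_frame(pixels), tag)

original = bytes([10, 20, 30, 40])       # stand-in for raw frame data
tag = sign_frame(original)

assert verify_frame(original, tag)       # untouched frame passes
tampered = bytes([10, 20, 31, 40])       # one pixel value changed
assert not verify_frame(tampered, tag)   # any edit breaks the signature
```

As the replies below note, the hard part isn't the signing primitive but where the key lives and how the signature survives the file format.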
> unforgeable key internal to their hardware, like we have with PGP / GPG
PGP involves a private key, and if you have the private key you can "forge" any message. If you put the key in hardware, it can be read by an adversary with access to a powerful microscope.
> Content that comes from the device would be signed
I wonder if that is possible. One can always convert a picture to a 2-D array of RGB values, so the signature can't live in the video (or image) file container; it has to be a watermark of some kind. If the algorithm is known, it's an interesting question whether the signature can be made unforgeable. If the algorithm isn't known, then other issues (as with DeCSS) can appear.
Alternatively, we need to see how well these algorithms can fake videos from different angles in a consistent manner. If we have enough people recording, that may be enough to prove something is not a fake.
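One way a mark could survive conversion to a bare RGB array, as worried about above, is to hide it in the pixel values themselves. A purely illustrative sketch of a least-significant-bit watermark (this is not unforgeable, and is trivially destroyed by re-encoding or resizing, which is part of the problem):

```python
# Sketch: hide a bit string in the least significant bit of each
# channel value, so the mark lives in the pixels rather than in the
# file container. Toy example only; real watermarking is far subtler.

def embed(pixels: list[int], bits: str) -> list[int]:
    """Overwrite the LSB of the first len(bits) channel values."""
    out = pixels[:]
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | int(b)
    return out

def extract(pixels: list[int], n: int) -> str:
    """Read back the first n hidden bits."""
    return "".join(str(p & 1) for p in pixels[:n])

image = [200, 13, 76, 91, 155, 42, 8, 230]  # flat list of channel values
marked = embed(image, "1011")

assert extract(marked, 4) == "1011"
# the visible values barely change:
assert all(abs(a - b) <= 1 for a, b in zip(image, marked))
```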
In that case, video editing software and equipment should also embed a private key, in order to trace the origin of any manipulation.
The whole chain of video production should be signed, so that filming and editing can be traced together, with all intermediary signatures from each stage of the production process contained in a final signature.
That may be a business opportunity, or at least (maybe preferably) an interesting open source project.
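The signed-chain-of-production idea above can be sketched as a hash chain, where each stage commits to the previous stage's digest. This toy illustration uses bare hashes; a real system would use per-stage asymmetric signatures, and the stage names and contents here are invented for the example.

```python
# Sketch: every production stage (camera, cut, grade, ...) hashes its
# output together with the previous stage's digest, so the final value
# commits to the entire editing history.
import hashlib

def stage_digest(prev: str, stage: str, content: bytes) -> str:
    """Commit to this stage's output and everything before it."""
    h = hashlib.sha256()
    h.update(prev.encode())
    h.update(stage.encode())
    h.update(content)
    return h.hexdigest()

d0 = stage_digest("", "camera", b"raw footage")
d1 = stage_digest(d0, "edit", b"cut footage")
d2 = stage_digest(d1, "grade", b"graded footage")

# Replaying the identical chain reproduces the same final digest,
# but altering any earlier stage changes every digest after it.
forged = stage_digest(stage_digest("", "camera", b"doctored footage"),
                      "edit", b"cut footage")
assert forged != d1
```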
Could it be something even more low-tech? Kind of like those faint yellow dots that all printers must print. Maybe a distinct pattern of pixel coloration that a person wouldn’t even be aware of unless they were looking for it.
Everyone seems to think of nefarious uses, but I can't wait for this tech to appear in video calls, combined with translation. This could enable two people without a common language to have a conversation while appearing to each other as native speakers of their respective languages.
A place where I don't think it will be used much is actual facing-the-camera-talking-head content. Something we have learned from YouTubers is that audiences don't care if there are discontinuous cuts during a monologue. YouTubers don't try to pretend they did it all in one take, and will happily edit their video as if editing text. The cuts are obvious in both the audio and video. And still it works.
Seems really cool, but I wonder how well it will handle a case where you want to swap one phrase for another, and the new phrase has a "human-specific" emphasis or variant to it.
Example: "That was a short trip" vs "That was a reaaaaaalllly long trip".
Language is so much more than words. When you deliver the variant message, your whole facial expression might change, and so much would get lost if that doesn't carry over. Your facial expression and tone in that context also completely change the meaning, from enjoying the long trip to not enjoying it; but how can a machine know which one to pick?
It's strange to me that people are so concerned about these deepfakes when the National Enquirer has been around for so long. It's been easy to lie to people en masse for a while now. I don't think this changes the number of people who are open to these suggestions; I think people in general are smarter than a lot of people give them credit for.
Science fiction author Greg Egan wrote a novel called Distress[0] whose main character is a science journalist who makes documentaries. He uses software exactly like this. The book was published in 1995. It's a very good book and I highly recommend it, along with basically any other book written by Egan. (My personal favorite is probably "Diaspora", followed closely by "Permutation City".)
[0]: https://www.goodreads.com/book/show/19328253-distress
This tech allows the state or corporations to quietly adjust the historical record of their representatives' words and statements to fit their ambitions at any given point.
All human evidence rests on the shaky foundation of "because I believe it's true", at the bottom of which rests the shakier foundation of your personal experiences. Don't believe me? Just ask a schizophrenic how hard it is to disbelieve your own experience.
Reminds me of the H.P. Lovecraft story "The Call of Cthulhu", page 1, paragraph 1:
> The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.
I'm interested to see whether countermeasures get deployed to make deepfakes more difficult and buy time, such as incorporating dynamic backgrounds and body gestures like touching one's face while talking.
I actually love this ongoing cat-and-mouse game. I don't follow events in this field keenly, so I don't know if it exists yet, but the challenge is to find the antidote to this concoction, created by mad scientists just for the sake of science, that will be weaponized any time now.