My fear is less about people being “duped” by a fake video and more that fake videos will serve as feedback loops for misguided or false beliefs that people already “cherish” and “love”. Most will make little effort to research the legitimacy of a video that agrees with their current beliefs, but those beliefs will probably be strongly reinforced by fake videos.
I don't think it matters. Those people are already in an intellectually closed pocket universe. People overestimate the extent to which universal consensual reality exists or has ever existed.
The first line of defense is education. Fundamentally, we have to make the case for why we know what we know. This is why K-12 exists, although the availability of effective primary and secondary education remains a major issue.
The next line of defense is social interaction. Most people will have to leave their bubbles to have any sort of upward mobility and ability to steer society. There will always be cynical people who exploit constituencies of deceived people to gain power, but many others eventually defect.
We have little reason to believe that this is a long-term equilibrium, but it's the story of the past 500 years of history, ever since the printing press created decentralized mass media.
> Most will make little effort to research the legitimacy of a video
This is already happening. People are regularly editing videos out of context to fit a narrative. It can't really get any worse when the media's integrity is already hitting rock bottom.
Some people still don't think humans landed on the moon. This is not a battle over facts. It's a war.
The real problem is that people will torture a fact, like a prisoner of war, until it tells any story they want it to tell, even if that story has no basis beyond the delusions of the torturers.
That can be avoided by video content uploaders verifying their source content, or by a browser plugin that notifies you about detected fake videos. Sort of like a firewall against deepfakes.
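A minimal sketch of that verification idea, assuming (hypothetically) that the uploader publishes a SHA-256 digest of the original file, which a plugin could then re-check on the viewer's side:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Digest of the raw video bytes, as published by the uploader."""
    return hashlib.sha256(data).hexdigest()

# Uploader side: publish the digest alongside the video.
original = b"...raw video bytes..."
published_digest = sha256_hex(original)

# Viewer side: a hypothetical plugin re-hashes what was actually received
# and flags the video if the digests disagree.
def check_video(received: bytes, published: str) -> bool:
    return sha256_hex(received) == published

print(check_video(original, published_digest))                 # True
print(check_video(original + b"!tampered", published_digest))  # False
```

Note this only proves the bytes match what the uploader published, not that the uploader's source was genuine in the first place; that trust still has to come from somewhere.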
I think people overestimate the concern of fake videos. Consider photos for comparison. There have been fake photos of well known people for decades online, many of which are indistinguishable from reality. It doesn't lead to much confusion or issues in our everyday life. We just assume every noteworthy image is fake unless it comes from, or is cleared by, a credible source. The same will apply to video.
> We just assume every noteworthy image is fake unless
Who is we? How do you know this? Citation needed. I think it is a very small portion of the population who actively operates under this assumption for photos. I would guess 5-15% but I don't know. Surely it is not most people though, or everyone.
Just because you/your friends do something doesn't mean other people do, or even most people do. I'd guess that for a great number of things that it would often be the opposite for technical people; often the things we do are things most people don't do.
I think people underestimate the coming confusion from video.
> It doesn't lead to much confusion or issues in our everyday life.
I mean, I think it does. I think it leads to massive issues in society where people don't know what is real or not without even knowing it. Magazine photos of people are well-known to be touched-up at a minimum, but how many people in a population actively think about that when they look at the cover?
I would posit that our society has been heavily damaged by the proliferation of fake photos.
Like fake photos, these will be most highly leveraged among the undereducated. Websites like Snopes will probably serve as an outside point of reference in many cases; however, some people are not really open to criticism of their current mental model, and can just as easily read another point of reference as opposition propaganda rather than reliable analysis. Alex Jones is an example of this psychology: he has such strong trust in his own sensory-experience-analysis system that you are better off taking other approaches than making a frontal assault on the citadel of his subjective information-interpretation experience, which is so tightly knotted up with his sense of self and personal creativity.
Fortunately this kind of lopsided/over-weighted psychological subsystem will never speak to everyone, and humans are as a group becoming more resilient in the face of such imbalance. The internet has in many ways been extremely helpful in serving as a sort of blowoff valve for psychological gifts that have spun out of balance.
>We just assume every noteworthy image is fake unless it comes from, or is cleared by, a credible source. The same will apply to video.
As the inauguration crowd-size debate proved in January of 2017, there are problems now with even credible sources on trivial things.
That's the 1-2 punch that I think might be more problematic. OK maybe you're right and people assume it is fake unless it is from a trustworthy source, but the trustworthy sources have dramatically shifted in the last--oh I'd say 18 years.
Sure - so the effort switches to attacking the credibility of sources. That just pushes people further and further towards only believing the news sources they already believe, no matter how far away from a neutral and reasonable interpretation of the facts they have gone.
I'm going to have to ask for a citation on photoshopped images not being taken seriously. We routinely see people being duped by them on Twitter, but it's even more insidious in the form of Facebook advertisements that aren't publicly shared for people to debunk.
I am curious what the LA Times is doing with the data it collects such that, after all these months, it still has not been able to offer a GDPR-compliant page.
> Now imagine a phony video of North Korean dictator Kim Jong Un announcing a missile strike. The White House would have mere minutes to determine whether the clip was genuine and whether it warranted a retaliatory strike.
Really? Hard-hitting journalism, everybody. "Mere minutes"? A strike would be warranted when they see proof of a launch. Do you really think the government just decides based on a video when we have much more surefire ways to determine these things?
Imagine somebody interested in war starts realistically faking radar etc. signatures, perhaps supported by tiny planted chips in army infrastructure. Could be fun.
It could be a video disguised as an insider spy tip. It wouldn't cause a retaliatory strike but it certainly would cause trouble and loss of money and time.
I think part of the problem is that we watch too many movies with CGI and have trained ourselves to ignore it.
At this point in time face-replace videos can be relatively easily spotted if you watch them in high quality. At least the ones I've seen demonstrated.
But overall, videos are getting easier to fake. At the same time, they weren't bulletproof in the past either. The bad part is that now you can do a lot of it in near real time, so you can change something in a live report.
I see a growing need for public services that cryptographically timestamp files.
Also, I would like to see research in using machine learning to spot fake videos.
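Such a timestamping service could work roughly like the sketch below; the receipt format and key handling are invented for illustration, and a real service would use asymmetric signatures and publish auditable receipts (RFC 3161-style) rather than a shared secret. The service only ever sees a digest, never the file:

```python
import hashlib
import hmac
import json
import time

# Hypothetical service secret; a real service would use an asymmetric key
# so anyone can verify receipts without being able to forge them.
SERVICE_KEY = b"demo-service-secret"

def issue_receipt(digest: str, now: float) -> dict:
    """Bind (digest, time) under the service's key; the file never leaves the client."""
    payload = json.dumps({"digest": digest, "time": now}, sort_keys=True)
    mac = hmac.new(SERVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "mac": mac}

def verify_receipt(receipt: dict) -> bool:
    expected = hmac.new(SERVICE_KEY, receipt["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, receipt["mac"])

footage = b"...original camera footage..."
digest = hashlib.sha256(footage).hexdigest()
receipt = issue_receipt(digest, time.time())
print(verify_receipt(receipt))  # True: the footage existed no later than the receipt's time
```

A valid receipt proves the footage existed at (or before) the stamped time, which is exactly the property that distinguishes original material from a later fake.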
Our team at Mirage is working on solving exactly the same problem. Our current prototype allows users to detect deepfakes in YouTube videos. Currently very early stage and any feedback is greatly appreciated. Max video length 60 seconds.
Link to demo: https://deepbuster.com/
I think people overestimate the bad consequences and underestimate the good consequences of such things.
1. I see amazing potential in AI-based content creation. Imagine a world where you can have all the music/movies you want, personalized to your specific taste. I would love to have an AI watch Avatar: The Last Airbender and invent a few more seasons for me (find out what happened to Zuko's mom :))
2. It is true that we should be more careful about which content is genuine vs. fake, but cryptography has you covered: anyone can easily digitally sign the content they create with their private key and prove the authenticity of their content.
3. One thing to note is that AI cannot be used to distinguish reals from fakes because the fakes are generated (using GANs) precisely so they can't be distinguished from the reals.
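The signing idea in point 2 can be sketched with textbook RSA. The key values here are the classic toy example, purely for illustration; real signing should use a vetted library with large keys (e.g. 2048-bit RSA or Ed25519), never hand-rolled parameters:

```python
import hashlib

# Toy RSA keypair (the classic textbook example: p=61, q=53).
p, q = 61, 53
n = p * q   # public modulus
e = 17      # public exponent
d = 2753    # private exponent: (e * d) % ((p - 1) * (q - 1)) == 1

def digest_int(content: bytes) -> int:
    """Hash the content and reduce the digest into the RSA modulus."""
    return int.from_bytes(hashlib.sha256(content).digest(), "big") % n

def sign(content: bytes) -> int:
    # Only the private-key holder can produce this value.
    return pow(digest_int(content), d, n)

def verify(content: bytes, signature: int) -> bool:
    # Anyone holding the public key (n, e) can check it.
    return pow(signature, e, n) == digest_int(content)

clip = b"frame data of the original video"
sig = sign(clip)
print(verify(clip, sig))                 # True
print(verify(clip + b" (edited)", sig))  # tampered content fails: digest changes
```

The asymmetry is the point: verification needs only the creator's public key, so viewers can check authenticity without being able to forge signatures themselves.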
There is going to be a period of time during which videos are easy to fake but people are still convinced they are real. This will lead to a lot of fake news, as well as wrongful criminal convictions.
The thing about video evidence is that it should be treated the way witness testimony should be: verified against other evidence. Manipulative editing can already have similar effects, as in James O'Keefe's infamous ACORN libel. If someone made a deepfake of Donald Trump shooting someone on Main Street with a rifle, the lack of actual blood, 911 calls in the area, or similar evidence would give it away as a fake even if it were technically perfect.
dreaming1234 | 7 years ago:
Pinscreen is getting sued by their former VP of Engineering for faking their results and for assault and battery [1].
[1] http://sadeghi.com/dr-iman-sadeghi-v-pinscreen-inc-et-al/
albertgoeswoof | 7 years ago:
Do fake videos add any more credibility to this kind of misinformation?
MickerNews | 7 years ago:
Remember when the web just worked?
kosei | 7 years ago:
"With further deep-learning advancements, especially on mobile devices, we'll be able to produce completely photoreal avatars in real time."
What "deep learning advancements" is he referring to?
It doesn't surprise me at all that he's being sued by his former VP of Engineering for fabricating results (and for assault and battery).
cmroanirgo | 7 years ago:
https://www.youtube.com/watch?v=Fm8FJ8la2VU
liftbigweights | 7 years ago:
https://en.wikipedia.org/wiki/Censorship_of_images_in_the_So...
Even FDR's photos were edited to hide his paralysis early on.
As long as we have "histories" of videos as we do of photos, can't we reasonably compare them?