
As fake videos become more realistic, seeing shouldn't always be believing

145 points | geekdidi | 7 years ago | latimes.com

78 comments

[+] dreaming1234|7 years ago|reply
> With more time, Pinscreen, the Los Angeles start-up behind the technology, believes its renderings will become so accurate they will defy reality.

Pinscreen is getting sued by their former VP of Engineering for faking their results and for assault and battery [1].

[1] http://sadeghi.com/dr-iman-sadeghi-v-pinscreen-inc-et-al/

[+] dpwm|7 years ago|reply
The linked piece is a surprisingly compelling read.
[+] jarsin|7 years ago|reply
HAHA welcome to the real world outside of the Ivory Tower of Google. Everything is smoke and mirrors.
[+] davidgh|7 years ago|reply
My fear is less about people being “duped” by a fake video and more that fake videos will serve as feedback loops for misguided or false beliefs that people already “cherish” and “love”. Most will make little effort to research the legitimacy of a video that agrees with their current beliefs, but those beliefs will probably be strongly reinforced by fake videos.
[+] acjohnson55|7 years ago|reply
I don't think it matters. Those people are already in an intellectually closed pocket universe. People overestimate the extent to which universal consensual reality exists or has ever existed.

The first line of defense is education. Fundamentally, we have to make the case for why we know what we know. This is why K-12 exists, although the availability of effective primary and secondary education remains a major issue.

The next line of defense is social interaction. Most people will have to leave their bubbles to have any sort of upward mobility and ability to steer society. There will always be cynical people who exploit constituencies of deceived people to gain power, but many others eventually defect.

We have little reason to believe that this is a long-term equilibrium, but it's the story of the past 500 years of history, ever since the printing press created decentralized mass media.

[+] ekianjo|7 years ago|reply
> Most will make little effort to research the legitimacy of a video

This is already happening. People regularly edit videos to take them out of context and fit a narrative. It can't really get much worse when the media's integrity is already hitting rock bottom.

[+] ggggtez|7 years ago|reply
Some people still don't think humans landed on the moon. This is not a battle over facts. It's a war.

The real problem is that people will torture a fact, like a PoW, until it tells any story they want it to tell. Even if that story has no basis beyond the delusions of the torturers.

[+] heiki|7 years ago|reply
That could be avoided by video content uploaders verifying their source content, or by a plugin running in your browser notifying you about detected fake videos. Sort of like a firewall against deepfakes.
[+] Guest9812398|7 years ago|reply
I think people overestimate the concern of fake videos. Consider photos for comparison. There have been fake photos of well known people for decades online, many of which are indistinguishable from reality. It doesn't lead to much confusion or issues in our everyday life. We just assume every noteworthy image is fake unless it comes from, or is cleared by, a credible source. The same will apply to video.
[+] jacobsheehy|7 years ago|reply
> We just assume every noteworthy image is fake unless

Who is we? How do you know this? Citation needed. I think it is a very small portion of the population who actively operates under this assumption for photos. I would guess 5-15% but I don't know. Surely it is not most people though, or everyone.

Just because you/your friends do something doesn't mean other people do, or even most people do. I'd guess that for a great number of things that it would often be the opposite for technical people; often the things we do are things most people don't do.

I think people underestimate the coming confusion from video.

> It doesn't lead to much confusion or issues in our everyday life.

I mean, I think it does. I think it leads to massive issues in society where people don't know what is real or not, without even knowing it. Magazine photos of people are well known to be touched up at a minimum, but how many people in a population actively think about that when they look at the cover?

I would posit that our society has been heavily damaged by the proliferation of fake photos.

[+] themodelplumber|7 years ago|reply
Like fake photos, these will be most highly leveraged among the undereducated. Websites like Snopes will probably help serve as an outside point of reference in many cases; however, some people are not really open to criticism of their current mental model and can just as easily see another point of reference as opposition propaganda rather than reliable analysis. Alex Jones is an example of someone with this psychology: he has such a strong trust in his original sensory-experience-analysis system that you're better off taking other approaches than making a frontal assault on the citadel of this subjective information-interpretation experience, which is so highly knotted up with his sense of self and personal creativity.

Fortunately this kind of lopsided/over-weighted psychological subsystem will never speak to everyone, and humans are as a group becoming more resilient in the face of such imbalance. The internet has in many ways been extremely helpful in serving as a sort of blowoff valve for psychological gifts that have spun out of balance.

[+] 27182818284|7 years ago|reply
>We just assume every noteworthy image is fake unless it comes from, or is cleared by, a credible source. The same will apply to video.

As the inauguration crowd size debate proved in January of 2017, there are problems now with even credible sources on trivial things.

That's the 1-2 punch that I think might be more problematic. OK maybe you're right and people assume it is fake unless it is from a trustworthy source, but the trustworthy sources have dramatically shifted in the last--oh I'd say 18 years.

[+] crwalker|7 years ago|reply
The default approach to a medium shifting from trust to distrust is a significant change.
[+] pjc50|7 years ago|reply
Sure - so the effort switches to attacking the credibility of sources. That just pushes people further and further towards only believing the news sources they already believe, no matter how far away from a neutral and reasonable interpretation of the facts they have gone.
[+] mattnewton|7 years ago|reply
I’m going to have to ask for a citation on photoshopped images not being taken seriously. We routinely see people being duped by them on Twitter, but it’s even more insidious in the form of Facebook advertisements that aren’t publicly shared for people to debunk.
[+] albertgoeswoof|7 years ago|reply
Are there any examples of fake photographs being used in this way?

Do fake videos add any more credibility to this kind of misinformation?

[+] gattilorenz|7 years ago|reply
For the fellow Europeans that cannot read the LA Times: https://archive.fo/heTle
[+] sschueller|7 years ago|reply
I am curious what the LA Times is doing with the data it collects, such that after all these months it still has not been able to offer a GDPR-compliant page.
[+] MickerNews|7 years ago|reply
This site can’t provide a secure connection archive.fo uses an unsupported protocol. ERR_SSL_VERSION_OR_CIPHER_MISMATCH

Remember when the web just worked?

[+] S-E-P|7 years ago|reply
> Now imagine a phony video of North Korean dictator Kim Jong Un announcing a missile strike. The White House would have mere minutes to determine whether the clip was genuine and whether it warranted a retaliatory strike.

Really? Hard-hitting journalism, everybody. "Mere minutes"? A strike would only be warranted when they see proof of a launch. Do you really think the government just decides based on a video when we have much more surefire ways to determine these things!?

[+] bitL|7 years ago|reply
Imagine somebody interested in war starts realistically faking radar etc. signatures, perhaps supported by tiny planted chips in army infrastructure. Could be fun.
[+] muthdra|7 years ago|reply
It could be a video disguised as an insider spy tip. It wouldn't cause a retaliatory strike but it certainly would cause trouble and loss of money and time.
[+] colllectorof|7 years ago|reply
I think part of the problem is that we watch too many movies with CGI and we trained ourselves to ignore it.

At this point in time face-replace videos can be relatively easily spotted if you watch them in high quality. At least the ones I've seen demonstrated.

But overall, videos are getting easier to fake. At the same time, they weren't bullet-proof in the past either. The bad part is that now you can do a lot of it in near real time, so you can change something in a live report.

I see a growing need for public services that cryptographically timestamp files.

Also, I would like to see research in using machine learning to spot fake videos.
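A minimal sketch of such a timestamping service (hypothetical throughout: the service name, `SERVICE_KEY`, and the HMAC tag are demo stand-ins; a real timestamping authority, e.g. one following RFC 3161, would sign with an asymmetric key so anyone can verify without the secret):

```python
import hashlib
import hmac
import time

# Hypothetical service secret; a real authority would use an
# asymmetric key pair instead of a shared secret.
SERVICE_KEY = b"demo-service-secret"

def file_digest(data):
    # The service never needs the file itself, only its SHA-256 digest.
    return hashlib.sha256(data).hexdigest()

def issue_timestamp(digest, now=None):
    # Bind the digest to the current time and authenticate the pair.
    ts = int(now if now is not None else time.time())
    payload = "{}:{}".format(digest, ts).encode()
    tag = hmac.new(SERVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"digest": digest, "timestamp": ts, "tag": tag}

def verify_timestamp(token, data):
    # Recompute the tag from the file as it exists now; any edit
    # changes the digest and therefore invalidates the tag.
    payload = "{}:{}".format(file_digest(data), token["timestamp"]).encode()
    expected = hmac.new(SERVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["tag"])

video = b"raw video bytes..."
token = issue_timestamp(file_digest(video))
print(verify_timestamp(token, video))            # True: file unchanged
print(verify_timestamp(token, video + b"edit"))  # False: file was modified
```

Note such a token only proves the file existed in that exact form at that time; it says nothing about whether the footage itself is genuine.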

[+] kosei|7 years ago|reply
Is it just me, or does this phrasing make it seem like he's full of it?

"With further deep-learning advancements, especially on mobile devices, we'll be able to produce completely photoreal avatars in real time."

What "deep learning advancements" is he referring to?

Doesn't surprise me at all that he's being sued by his former VP of Engineering for faking results (and for assault and battery).

[+] heiki|7 years ago|reply
Our team at Mirage is working on solving exactly the same problem. Our current prototype allows users to detect deepfakes in YouTube videos. Currently very early stage and any feedback is greatly appreciated. Max video length 60 seconds. Link to demo: https://deepbuster.com/
[+] tsuberim|7 years ago|reply
I think people overestimate the bad consequences and underestimate the good consequences of such things.

1. I see amazing potential in AI based content creation. Imagine a world where you can have all the music/movies you want, personalized to your specific taste. I would love to have an AI watch Avatar The Last Airbender and invent a few more seasons for me (find out what happened to Zuko's mom :))

2. It is true that we should be more careful about what content is genuine vs fake, but cryptography has you covered, anyone can easily digitally sign the content they create with their private key and be able to prove the authenticity of their content.

3. One thing to note is that AI cannot be used to distinguish reals from fakes because the fakes are generated (using GANs) precisely so they can't be distinguished from the reals.
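Point 2 can be sketched with a toy textbook-RSA signature (tiny fixed primes, no padding, purely illustrative and not secure): the creator signs a hash of the clip with the private exponent, and anyone holding the public key can verify it.

```python
import hashlib

# Toy textbook RSA with tiny fixed primes, purely to illustrate the
# flow; real systems use 2048+ bit keys and padding such as RSASSA-PSS.
p, q = 61, 53
n = p * q                            # public modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def digest(msg):
    # Reduce the SHA-256 hash into the tiny modulus; demo only.
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg):
    # Only the holder of the private exponent d can produce this value.
    return pow(digest(msg), d, n)

def verify(msg, sig):
    # Anyone with the public pair (n, e) can check the signature.
    return pow(sig, e, n) == digest(msg)

clip = b"original video bytes"
sig = sign(clip)
print(verify(clip, sig))  # True: signature matches the content
```

A tampered clip changes the hash and, at real key sizes, would fail verification with overwhelming probability.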

[+] twblalock|7 years ago|reply
There is going to be a period of time during which videos are easy to fake but people are still convinced they are real. This will lead to a lot of fake news, as well as wrongful criminal convictions.
[+] Hoasi|7 years ago|reply
This is starting to become impressive. Isn't there a huge opportunity for software able to analyze videos and verify their authenticity?
[+] EGreg|7 years ago|reply
Why not just rely on signatures and watermarks
[+] narrator|7 years ago|reply
Don't believe certain politicians are guilty of certain crimes, even if you see video evidence!
[+] anoplus|7 years ago|reply
The first worry that comes to my mind is that, in a case of extreme inequality, the richest could buy the truth.
[+] yesenadam|7 years ago|reply
I think that's already been the case, ever since there's been lawyers for hire.
[+] imhoguy|7 years ago|reply
Isn't it the way inequality in this world works? Information is the king.
[+] Nasrudith|7 years ago|reply
The thing about video evidence is that it should be treated the way witness testimony should be: verified against other evidence. Manipulative editing can have similar effects, like James O'Keefe's infamous ACORN libel. If someone made a deepfake of Donald Trump shooting someone on Main Street with a rifle, the lack of actual blood, 911 calls in the area, or the like would give it away as a fake even if it were technically perfect.
[+] mgoetzke|7 years ago|reply
Sadly, the LA Times is still not reachable from Europe.
[+] gcb0|7 years ago|reply
Great. Now we are on the other side, with the moon landing deniers?
[+] drummyfish|7 years ago|reply
Very trashy article, doesn't even display in my country.