procedural_love|8 years ago
Isn't the end game an endless stream of personalized content for everyone? Wherein the entire corpus of human-created media becomes a training set for our fantasies.
It is interesting how entertainment is again pushing the boundary of technology. Soon enough, this push to make face-editing tools for porn more accessible will allow anyone to:
1) Replace their ex-husband's face in their old family videos with their new husband's face.
2) Create a viral video of Donald Trump murdering someone.
3) Be the star of their favourite movie, porn or otherwise. (What's the effect this would have on people's memories, when they actively see themselves doing everything James Bond does, for instance? Shooting people, being generally powerful, and "getting the girl"?)
toomanybeersies|8 years ago
An abuser could make images where a person was at an event they were never at, or with a person they never met.
> "You've totally met Steve before, here's a photo of you with him, how do you not remember?"
An abuser could tear down someone's reality more effectively than ever before. If they were having an affair with someone they just met, they could claim the two were old school friends catching up, and simply insert them into an old photo. Textbook gaslighting [1].
Obviously, it's not all bad. There is the potential for this to be used for good as well, but I'm a pessimist.
[1] https://en.wikipedia.org/wiki/Gaslighting
Bartweiss|8 years ago
It does touch on an interesting point, though: we've had roughly 100 years in which photo and audio recreations of events constitute "hard evidence" beyond our ability to fully falsify. It appears that within the next ~20 years we'll lose that reliability - footage of a politician making a dirty deal or a businessman engaging in conspiracy will become deniable not just as a misleading edit, but as outright fabrication.
What do we do at that point? Do smartphone videos get automatically hashed and uploaded to a blockchain somewhere, so that we can prove when the video came into being? Do we return to an 1850s sense of news, where claims effectively cease to be falsifiable except via personal experience? Are we ready for any of this?
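The hash-and-timestamp idea is simple to sketch. Assuming a hypothetical capture pipeline, the device could fingerprint each recording the moment it is saved; only the digest, not the footage itself, would then need to be published to a timestamping service or public ledger to prove the video existed by a given date. A minimal sketch of the fingerprinting step:

```python
import hashlib

def fingerprint(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks
    so arbitrarily large videos never have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Publishing the digest proves only *when* the bytes existed, not that they depict a real event; a fabricated video hashed at creation time gets an equally valid timestamp.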
psyc|8 years ago
* A browser extension that detects if the social media profile you're viewing face-matches any revenge-porn that's out there, and serves it to you
* A phone app that undresses people, or [woman < AGE], or whoever, in real-time via AI-guided compositing. Will this be considered as offensive as putting a mirror on your shoe?
* Digital VR girlfriends/boyfriends à la the movie "Her", except with the face, body, and voice of anyone you choose
Suddenly all these things seem very close at hand.
Swizec|8 years ago
Anyone can see you fake naked at any time. Meh who cares.
Anyone can put you in any random video. Meh who cares.
Anyone can ... meh
We used to think it scandalous or offensive for someone to take photos of us. Now it's just part of being outside. We don't even think about the fact that everyone walks around with a camera.
touristtam|8 years ago
[1] http://www.ibtimes.com/nametag-facial-recognition-app-checks...
dragonwriter|8 years ago
There was a time when it was quite easy to find (without even trying for that specific content) photorealistic rape, snuff, bestiality, and child porn on the public web, without any AI involved.
> Illegal or not, if it’s purely virtual law enforcement is going to focus on the subset of crimes which involve actual human victims.
Actual prosecutions for virtual (generally not photorealistic) child porn in various jurisdictions demonstrate that this is not a hard and fast rule.
scandox|8 years ago
Personally I believe that even unspoken thoughts can have a strong moral dimension for the individual, though of course I see no legal dimension.
One aspect of this will be whether our indulgence of our own negative fantasies weakens our capacity to act rightly when presented with a real-world moral choice, and whether that makes us culpable... or more culpable... if we make the wrong choice.
jstarfish|8 years ago
In the US, all of that is already illegal. If you put yourself in a position where what you possess is indistinguishable from the real thing, the courts err on the side of the potential victim.
Law enforcement's priorities are not going to change; they don't distinguish between what's virtual and what's real. If it looks like CP, you can't point to a producer with valid 2257 documentation, and it isn't obviously a cartoon, you're cooked.
DanBC|8 years ago
I'm not convinced. https://en.wikipedia.org/wiki/United_States_v._Handley
petercooper|8 years ago
We can keep going. Why would that be desirable? It hits the right chemical buttons in the brain. Drag it out far enough and we're really aiming at being blissed out brains in jars being fed shots of endorphins at the right intervals.
monksy|8 years ago
Calm down Charlie Brooker.
opportune|8 years ago
see: https://www.youtube.com/watch?v=ttGUiwfTYvg
icc97|8 years ago
I don't see how this is significantly different.
We could create a viral picture of Trump killing someone now.