top | item 46676558

josfredo|1 month ago

I fail to understand your worry. This will change nothing regarding some people’s tendency to foster and exploit negative emotions for traction and money. “AI makes it easier” — but was it ever hard to stumble across out-of-context clips and photoshops that worked well enough to create divisiveness? You worry about what could happen, but everything already has happened.

acatton|1 month ago

> “AI makes it easier”, was it hard to stumble across out-of-context clips and photoshops that worked well enough to create divisiveness?

Yes. And I think this is what most tech-literate people fail to understand. The issue is scale.

It takes a lot of effort to find the right clip and cut it to remove its context, and even more effort to doctor a clip. Yes, you're still facing Brandolini's law[1]; you can see that in the amount of effort Captain Disillusion[2] puts into his videos to debunk crap.

But AI makes it 100× worse. First, generating an entirely convincing video takes only a bit of prompting and waiting; no skill is required. Second, you can do it at massive scale. You can easily make 2 AI videos a day. If you wanted to doctor videos "the old way" at this scale, you'd need a team of VFX artists.

I genuinely think that tech-literate folks, like myself and other Hacker News posters, don't understand that significantly lowering the barrier to entry to X doesn't leave X equivalent to what it was before. Scale changes everything.

[1] https://en.wikipedia.org/wiki/Brandolini%27s_law

[2] https://www.youtube.com/CaptainDisillusion

SirMaster|1 month ago

Seems rather simple to solve to me.

Just have video cameras (mostly phones these days) embed a cryptographic signature in the video that video-sharing platforms verify and display. That way we'd know a video was recorded with the uploader's camera and not just generated in software.

There aren't that many big tech companies responsible for creating the devices people use to record, or for the platforms and software people use to play back the content.
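A minimal sketch of the sign-at-capture / verify-at-upload idea. Real provenance schemes (e.g. C2PA / Content Credentials) use asymmetric keys held in the device's secure hardware; here a symmetric HMAC key stands in so the sketch runs self-contained, and all names (`DEVICE_KEY`, `sign_recording`, `verify_recording`) are hypothetical:

```python
import hashlib
import hmac

# Hypothetical device key. In a real scheme this would be an asymmetric
# private key in the camera's secure enclave, with platforms verifying
# against a vendor-published public key; HMAC is a stand-in for brevity.
DEVICE_KEY = b"example-device-secret"

def sign_recording(video_bytes: bytes) -> dict:
    """Produce a provenance record the camera would attach to the file."""
    digest = hashlib.sha256(video_bytes).hexdigest()
    tag = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"sha256": digest, "signature": tag}

def verify_recording(video_bytes: bytes, record: dict) -> bool:
    """Platform-side check: bytes unmodified AND signed by the device key."""
    digest = hashlib.sha256(video_bytes).hexdigest()
    expected = hmac.new(DEVICE_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest == record["sha256"] and hmac.compare_digest(expected, record["signature"])

clip = b"...raw video bytes..."
record = sign_recording(clip)
print(verify_recording(clip, record))            # True: untouched footage
print(verify_recording(clip + b"edit", record))  # False: bytes were altered
```

Note the scheme only proves which device produced the bytes, not that the scene in front of the lens was real; a camera pointed at a screen playing an AI video would still sign happily.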

haxiomic|1 month ago

The current situation is not as bad as it can get; this is accelerant on the fire, and it can get a lot worse.

troupo|1 month ago

I've been using "It will get worse before it gets worse" more and more lately

Nevermark|1 month ago

It really isn’t that slop didn’t exist before.

It is that it is increasingly becoming indistinguishable from not-slop.

There is a different bar of believability for each of us. None of us is always right when we make a judgement. But the cues for making good calls without digging are drying up.

And it won’t be long before every fake event has fake support for diggers to find. That will increase the time investment for anyone trying to figure things out.

This isn’t things staying the same. Nothing has ever stayed the same: “staying the same” isn’t a thing in nature, and it hasn’t been the trend in human history.

vladms|1 month ago

True for videos, but not for "text claims", which were already plentiful 10 years ago and already hard to fight (think: misquoting people, strangely citing scientific articles, dubiously interpreting facts, etc.).

But I would claim that "trusting blindly" was much more common hundreds of years ago than it is now, so we may in fact be making some progress.

If people learn to be more skeptical (because at some point they might grasp that things can be fake), it might even be a gain. The transition period can be dangerous, though, as always.

jplusequalt|1 month ago

>AI makes it easier

How many people were getting quote-tweeted on Twitter with deepfake porn of themselves before Grok could remove the clothes from a person's photo with a simple prompt?