Can someone please convince me why I shouldn't be absolutely out-of-my-mind cynical about this innovation? We are literally seeing the downfall of trust in society. And no, I don't believe I'm exaggerating.
We have adapted to monumental shifts in how we develop trust for as long as society has existed: the printing press, photography, the Internet, CGI, and so on.
I don't see this as any different. We will determine new ways of establishing trust. They'll certainly have flaws, as establishing trust in a society always has, but we'll learn to recognize those flaws and hopefully fix them.
Beyond that, what's the alternative? Banning the technology? That doesn't seem feasible for various reasons, not least of which is it isn't going to stop bad actors. Another pretty good reason is it's just not really possible - anyone with enough compute can build LLMs now.
As a bit of an aside, why hasn't society fallen yet? I mean, ChatGPT has been around for a couple years now, and I've been hearing about how LLMs are the single greatest threat to civilized society we've ever faced... yet they don't seem to have had a major impact.
The big difference is that there used to be a very high bar to forging photographs, and most news was gated, with the ability to easily find and sue those guilty of slander or libel.
Now it's utterly simple to forge, to libel, and to slander, and in many cases there is no easy path to suing.
While you can say "yes, but..." to the above, that's the reality we've lived with for 150 years, barring extremely rare edge cases. All of this has changed over the course of a couple of decades, with most of that change in the last ten years, and much of it concentrated in the last two.
Beyond that, it used to take significant effort and labour to create fake stories and images. People had to be experts, or at least skilled wordsmiths. Now it's click, click, and fake generated stories abound. They're everywhere. There's no comparison.
In the time it used to take one person to produce one fake story, you can now generate millions if you have the cash. It's the same problem as spam phone calls and spam email.
You didn't get 1,000 spam letters in the mail in the '80s, because postage cost money. Email was free, so spam became plentiful. Likewise with phone calls: each one used to cost hard cash; now it's pennies per hundred automated calls, so spam calls abound.
The same is happening with all content on the internet. Realistically, the web as we knew it is dead. Even things such as Wikipedia are going to die, as over the next two to three years LLM output becomes utterly indistinguishable from human writing.
If you mean voice cloning, they aren't bringing that tech to market. (Someone else will, though.)
Similarly, Google doesn't let you use face matching in image search to find every photograph of you on the web, even though they could, and quite similar technology is built into Google Photos.
I think the other comments make a good argument about how other forms of technology have also degraded trust, but that we've found a way through. I'll also add that I think one potential way we could reinstate trust is through signed multimedia. Cameras/microphones/etc could sign the videos/audio they create in a way that can be used to verify that the media hasn't been doctored. Not sure if that's actually a feasible approach, but it's one possibility.
It's feasible with advanced enough tech. The hard part isn't getting cameras to sign the files they produce. The hard part is to preserve the chain of custody as images are cropped, rescaled, recompressed etc. You can do it with tech like Intel SGX. But you also need serious defense of the camera platforms against hacking, of the CPUs, of the software stacks. And there's no demand. News orgs feel they should be implicitly trusted due to their brands, so why would they use complicated tech to build trust?
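For what it's worth, the chain-of-custody part can be sketched in a few lines. This is a toy illustration only, with made-up key names and record format, not any real standard: the camera signs a hash of the capture, and each editing step signs the new bytes plus the previous record's signature, so a verifier can walk the chain. (HMAC keeps the sketch in the standard library; a real design would use asymmetric, hardware-backed keys, which is where SGX-style tech comes in.)

```python
import hashlib
import hmac
import json

# Stand-in symmetric keys. A real scheme would use asymmetric signatures
# with hardware-protected keys; HMAC keeps this sketch stdlib-only.
CAMERA_KEY = b"camera-device-key"
EDITOR_KEY = b"editing-tool-key"

def sign(key, payload):
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def make_record(key, data, prev_sig, op):
    """Sign a hash of the media bytes, chained to the previous record."""
    body = {"op": op, "hash": hashlib.sha256(data).hexdigest(), "prev": prev_sig}
    payload = json.dumps(body, sort_keys=True).encode()
    return {**body, "sig": sign(key, payload)}

def verify(key, record, data):
    """Check the record's signature and that it matches the media bytes."""
    body = {k: record[k] for k in ("op", "hash", "prev")}
    payload = json.dumps(body, sort_keys=True).encode()
    return (hmac.compare_digest(record["sig"], sign(key, payload))
            and record["hash"] == hashlib.sha256(data).hexdigest())

# The camera signs the original capture.
raw = b"...raw sensor data..."
capture = make_record(CAMERA_KEY, raw, None, "capture")

# An editing tool crops the image and signs the result, chaining back
# to the capture record so the full history can be walked.
cropped = raw[3:-3]  # stand-in for a real crop
edit = make_record(EDITOR_KEY, cropped, capture["sig"], "crop")

assert verify(CAMERA_KEY, capture, raw)
assert verify(EDITOR_KEY, edit, cropped)
assert not verify(EDITOR_KEY, edit, cropped + b"tampered")
```

The hard problems mentioned above (hacked cameras, compromised CPUs and software stacks) are exactly the parts this sketch waves away: everything depends on the keys staying secret inside trusted hardware.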
The way I see it, LLMs are just making the flaws in our existing levels of trust more obvious. Humans have never had access to universal truths, or universal ways of validating anything. Any claim anybody makes could be intentionally or unintentionally deceitful or untrue. The idea that there are sources you can trust to do your thinking for you is the more dangerous illusion in my opinion, and I'm not convinced that society will be harmed by poking some holes through it.
> The idea that there are sources you can trust to do your thinking for you is the more dangerous illusion in my opinion, and I’m not convinced that society will be harmed by poking some holes through it
There is no alternative to this idea. It is completely impossible for an individual to possess all of the knowledge of everything that affects their lives. The only option for getting some of this information is going to trusted sources that compile it and present some conclusions.
This applies just as much to scientific knowledge as it does to medicine or to politics.
If you want to avoid trusting any authority, it's hard to even confirm that North Korea exists. Confirming that it is ruled by an authoritarian regime and that it possesses nuclear weapons is impossible. And yet it's a trivial bit of info that everyone agrees on - imagine what avoiding trusted authorities would do to knowledge about other more subtle or more controversial topics.
> The idea that there are sources you can trust to do your thinking for you is the more dangerous illusion in my opinion
The difference between economically successful countries like the US and peripheral countries is that the US is a high-trust society.
I don't spend 100 hours chemically testing my food because I have faith it is safe to eat. I don't waste money on scam after scam because I have faith most businesses are legitimate. If I'm a business, I can order stuff and more stuff and I trust the spec.
Our outsourcing of that trust to other people is what makes us economically successful.
Other countries which don't have this trust focus on basic tasks: gathering food, water, shelter, and basic infrastructure. Because ultimately every man is out for himself. They aren't building software and airplanes and whatnot, because as complexity increases, more people are involved, and therefore more trust is required. Trust is required because of the fundamental limitations of human meatspace: we have limited time and survival needs.
Attacks on trust have been around for as long as we have had trust. Generative AI makes some attack vectors easier, but it's nothing new: if your trust is earned using a voice you recognise, then your model for trust has been broken since before most of us were born.
jay_kyburz|1 year ago
We should train users to ignore news from unverified sources.
We should observe and track the reputation of journalists, and stop broadcasting testimony that is untrustworthy.
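A toy sketch of what "tracking reputation" could mean mechanically, with entirely made-up source names: score each source by the fraction of its checked claims that held up, with a neutral prior so unknown sources start at 0.5 rather than fully trusted or fully distrusted.

```python
from collections import defaultdict

class ReputationTracker:
    """Track per-source (confirmed, refuted) claim counts."""

    def __init__(self):
        # Start every source at (1, 1): a neutral Beta(1, 1)-style prior.
        self.counts = defaultdict(lambda: [1, 1])

    def record(self, source, claim_held_up):
        self.counts[source][0 if claim_held_up else 1] += 1

    def score(self, source):
        confirmed, refuted = self.counts[source]
        return confirmed / (confirmed + refuted)

    def trustworthy(self, source, threshold=0.8):
        return self.score(source) >= threshold

tracker = ReputationTracker()
for _ in range(9):
    tracker.record("wire-service.example", True)   # 9 claims verified
tracker.record("rumor-mill.example", False)        # 2 claims refuted
tracker.record("rumor-mill.example", False)

assert tracker.score("unknown.example") == 0.5     # no history -> neutral
assert tracker.trustworthy("wire-service.example") # 10/11, above threshold
assert not tracker.trustworthy("rumor-mill.example")  # 1/4, below threshold
```

The hard part, of course, isn't the arithmetic but deciding who gets to mark a claim as confirmed or refuted; that just moves the trust problem up a level.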
aimazon|1 year ago
https://en.wikipedia.org/wiki/Impersonator
pizzafeelsright|1 year ago
Contracts allow for recompense if there is mistrust. We've been signing contracts forever.
Every automated system needs someone to jail in the event of failure.