I'm reminded of the time GPT4 refused to help me assess the viability of parking a helium zeppelin an inch off of the ground to bypass health department regulations because, as an aircraft in transit, I wasn't under their jurisdiction.
The other side of this problem is the never ending media firestorm that occurs any time a crime or tragedy occurs and a journalist tries to link it to the perpetrator’s ChatGPT history.
You can see why the LLM companies are overly cautious around any topics that are destined to weaponized against them.
> You can see why the LLM companies are overly cautious around any topics that are destined to weaponized against them.
It's not that at all. It's money.
The law is currently ambiguous regarding LLMs. If an LLM causes harm it hasn't been defined if the creators of the LLM are at fault or the end user.
The IT companies would much prefer the user be at fault. Because if it's the other way then it becomes a minefield to build these things and will slow the technology way down.
But there have been a number of cases already from suicide to fraud related to LLMs. So it's only a matter of time before it gets locked down.
Of course removing safeguards on an LLM makes it quite clear that the person who did that would be at fault if they ever used it in the real world.
I mean, when kids are making fake chatbot girlfriends that encourage suicide and then they do so, do you 1) not believe there is a causal relationship there or 2) it shouldnt be reported on?
I remember when it first came out, I was watching an Agatha Christie movie where somebody got chloroformed and was trying to ask GPT4 about the realism of if. Had to have a multi-turn dialog to convince it I wasn’t trying chloroform anyone and was just watching a movie.
Technically in their airspace though so you might be in bigger trouble than parking.
If you tether it to an asphalt ground hook you can claim it’s a tarmac and that it’s “parked” for sake of the FAA. You’ll need a “lighter-than-air” certification.
There's that maniac who is building a quad-copter skateboard contraption who got in trouble with the FAA who successfully reported that he was flying, but got fined for landing at a stoplight.
If the spirit of a law is beneficial, it can still be hacked to evil ends.
This isnt the failure of the law, its the failure of humans to understand the abstraction.
Programmers should absolutely understand when theyre using a high level abstraction to a complex problem.
Its bemusing when you seem them actively ignore that and claim the abstraction is broken rather than the underlying problem is simply more complex and the abstraction is for 95% of use cases.
"Aha," the confused programmer exclaims, "the abstraction is wrong, I can still shoot my foot off when i disable the gun safety"
Aurornis|3 months ago
You can see why the LLM companies are overly cautious around any topics that are destined to weaponized against them.
EagnaIonat|3 months ago
It's not that at all. It's money.
The law is currently ambiguous regarding LLMs. If an LLM causes harm it hasn't been defined if the creators of the LLM are at fault or the end user.
The IT companies would much prefer the user be at fault. Because if it's the other way then it becomes a minefield to build these things and will slow the technology way down.
But there have been a number of cases already from suicide to fraud related to LLMs. So it's only a matter of time before it gets locked down.
Of course removing safeguards on an LLM makes it quite clear that the person who did that would be at fault if they ever used it in the real world.
Angostura|3 months ago
Or, as a different way of framing it - when it can be directly linked to the perpetrator’s ChatGPT history
JohnMakin|3 months ago
m4rtink|3 months ago
IshKebab|3 months ago
pants2|3 months ago
andy99|3 months ago
Ironically, if I’d just said “how did people knock someone out with chloroform in the 1930s?” it would have just told me. https://github.com/tml-epfl/llm-past-tense
The models are much better now at handling subtlety in requests and not just refusing.
reactordev|3 months ago
If you tether it to an asphalt ground hook you can claim it’s a tarmac and that it’s “parked” for sake of the FAA. You’ll need a “lighter-than-air” certification.
michaelbuckbee|3 months ago
cyanydeez|3 months ago
This isnt the failure of the law, its the failure of humans to understand the abstraction.
Programmers should absolutely understand when theyre using a high level abstraction to a complex problem.
Its bemusing when you seem them actively ignore that and claim the abstraction is broken rather than the underlying problem is simply more complex and the abstraction is for 95% of use cases.
"Aha," the confused programmer exclaims, "the abstraction is wrong, I can still shoot my foot off when i disable the gun safety"