item 39491510

Lawyer fined for legal filings that included 'hallucinated' AI citations

76 points | ilamont | 2 years ago | universalhub.com

78 comments

[+] mateo1 | 2 years ago
He should have lost his license. There should be a serious fine (at least 10x the value of the time they have wasted other people) and loss of professional license upon the first repeat of the offense. You want to use AI to discover similar cases? You want to use it for ideas on how to structure your line of defense? Good. The moment you submit its output under your name, though, expect repercussions.
[+] solfox | 2 years ago
Fair, but having experienced family court firsthand, the bar does not care if you lie. It's considered fair play in court. How is this any different?
[+] CPLX | 2 years ago
Don't be ridiculous. Technology is confusing.

A person should spend decades of their life studying law and practicing it, and then have their career ended because they didn't realize that this new tool everyone said was so amazing was fundamentally and conceptually different from every search engine they'd seen before, and (unlike what they otherwise would have used, Westlaw) comes up with random made-up but very plausible-sounding precedents when you ask it for legal research?

Embarrassed, called out, fined? Of course. But that's not a serious argument. Unless you think someone should be legally prohibited from ever being paid as a programmer for the rest of their life for blowing up a server or failing to maintain a backup system or something.

This is a mistake, and a pretty easy one to understand. It's not stealing a client's money or tampering with a witness or something.

[+] BlueTemplar | 2 years ago
The first one is not a lawyer, so cannot be disbarred.

The second one allegedly only quickly checked the examples provided by his associates.

[+] m_eiman | 2 years ago
Can we stop calling it "hallucinations" when AIs give incorrect information, please? It's just a clever PR stunt to sweep the glaring issues under the carpet.
[+] pxmpxm | 2 years ago
Intractable model error that's elemental to the approach won't get you any funding though.

Anthropomorphizing statistical learning is how you build a hype machine to cash out people with zero handle on the subject. See the comment below about "AI judges" and "true justice". Just like early electricity, all people see is magic.

[+] add-sub-mul-div | 2 years ago
No, it's good that the public understands that AIs are wrong so regularly that we need a special word dedicated to this one specific manner in which they're wrong.

Generative AI output is becoming inextricably associated with this word, and that's not a bad thing to keep people aware of.

[+] linkjuice4all | 2 years ago
Any recommendations? The public seems to actually understand what this means although it’s just more anthropomorphization of a random bullshit generator.
[+] lolinder | 2 years ago
There was a similar incident last year [0], but in that case the lawyer actually doubled down and, when pressed to produce evidence of the fake cases, submitted screenshots of himself asking ChatGPT to confirm that the cases were real.

At least this one had the good sense to apologize instead!

[0] https://news.ycombinator.com/item?id=36097900

[+] g42gregory | 2 years ago
I am not sure how this is news. If a lawyer submits filings with non-factual things in them, will he get fined? Yes. How should this be any different when using AI tools? Are AI tools statistical in nature, so that they will sometimes produce incorrect answers? For sure. Therefore, proceed with caution and verify the entire filing, as you are the one signing it.
[+] leobg | 2 years ago
Congrats for having judges who actually read the citations. As a lawyer in Germany, I find that most judges will barely read the body of a brief, let alone look up citations.
[+] yieldcrv | 2 years ago
Sloppy lawyer, they need to clean their office up
[+] danielfoster | 2 years ago
I found it astounding that the judge accepted the lawyer's excuse that two recent law graduates were responsible for the AI citations. No recent graduate would use AI in this manner. Everyone deserves a second chance, but I would have increased the fine to $10,000 for this clear failure to accept responsibility.
[+] lolinder | 2 years ago
Why would no recent graduate use AI in this manner?

I'd imagine that a lot of law schools have started warning students against it, but in other fields there's often quite a pronounced lag between changes in the field and changes in the curriculum. Even if the best schools have adapted, I'd be shocked if there aren't a bunch that haven't.

[+] frognumber | 2 years ago
Fines for lawyers are set at slap-on-the-wrist levels. Violating legal ethics almost always carries higher expected benefit than expected penalty, which is why most lawyers are crooks (even ones who don't want to be). It's impossible to be a competitive lawyer in many domains while still following legal ethics.

In this case, generously assuming 10% odds of getting caught, 10% x $2,000 = $200 expected cost per violation, so less than one billable hour.
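The expected-value arithmetic above can be sketched as follows; the 10% detection probability and $2,000 fine are the commenter's assumed figures, not data from the case:

```python
# Expected penalty per violation, using the (assumed) numbers from the
# comment: a 10% chance of getting caught and a $2,000 fine.
P_CAUGHT = 0.10  # assumed probability of being caught
FINE = 2000      # assumed fine in dollars

expected_penalty = P_CAUGHT * FINE
print(expected_penalty)  # 200.0 -- less than one billable hour
```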

[+] solfox | 2 years ago
One day, not too long from now, the judge will be an AI, lawyers will not exist, and true justice will be here.
[+] ysofunny | 2 years ago
you may as well prepend "in the kingdom of heaven" like so:

"in the kingdom of heaven the judge is a perfect AI, lawyers do not exist, and true justice does happen normally"

[+] jprete | 2 years ago
Why, exactly, do you believe that AI will be more just than human judges?
[+] zer00eyz | 2 years ago
No, god no. Fuck no. I will get off the couch and take up arms if we even get close to thinking about this.

After Google's recent "alignment" debacle, never.