top | item 45381192

kbos87 | 5 months ago

This is like saying that self-driving cars won't ever become a thing because someone behind the wheel needs to be there to take the blame. The article cites AI systems that the FDA has already cleared to operate without a physician's validation.

tw04|5 months ago

> This is like saying that self-driving cars won't ever become a thing because someone behind the wheel needs to be there to take the blame.

Which is literally the case so far. No manufacturer has shown any willingness to take on the liability of self-driving at any scale to date. Waymo has what, 700 cars on the road? And that's with the finances and lawyers of Google backing it.

Let me know when the bean counters sign off on fleets in the millions of vehicles.

hnaccount_rng|5 months ago

You also have Mercedes taking responsibility for their traffic-jam-on-highway autopilot. But yeah, it's those two examples so far (not sure what exactly the state of Tesla is, but I'm not going to spend the time to find out either).

CSSer|5 months ago

I'm curious how many people would want a second opinion (from a human) if they're presented with a bad discovery from a radiological exam and are then told it was fully automated.

I have to admit if my life were on the line I might be that Karen.

rogerrogerr|5 months ago

A bad discovery probably means your exam will be read by someone qualified anyway, like the surgeon or doctor tasked with correcting it.

False negatives are far more problematic.

captainkrtek|5 months ago

I'd be more concerned about the false negative. My report says nothing found? Sounds great, why would I bother getting a second opinion?

mike_ivanov|5 months ago

Since when is self-care being a Karen?

aprilthird2021|5 months ago

The FDA can clear whatever they want. A malpractice lawyer WILL sue and WILL win whenever an AI mistake slips through and no human was in the loop to fix the issue.

It's the same as saying we can save time and money if we just don't wash our hands when cooking. Sure, it's true. But someone WILL get sick, and we WILL get in trouble for it.

fkyoureadthedoc|5 months ago

What's the difference in the lawsuit scenario if a doctor messes up? If the AI has the same or better error rate than a human, then insuring it should be cheaper. If there are no regulatory blocks, I don't see how it doesn't ultimately just become a cost comparison.

the_real_cher|5 months ago

Yeah, but at some point the technology will be sufficient and it will be cheaper to pay the rare $2 million malpractice suit than a team of $500,000/yr radiologists.

There's an MBA salivating over that presentation somewhere.
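To make that concrete, here's a back-of-the-envelope break-even sketch using the figures from the comment ($2M per suit, $500k/yr per radiologist). The team size, read volume, AI overhead, and suit rate are purely illustrative assumptions, not data from the article:

```python
# Back-of-the-envelope break-even: when does AI plus occasional lawsuits
# undercut a human radiology team? All inputs are illustrative assumptions.

def annual_cost_human(team_size, salary=500_000):
    """Yearly cost of a team of radiologists (salary from the comment above)."""
    return team_size * salary

def annual_cost_ai(reads_per_year, suit_rate, suit_cost=2_000_000,
                   ai_overhead=1_000_000):
    """Yearly cost of an AI pipeline: fixed overhead plus expected payouts.

    suit_rate is the assumed fraction of reads that produce a successful
    malpractice suit; suit_cost is the $2M figure from the comment.
    """
    return ai_overhead + reads_per_year * suit_rate * suit_cost

human = annual_cost_human(team_size=10)   # $5M/yr for ten radiologists
ai = annual_cost_ai(reads_per_year=100_000, suit_rate=1e-5)  # ~1 suit/yr
print(human, ai)  # 5000000 vs 3000000.0: AI wins until suit_rate ~ 2e-5
```

The point the sketch makes is that the comparison hinges entirely on the suit rate: at these assumed numbers the AI side breaks even around two successful suits per 100,000 reads, which is exactly the kind of sensitivity an actuary (or salivating MBA) would be modeling.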

alexpotato|5 months ago

This is essentially what's happened with airliners.

Planes can land themselves with zero human intervention in all kinds of weather conditions and operating environments. In fact, there was a documentary in which the plane landed so precisely that you could hear the tires hitting the centerline markings as it touched down and then taxied.

Yet we STILL have pilots as a "last line of defense" in case something goes wrong.

frenchman_in_ny|5 months ago

No, planes cannot "land themselves with zero human intervention". A CAT III autoland on a commercial airliner requires a ton of manual setting of systems, plus certificated aircraft and runways, in order to "land itself" [0][1].

I'm not fully up to speed on the Autonomi / Garmin Autoland implementation found today on Cirrus and other aircraft -- but it's not for "everyday" use for landings.

[0] https://pilotinstitute.com/can-an-airplane-land-itself/

[1] https://askthepilot.com/questionanswers/automation-myths/

victorbjorklund|5 months ago

One difference is that the cost of the pilots is tiny compared to everything else that goes into a flight, whereas I would bet the doctor's time is a much bigger share of the cost of getting an X-ray read.

UltraSane|5 months ago

Tesla still hasn't accepted liability for crashes caused by FSD. They in fact fight any such claims in court very vigorously.

otterley|5 months ago

They have settled out of court in every single case; none has gone to trial. This suggests that the company is afraid not only of the amount of damages that could be awarded by a jury, but also of setting legal precedent that holds them or other manufacturers liable for injuries caused by FSD failures.

avh02|5 months ago

Tesla isn't the north star here

trueismywork|5 months ago

At the end of the day, a decision needs to be made, and decisions have consequences. In our current society there is only one way we know of to make sure a decision is taken with sufficient humanity: putting a human in charge of it and holding them responsible.

hliyan|5 months ago

Very questionable reasoning: using a traffic analogy to argue against medical reality.

constantcrying|5 months ago

Medicine does not work like traffic. There is no reason for a human to care whether the other car is being driven by a machine.

Medicine is existential. The job of a doctor is not to look at data, give a diagnosis and leave. A crucial function of practicing doctors is communication and human interaction with their patients.

When your life is on the line (and frankly, even if it isn't), you do not want to talk to an LLM. At minimum you expect that another human can explain to you what is wrong with you and what options there are for you.

victorbjorklund|5 months ago

You often don't speak to the radiologist anyway. Lots of radiologists work remotely and don't meet or speak with every patient.

philipallstar|5 months ago

There's some sort of category error here. Not every doctor is that type of doctor. A radiologist could be a remote interpretation service staffed by humans or by AI, just as sending off blood for a blood test is done in a laboratory.

FireBeyond|5 months ago

> There is no reason for a human to care whether the other car is being driven by a machine.

What? If I don't trust the machine or the software running it, I absolutely do care, given that I have to share the road with that car and its mistakes are quite capable of killing me.

(Yes, I can die in other accidents too. But saying "there's no reason for me to care if the cars around me are filled with people sleeping while FSD tries to solve driving" is not accurate.)

ACCount37|5 months ago

So, you need a moral support human? Like a big plushie, but more alive?