top | item 38455939


blipmusic | 2 years ago

It depends on who the end user is. As an aid for a trained physician, who is in a better position to spot the hallucinations, it may be fine, whereas a self-medicating patient could be at risk. We absolutely need more resources in healthcare throughout the world, and it may be that these models, or even AGI, have great potential as a companion for e.g. Doctors Without Borders, or even at the local hospital in the future. But there's quite a bit more nuance to giving medical advice than to perfecting a self-driving car.


robertlagrant | 2 years ago

A self-driving car can cause incredible damage straight away; I don't think you should underestimate that. But we also don't have enough healthcare access, so the need there is more urgent than the need for automated drivers, whose health benefit is often just reducing the risk of driving while tired or intoxicated.

Yes, a patient could be at risk - but they're at risk from everything, including a poorly trained or out-of-date doctor. And they're even more at risk from simply not having access to a doctor at all. That's the point: there's risk on both sides, and weighing competing risks is not whataboutism.