It depends on who the end user is. As an aid for a trained physician, who is better positioned to spot hallucinations, it may be fine, whereas a self-medicating patient could be at risk.
We absolutely need more resources in healthcare throughout the world, and it may be that these models, or even AGI, have great potential as a companion for e.g. Doctors Without Borders or even at the local hospital in the future. But there’s quite a bit more nuance to giving medical advice compared to perfecting a self-driving car.
robertlagrant|2 years ago
Yes, a patient could be at risk — they're at risk from everything, including a poorly trained or outdated doctor. And they're even more at risk from not having access to a doctor at all. That's the point: there's risk on both sides, and weighing competing risks is not whataboutism.