item 39014347

pineal | 2 years ago

An important first step. I am early in my practice and I fully expect medicine as I know it today to become unrecognizable within my lifetime. I am optimistic that it will be done carefully and will ultimately tremendously benefit our patients.

That being said, I had to laugh: this is one hyperbolic headline. There are, of course, some caveats.

"Few efforts to harness LLMs for medicine have explored whether the systems can emulate a physician's ability to take a person's medical history and use it to arrive at a diagnosis. Medical students spend a lot of time training to do just that, says Rodman. 'It's one of the most important and difficult skills to inculcate in physicians.'"

This does take a lot of training and experience. The challenge in diagnosis isn't taking a history and integrating the information; it's effectively encouraging patients to provide necessary information that they might not realize is relevant or know how to articulate. Different patients often require wildly different approaches. Medical literacy does play a role, but a patient who would say, "Hi doctor, I experienced central chest pain accompanied by discomfort in the upper stomach that happened two hours ago" (from pg. 33 of the preprint) is not realistic. More likely you get a vague complaint of "heartburn that started a while ago".

Similarly: "Currently, I'm not on any prescribed medications". More frequently you get something like "I take a blue one" or "Golly Telly" (Go-Lytely) or "Gabatini" (Gabapentin). I do think an LLM could probably parse these, but such idiosyncrasies in a history do compound. And though someone may be prescribed a med and believe they take it as prescribed, sometimes it takes a hunch and clinical experience to tease out that the med is in fact not being taken as indicated at all.

Moreover, the better bedside manner was assessed via text conversation; I wouldn't quite call that "bedside" manner. I also wonder how such a system will deal with patients who have self-diagnosed and looked up the "right answers" to get what they want, a difficult reality that takes some parsing to figure out what's real and what's not.

Overall, though, the preprint does a better job of discussing these limitations than the Nature news article and its headline. I congratulate the authors on excellent work. Thank god LLMs can't do surgery, though I'm sure my time will come as well.
