No patient info is sent. In my colleague's example, you can try out a similar query, "Write an appeal letter to a medical insurance company for a patient who needs a biopsy for a bone lesion given prior unclear diagnosis."
tbh the ChatGPT saga has exposed just how willing people are to send their own/company's/client's/patient's data to a third party without a second thought.
It's trivial to re-identify someone from shockingly limited data. Just from the submission to an outside service, they know a date, a location, and whatever information was submitted. That's plenty, especially if the query referenced a procedure or comorbidities.
That is an obviously incorrect assumption: it is possible to de-anonymize most data sets, and there is reason to believe this one is no different. Health data is by its nature very personal and specific.
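The "trivial to re-identify" claim above is essentially a linkage attack: join the few quasi-identifiers an outside service learns (date, location, referenced procedure) against a nominally de-identified data set and see how many candidates survive. A minimal sketch, with entirely fabricated toy data:

```python
# Toy linkage attack on a "de-identified" claims table.
# All records and field names here are fabricated for illustration.

deidentified_claims = [
    {"zip": "02139", "visit_date": "2023-01-05", "procedure": "bone biopsy"},
    {"zip": "02139", "visit_date": "2023-01-05", "procedure": "flu shot"},
    {"zip": "90210", "visit_date": "2023-01-07", "procedure": "bone biopsy"},
]

# Roughly what a third-party service learns from a single submission:
# when it arrived, where from, and what the query mentioned.
observed = {"zip": "02139", "visit_date": "2023-01-05", "procedure": "bone biopsy"}

# Keep only records consistent with every observed quasi-identifier.
matches = [
    record for record in deidentified_claims
    if all(record[key] == value for key, value in observed.items())
]

print(len(matches))  # a single surviving candidate means re-identification
```

With three weak identifiers, the candidate set here already collapses to one record; real linkage attacks work the same way, just against voter rolls, discharge summaries, or other public side data.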
Why would a letter to an insurance company on behalf of a patient not include PII? The letter itself is surely mostly PII. And is almost certain to contain privileged information.
Probably patients consent to using software to process their data, and ChatGPT is considered just another piece of software in the stack.
Edit: parent poster said no info is sent.
That's not how it works; PHI and HIPAA requirements usually supersede consents like these. That's why hospitals can't just have everyone opt out of data protections as a condition of receiving services.
They probably consent to software that's been validated and is managed by their medical provider. I can tell you for sure that I personally wouldn't consent to a random script thrown together by an MD with LeetCode experience that sends your medical data up to an experimental service.