top | item 34902702

CabSauce|3 years ago

You're sending patient information to a third party without any contractual agreement?

aabajian|3 years ago

No patient info is sent. In my colleague's example, you can try out a similar query, "Write an appeal letter to a medical insurance company for a patient who needs a biopsy for a bone lesion given prior unclear diagnosis."

neilv|3 years ago

I'd give the colleague a heads-up right away that there's a potential for "hippo violation" there. (Image: a hippo, stomping all over you.)

Then we'd make sure that our org/administration understood the risk, through whatever channels are appropriate.

alfalfasprout|3 years ago

tbh the ChatGPT saga has exposed just how willing people are to send their own/company's/client's/patient's data to a third party without a second thought.

nl|3 years ago

Without a name or other PII this seems like a misplaced concern.

CabSauce|3 years ago

It's trivial to re-identify someone from shockingly limited data. Just from a submission to an outside service, that service knows a date, a location, and whatever information was included. That's plenty, especially if the query referenced a procedure or comorbidities.
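The re-identification risk described here is the classic linkage attack: join a "de-identified" dataset against a public one on quasi-identifiers like ZIP code, birth date, and sex. A minimal sketch (all names and records below are made up for illustration):

```python
# Toy linkage attack: re-identify "anonymous" medical records by joining
# them with a public dataset on quasi-identifiers. All data is fabricated.
from collections import Counter

# "De-identified" records: names removed, but ZIP + birth date + sex remain.
medical = [
    {"zip": "02139", "dob": "1984-07-31", "sex": "F", "procedure": "bone biopsy"},
    {"zip": "02139", "dob": "1990-01-15", "sex": "M", "procedure": "MRI"},
    {"zip": "10001", "dob": "1990-01-15", "sex": "M", "procedure": "X-ray"},
]

# A public dataset (think: voter roll) with names and the same fields.
public = [
    {"name": "Alice Smith", "zip": "02139", "dob": "1984-07-31", "sex": "F"},
    {"name": "Bob Jones",   "zip": "02139", "dob": "1990-01-15", "sex": "M"},
    {"name": "Carol White", "zip": "10001", "dob": "1990-01-15", "sex": "M"},
]

def quasi_id(rec):
    """The combination of fields used to link the two datasets."""
    return (rec["zip"], rec["dob"], rec["sex"])

# Count how many public records share each quasi-identifier combination;
# a count of 1 means the combination is unique, so the linked medical
# record is fully re-identified.
counts = Counter(quasi_id(p) for p in public)
names = {quasi_id(p): p["name"] for p in public}

for m in medical:
    if counts[quasi_id(m)] == 1:
        print(f"{names[quasi_id(m)]} -> {m['procedure']}")
```

In this toy data every ZIP/birth-date/sex combination is unique, so all three records link to a name. This is the mechanism behind the well-known finding that a large majority of the US population is uniquely identified by just those three fields.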

xordon|3 years ago

That is an obviously incorrect assumption: it is possible to de-anonymize most data sets, and there is reason to believe this one is no different. Health data is by its nature very personal and specific.

mtlmtlmtlmtl|3 years ago

Why would a letter to an insurance company on behalf of a patient not include PII? The letter itself is surely mostly PII. And is almost certain to contain privileged information.

cardosof|3 years ago

Probably patients consent to using software to process their data and chatgpt is considered just another software in the stack. Edit: parent poster said no info is sent.

bfeynman|3 years ago

that's not how it works; PHI and HIPAA requirements usually supersede arrangements like that. That's why hospitals don't just have everyone opt out of data protections if they want services.

CabSauce|3 years ago

That would require a Business Associate Agreement between the provider and OpenAI.

lp0_on_fire|3 years ago

They probably consent to software that's been validated and managed by their medical provider. I can tell you for sure that I personally wouldn't consent to a random script thrown together by an MD with leetcode experience that sends your medical data up to an experimental service.