top | item 34903056

xordon | 3 years ago

That is an obviously incorrect assumption; it is possible to de-anonymize most data sets out there, and there is reason to believe this one is no different. Health data is by its nature very personal and specific.

xmprt | 3 years ago

Doctors publish case studies all the time that contain anonymized data. Presumably those go through review to make sure nothing is being leaked. Health data is by its nature specific, but not very personal (at least not identifiable).

Also, depending on what you're using ChatGPT for, this is no worse than Googling something which doctors do a lot as well.

andrepd | 3 years ago

>consent

catchnear4321 | 3 years ago

This is the prompt that was listed.

“Write an appeal letter to a medical insurance company for a patient who needs a biopsy for a bone lesion given prior unclear diagnosis.”

Add an arbitrary IP address and timestamp and you are still very far from anything personally identifying. (Where does your computer suggest you are right now?)

nl | 3 years ago

De-anonymizing is usually done by combining datasets. That seems unlikely here.
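The linkage attack nl alludes to can be sketched with a toy example. Everything below is invented for illustration (the dataset names, field names, and records are hypothetical): an "anonymized" health dataset that keeps quasi-identifiers like zip code, birth year, and sex can be joined against a second, public dataset (such as a voter roll) that carries those same fields alongside real names.

```python
# Hypothetical linkage-attack sketch. All records and field names are
# made up; the point is only to show how two datasets combine.

# "Anonymized" health records: names removed, but quasi-identifiers remain.
health_records = [
    {"zip": "02139", "birth_year": 1984, "sex": "F", "diagnosis": "bone lesion"},
    {"zip": "94103", "birth_year": 1990, "sex": "M", "diagnosis": "fracture"},
]

# A second, public dataset (e.g. a voter roll) that pairs names with the
# same quasi-identifiers.
voter_roll = [
    {"name": "Alice Smith", "zip": "02139", "birth_year": 1984, "sex": "F"},
    {"name": "Bob Jones", "zip": "94103", "birth_year": 1990, "sex": "M"},
]

def link(records, roll):
    """Join the two datasets on the shared quasi-identifier tuple."""
    key = lambda r: (r["zip"], r["birth_year"], r["sex"])
    names_by_key = {key(v): v["name"] for v in roll}
    # Each health record whose quasi-identifiers match a voter-roll entry
    # is re-identified: the diagnosis is now attached to a name.
    return [
        {"name": names_by_key[key(r)], "diagnosis": r["diagnosis"]}
        for r in records
        if key(r) in names_by_key
    ]

for match in link(health_records, voter_roll):
    print(match)
```

This is the mechanism behind well-known re-identifications of "anonymized" health data: the attack needs no direct identifiers in the health dataset itself, only enough overlapping fields with some other dataset. Which is also why nl's point cuts the other way for the ChatGPT prompt in question: with no quasi-identifiers to join on, there is nothing for a second dataset to latch onto.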