xmprt|3 years ago
Doctors publish case studies all the time that contain anonymized data. Presumably those go through review to make sure nothing is being leaked; health data is by its nature specific, but not very personal (at least not identifiable).
Also, depending on what you're using ChatGPT for, this is no worse than Googling something, which doctors do a lot as well.

andrepd|3 years ago
That is an obviously incorrect assumption: it is possible to de-anonymize most data sets, and there is reason to believe this one is no different. Health data is by its nature very personal and specific.

catchnear4321|3 years ago
“Write an appeal letter to a medical insurance company for a patient who needs a biopsy for a bone lesion, given a prior unclear diagnosis.”
Add an arbitrary IP address and timestamp and you are very far away from anything personally identifying. (Where does your computer suggest you are right now?)

nl|3 years ago
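The de-anonymization claim in the thread can be illustrated with a classic linkage attack: joining an “anonymized” data set to a public roster on quasi-identifiers such as ZIP code, birth year, and sex. A minimal sketch (all names, fields, and records here are invented for illustration; real attacks use the same join logic at scale):

```python
# Hypothetical linkage attack: a record with no names attached is
# re-identified by matching quasi-identifiers against a public roster.
# All data below is invented.

anonymized_health = [
    {"zip": "02138", "birth_year": 1945, "sex": "F", "diagnosis": "bone lesion"},
    {"zip": "02139", "birth_year": 1980, "sex": "M", "diagnosis": "asthma"},
]

public_roster = [
    {"name": "Alice Example", "zip": "02138", "birth_year": 1945, "sex": "F"},
    {"name": "Bob Example",   "zip": "90210", "birth_year": 1975, "sex": "M"},
]

def link(records, roster):
    """Re-identify any record whose quasi-identifiers match exactly one person."""
    matches = {}
    for rec in records:
        candidates = [
            p for p in roster
            if (p["zip"], p["birth_year"], p["sex"])
            == (rec["zip"], rec["birth_year"], rec["sex"])
        ]
        if len(candidates) == 1:  # a unique match means re-identification
            matches[candidates[0]["name"]] = rec["diagnosis"]
    return matches

print(link(anonymized_health, public_roster))
# {'Alice Example': 'bone lesion'}
```

The point is that none of the joined fields is “identifying” on its own; their combination often is, which is why case-study review boards scrutinize quasi-identifiers and not just names.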