top | item 12813748

arthursilva | 9 years ago

Why hack a blood service system in the first place? Oh my. Not that people should hack anything, but this is extra nasty.


red_admiral | 9 years ago

They didn't. They scanned the IPv4 address space for servers with directory listing enabled and ".sql" files visible, and happened to find one at Australia's Red Cross.
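As an illustration of what "directory listing enabled and .sql files visible" looks like to a scanner (the HTML snippet below is fabricated for illustration): once a server returns an autoindex page, spotting exposed database dumps takes one regex.

```python
import re

def find_sql_links(index_html: str) -> list:
    """Extract hrefs ending in .sql from an autoindex-style directory listing."""
    return re.findall(r'href="([^"]+\.sql)"', index_html)

# Fabricated autoindex fragment, like what a web server emits with
# directory listing enabled
html = '<a href="backup.sql">backup.sql</a> <a href="app.py">app.py</a>'
print(find_sql_links(html))  # -> ['backup.sql']
```

A real scan would just be this check repeated across responsive hosts; the point is how little effort "discovery" takes once the listing is public.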

munin | 9 years ago

The article says that the system wasn't hacked so much as a database backup was left on an unsecured, public site and was discovered. It seems plausible, from the described timeline of events, that the discoverer didn't know what they had until they had downloaded and examined the data.

What kind of sucks here is that there's a complex chain of trust, from the donors, to the medical staff who take the donations and process the paperwork, to a computer technician who for whatever reason exposed the data to the world. Everyone involved in that kind of medical activity has to have an intense and professional respect for patient dignity. The thing that sucks is that you could look at this data and realize that, due to the actions of a nameless computer janitor, the assurance of privacy you gave to the people whose blood you took is worthless.

The other thing that sucks is to think about the kind of care that you should take, as a computer janitor, with this kind of information. Seven million data points about things as sensitive as "at-risk sexual behavior" correlated with full names and addresses? How do you even process how careful you should be? I would have three different people double check everything I did! I would go to sleep every night paralyzed in fear that I made some tiny mistake!

Okay, as an aside: when you're designing IT systems and data storage policies for research, your ethics and review people wouldn't let you do this. More specifically, in my (collective) experience, it seems very, very unlikely that a research institution's ethics review board would let a researcher store data given in confidence associated with non-anonymous identifiers in a computer system. Usually, they ask that you don't even link identifiable information (which you need in order to record that you received consent) with results. They should be totally separate. And for "high-risk" data, they would ask that the consent records not even be stored electronically. (One reason: you might think that data collected "for science" is somehow exempt from disclosure to, say, law enforcement. It isn't. So if you want to do research on, say, drug use in pregnant and nursing mothers, you are literally gathering evidence about crimes that the police could use to prosecute your research subjects. Don't harm your research subjects: make the data that you store, the data that the police could seize with a warrant, useless for that purpose.)
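A minimal sketch of that separation (every name and value here is hypothetical): electronic results are keyed only by a random participant code, and the paper consent form is the only place the code ever meets a name.

```python
import secrets

def new_participant_code() -> str:
    # Random code written on the paper consent form; the electronic
    # record stores only this code, never a name or address.
    return secrets.token_hex(4)

code = new_participant_code()
results_record = {"participant": code, "measurement": 7.2}  # no PII anywhere
```

Seizing the electronic store then yields measurements attached to opaque codes, useless for identifying subjects without the physically separate consent records.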

Anyway, IMO, one failure here is that someone at RC sat down to make a DB schema and didn't see a problem with having fields like "Legal name" and "Permanent address" in the same table (or even the same system!) as fields like "have you engaged in at-risk sexual behavior." Other disciplines have safeguards against even building systems that correlate this kind of information. Why aren't they present here?
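A sketch of what that split could look like, using Python's sqlite3 with two separate stores (the table and column names are invented for illustration): identity and screening answers never share a table, and are joinable only through an opaque donor code held by the application.

```python
import sqlite3
import secrets

# Two physically separate stores: one for identity, one for sensitive
# screening answers. A leaked backup of either one alone reveals far less.
identity_db = sqlite3.connect(":memory:")
identity_db.execute(
    "CREATE TABLE donors (code TEXT PRIMARY KEY, legal_name TEXT, address TEXT)")

screening_db = sqlite3.connect(":memory:")
screening_db.execute(
    "CREATE TABLE answers (code TEXT PRIMARY KEY, at_risk_behaviour INTEGER)")

code = secrets.token_hex(8)  # opaque link between the two stores
identity_db.execute("INSERT INTO donors VALUES (?, ?, ?)",
                    (code, "Jane Doe", "1 Example St"))
screening_db.execute("INSERT INTO answers VALUES (?, ?)", (code, 0))
```

The design choice is the same one the ethics boards above enforce on paper: the sensitive table simply has no column that could hold a name.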

noir_lord | 9 years ago

I'm in the early stages of planning a system that holds medical data as a side project, and the security aspect is giving me real pause about going ahead. I just don't know if I can reach a level of security I'm comfortable with while still allowing users to enter that kind of data.

The existing standards are mostly crap, and I'm a generalist, not a security expert.
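One ingredient that can reduce the blast radius in a system like that (a sketch under my own assumptions, not a standard or a complete answer): never store raw identifiers next to sensitive readings; store a keyed HMAC pseudonym instead, so a leaked database backup can't be reversed without the server-side key.

```python
import hmac
import hashlib

# Placeholder for illustration: in practice this key lives in a secrets
# manager or HSM, never in the database or the code.
SECRET_KEY = b"server-side-key-kept-out-of-the-db"

def pseudonymize(identifier: str) -> str:
    """Keyed one-way pseudonym: stable enough for joins within the system,
    but useless to an attacker who only has the database."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"patient": pseudonymize("jane.doe@example.com"), "reading": 7.2}
```

This doesn't make the system secure by itself, but it changes what a dumped ".sql" file is worth: rows of readings keyed by HMAC digests instead of names and addresses.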