I worked for Hospira last year on the Plum A+ and 360 infusion pumps, but not the PCA. I'm a little surprised such a blatant security hole wasn't caught, considering the magnitude of the regulatory environment we worked under. I'd never worked in that kind of environment before, and I left shortly after a successfully defended audit of our software development and tracking process and systems. (Although how successful was that development and tracking process if a year later this "bug" comes out?) My guess is that because we were beaten over the head day in and day out with "if this software delivers the wrong medication, the patient is probably going to die" and "if you make a mistake in the development process or in change tracking, we can lose the ability to make these devices," the idea of defending against malicious intent was de-emphasized or overlooked.
voidlogic|10 years ago
jwiley|10 years ago
voidlogic|10 years ago
1. My question was unrelated to the poor decision the OP's former employer made to use telnet.
2. Formal verification is the gold standard in correctness, often used for can't-fail things like missiles and satellites.
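To make the contrast concrete: for a function over a finite input domain, you can get a verification-style guarantee by exhaustively checking a safety property over every possible input, rather than sampling a few test cases. This is an illustrative sketch, not Hospira's code; the function name, the rate limit, and the input range are all made up for the example.

```python
# Hypothetical example: a "can't fail" safety property checked over the
# ENTIRE finite input domain, so the property is guaranteed for every
# input the function can receive -- a brute-force cousin of the model
# checking / proof techniques used for missiles and satellites.

MAX_RATE_ML_PER_H = 999  # hypothetical pump hardware limit


def clamp_rate(requested: int) -> int:
    """Clamp a requested infusion rate (mL/h) into the safe range [0, MAX]."""
    return max(0, min(requested, MAX_RATE_ML_PER_H))


# Exhaustive check: every representable request, including out-of-range
# and negative values, must yield a rate inside the safe envelope.
for r in range(-2000, 3000):
    assert 0 <= clamp_rate(r) <= MAX_RATE_ML_PER_H
```

The point of the exercise: once the loop covers the whole domain, the assertion is a proof of the property for this function, not just evidence from sampled tests. Real formal verification tools (model checkers, proof assistants) extend the same idea to domains too large to enumerate.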
lifeisstillgood|10 years ago
So any cost-benefit or risk analysis is going to worry about the drug.
Nitramp|10 years ago
But in my experience, the assumption that a security, quality, or similar certification process correlates with actually secure, high-quality software doesn't have much evidence behind it.
There's a big difference between dotting the i's and crossing the t's according to ISO 9001 and actually caring about software quality, and it seems like the standards make it harder to actually care and to focus on delivering a good product.
mryan|10 years ago