Once again, I call on Michelle Obama or the next first man/lady to make cyber security your cause.
We need to teach our children to understand that only a fool would put critical systems on the public internet. Only a fool would forget to implement account lockout rules or login rate limiting (edit: where appropriate). Only a fool would create software with built-in default usernames and passwords. Hello world, the "admin" username does not have to be admin or administrator. Computer security today is a joke. Almost a total illusion.
We need to pay less attention to Kim Kardashian and more attention to HD Moore.
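For what it's worth, the lockout-plus-rate-limiting idea described above fits in a few lines. Here is a minimal sketch; the thresholds, function names, and in-memory dicts are all made up for illustration, and a real system would persist this state server-side:

```python
import time

# Hypothetical sliding-window rate limiter with account lockout.
# All thresholds are illustrative, not recommendations.
MAX_ATTEMPTS = 5          # failed logins allowed per window
WINDOW_SECONDS = 300      # length of the sliding window
LOCKOUT_SECONDS = 900     # how long an account stays locked

_failures = {}            # username -> list of failure timestamps
_locked_until = {}        # username -> time the lock expires

def record_failure(user, now=None):
    """Record a failed login; lock the account if the window fills up."""
    now = time.time() if now is None else now
    # Drop failures that have aged out of the window, then add this one.
    attempts = [t for t in _failures.get(user, []) if now - t < WINDOW_SECONDS]
    attempts.append(now)
    _failures[user] = attempts
    if len(attempts) >= MAX_ATTEMPTS:
        _locked_until[user] = now + LOCKOUT_SECONDS

def is_locked(user, now=None):
    """Return True while the account's lockout is still in effect."""
    now = time.time() if now is None else now
    return _locked_until.get(user, 0) > now
```

Per the "where appropriate" caveat, lockout in particular needs tuning: an attacker who can trigger it at will can deny legitimate operators access, which matters for the control-system case discussed elsewhere in this thread.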
Due to the nature of critical infrastructure, it would not be advisable to force lockout rules and rate limiting on such devices.
The main issue comes from the long life cycle of the equipment, and from companies not wanting to change to newer, more secure methods for fear of the cost of implementing them.
While I agree with you overall, I don't see the problem being solved by "teaching our children". The problem isn't the public, which either can't be expected to know or is actually already more critical of technology; it's programmers being blinded by money, status, false pride, and overwhelming odds against doing the right thing. The security industry isn't exactly helping either, with a toxic environment that makes building security from the ground up a losing proposition.
How many times do we see new web frameworks on HN without even a mention of security?
Like it or hate it, it's the word that everybody understands. The words "computer" or "digital" or "network" or "data" etc are insufficient. Clearly "computer security" is not the same as "cybersecurity", just as "network security" is different from "cybersecurity".
The Eskimo/Inuit have over 50 words for "snow"; in today's world, we will continue to add words to better describe our digital world.
PS - cyber, cyber, cyber, cyber, cyber. Get used to it.
The article briefly addresses that: it talks about how air gapping is easy to breach with a compromised flash drive.
But speaking from experience in the controls industry, it's the people who write the cheques. I don't work on systems like a nuclear power plant, but the critical systems I help design are absolutely air gapped and secured from all basic user access. The business managers who want to remotely administer their system or check the data logs are the ones who introduce components susceptible to attack, and hopefully integrators learn to stand their ground.
Another issue I see frequently, and not to discredit the brilliant engineers I work with, is that there are too many non-software people in the software field. Too many of the electrically oriented people I work with figure software is just a slap-on system that interfaces the user to the motor, but fail to realize all the pitfalls it creates if not done as rigorously as the electrical design.
Most reactor operation is still human beings (following thick, certified manuals) flipping switches. You don't want buggy software anywhere near a reactor, especially when you're making just a couple of control rod movements a week.
Someone got contract money to report that nuclear is scary and cyber is scary, so cyber nuclear is therefore really scary; conclusion: more money needs to be sent their way.
The interesting parts are not being covered, aside from the reported grubbing for money. I skimmed the 50+ page report, and it's very unusual to have IT security staff on site 24x7. The infosec folks don't share incident reports the way the nuclear physics community always has (at least in the USA). Nobody does drills where they assume the computers are pwned and go manual/verbal. IDS systems are not usually deployed. Patching fixes security holes, which are not tracked, but causes downed systems and uptime hits, which are tracked, so you get one guess as to the priority of patching. The IT supply chain is not managed to the military-aerospace level of examination that, say, welding gear is managed to at the plants.
The journalists' reports of problems are bogus. However, pages 14 through 17 of the actual report were pretty interesting reading. The PLC at Browns Ferry is a typical story: they accidentally DoS'd the VFD controller for a circulation pump, so the other eighty billion procedures that protect the plant kicked in and they shut the plant down; an intentional attack would have had the same result. The Hatch story is a good example of just why plant operators hate patches: a poorly applied patch shut down the plant for days due to a SCADA misreading. If they hadn't patched, the plant would not have been shut down (or maybe it would have been pwned later?). There's a somewhat instructive story about the Korean plant that got its HR database completely pwned, which gets treated as a "nuclear plant attack" even though it was just boring HR pwnage of the kind that could happen to a food store or something.
The report explains in great detail how toxically security is currently implemented: the nuclear engineers set everything up, and once it all works, as the last step, the infosec guys try to sprinkle magic security pixie dust and checkboxes, then try to explain in manager language why that's possibly the dumbest possible way to build a secure system. It's really pretty well written; around page 31 of the report.
Cloudy virtualization confuses people, both nuke and infosec, so you'll have some PLC in a janitor's closet but put the top-secret, secured-by-armed-guards button up in the control room behind lock and key.
Optical data diodes are widespread and they need more, but journalists will report successful attacks on the insecure side as being as dangerous as a hit on the secure side, and management loves to write insecure-side monitoring into procedure, making the ops react as if the primary coolant system just broke, even though it's just some harmless metric-gathering webserver.
I know it's cheating, but verbally, from talking to people, there is a terror around changing default passwords: some jack*ss will change the control rod PLC password to "R@$Gfgsdg" and promptly get hit by a bus moments before someone needs to log in and change something off shift, and now no one can shut down the reactor without a seance. Well, not literally, but close. If you have physically air-gapped gear, the only possible effect of changing the default password is slowing down emergency response; there shouldn't even be a password on a router console port, etc.
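The "optical data diode" mentioned above enforces one-way flow in hardware. As a software analogy only (the address, port, and payload below are hypothetical), the secure side can push telemetry over a connectionless protocol like UDP, which needs no return channel:

```python
import socket

# Software analogy for a data diode link: fire-and-forget UDP.
# There is no handshake and no reply path, so nothing on the
# receiving (insecure) side can talk back to the sender.
def send_telemetry(reading: bytes, dest=("198.51.100.7", 9000)):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(reading, dest)  # one-way: no recv() is ever issued
    finally:
        sock.close()
```

A real diode guarantees the one-way property optically or electrically rather than by protocol choice, which is why a compromise of the monitoring side stays on the monitoring side.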
utefan001 | 10 years ago
http://jklossner.com/computerworld/images/security26.gif
Osaka | 10 years ago
gozo | 10 years ago
at-fates-hands | 10 years ago
Here's an article, published in 2010, about the risks of an attack on the power grid: https://www.nae.edu/Publications/Bridge/TheElectricityGrid/1...
Some interesting takeaways:
Recent media reports, in April 2009, for example, highlighted penetrations of the U.S. electricity system by hackers. In November 2009, 60 Minutes aired a piece confirming rumors of break-ins to the Brazilian energy system in 2005 and 2007. The Nuclear Regulatory Commission confirmed that in January 2003, the Microsoft SQL Server worm known as “Slammer” infected a private computer network at the Davis-Besse nuclear power plant in Oak Harbor, Ohio, and disabled a safety monitoring system for nearly five hours. Fortunately the plant was off-line at the time. In January 2008, the Central Intelligence Agency reported knowledge of four disruptions, or threatened disruptions, by hackers of the power supplies for four cities.
Clearly, these attacks have been ongoing for a while. NAE also points out keeping software updated can lag far behind the threats:
Another problem today is that security patches are sometimes not supplied to end-users, or they are supplied but are not applied for fear of impacting system performance. Current practice is to apply an upgrade/patch only after SCADA vendors have thoroughly tested and validated it, which can sometimes take several months.
yk | 10 years ago
Seriously, when was the last time that someone knowledgeable used the term "cyber"?
jb613 | 10 years ago
timoth | 10 years ago
maze-le | 10 years ago
utefan001 | 10 years ago
loaaa | 10 years ago
TheCapn | 10 years ago
idlewords | 10 years ago
gloves | 10 years ago
Dangerous system created > people look to break into dangerous system for personal gain > result: danger.
VLM | 10 years ago
nitrogen | 10 years ago
transfire | 10 years ago
Nuclear Power Plant Rule #2: Whoever put the nuclear power plant online is to be fired immediately.
archgoon | 10 years ago