cornholio|8 years ago
Mission-critical software is created by processes, not individuals - unlike, say, surgery or legal counsel, which are delivered in person.
A single programmer shouldn't be able to kill using his keyboard, and if he can, something is very wrong with the way that mission critical software is produced.
To maintain the analogy, if a law firm uses the services of paralegals who haven't passed the bar exam or have no formal training, I will not be concerned even if they handle most of the legwork in my case, because I assume they are part of a team and their work is well scrutinized.
bb88|8 years ago
> Mission critical software is created by processes, not individuals.
I worked on software for the military, which was under a process (heavily regimented, I might add), and yet US soldiers still died when the software failed. It wasn't one person with a keyboard, it was a group decision and politics that allowed it to happen.
Processes, in the end, are still run by people. Sometimes people do the thing they think is right, yet end up with a result that is very, very wrong.
allemagne|8 years ago
Software that could kill is a small subset of all software being written, but software that can arguably ruin, or at least wreak moderate havoc on, people's lives (via PII) covers maybe a majority of jobs in the software industry [citation needed].
I don't know that having a "bar exam" is the best way to approach that problem, either. I think laws need to be written that cripple companies that don't follow best security practices and the rest will largely follow.
Maybe that eventually results in a sort of "bar exam" that companies endorse in order to cover their asses, but what are the chances that it will end up being a positive thing for programmers and not a bureaucratic nightmare test that everybody knows is bs?
platinumrad|8 years ago
A large amount (maybe even a majority?) of software written outside of Silicon Valley is "boring" business-logic-y stuff that has very limited ability to impact anyone's life in a meaningful way.
I haven't thought too deeply about this, but the solution to the PII exposure problem, in my opinion, is to heavily disincentivize entities from holding any data they don't absolutely need (probably by punishing them heavily for slipping up).
stale2002|8 years ago
The vast majority of software written today has precisely 0 ability to kill anyone.
Even IF you are working at a company doing one of those things, only a subset of the engineers working there are going to be working on the "core" part of the software. Those companies have lots of web developers writing code that isn't dangerous as well.
To put it another way, reading and writing can kill as well. If the person writing the airplane manual screws up, it could kill someone. But does that mean we need a bar exam for writing English?
bb88|8 years ago
> The vast majority of software written today has precisely 0 ability to kill anyone.
There are lots of ways software can cost corporations money if it fails to operate. The recent failures at Delta, Southwest, and United all come to mind. The IT industry has a myriad of certifications to, say, prevent some tech-idiot from touching the routing table on the routers.
The ISO has standards for software engineering (similar to ISO 9001 for hardware), but so far the market does not feel that standard provides a competitive advantage. If I could guarantee my processes gave you 99.9999% uptime with our software, would you prefer that, or another company that can iterate faster?
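For a sense of scale, a 99.9999% ("six nines") target leaves only about 31.5 seconds of total downtime per year; a quick back-of-the-envelope sketch (the function name is just illustrative):

```python
# Downtime budget implied by an availability target, per non-leap year.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000

def downtime_seconds_per_year(availability: float) -> float:
    """Seconds of allowed downtime per year at the given availability."""
    return SECONDS_PER_YEAR * (1.0 - availability)

print(round(downtime_seconds_per_year(0.999999), 1))  # "six nines": ~31.5 s/year
print(round(downtime_seconds_per_year(0.999), 1))     # "three nines": ~31536 s (~8.8 h)/year
```

That budget is so tight that it has to come from process and redundancy, not from any one programmer being careful.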
platinumrad|8 years ago
OP is clearly talking about a general exam for all software programmers. ("What prevents a general examination to be designed and formulated for software programming work?")
Software engineering for safety-critical fields like aerospace and automotive tends to be fairly regulated already.
awat|8 years ago
I would think that accountability would probably be equally if not more helpful in these situations. If a document had to be signed accepting liability, similar to Sarbanes-Oxley for CFOs, those who would otherwise cut corners would likely do the thorough vetting of credentials themselves.