This was written in 1999; I wonder if his opinion on software engineering practices has changed. I've never worked in industry, but based on what I've read there are some pretty good processes that are supposed to have empirical support, just not many people adopt them. For example, I read a paper from 2002 reporting that basically no companies consistently used good processes.
Absentinsomniac|10 years ago
Most of the material in my Software Engineering course is outdated, but I imagine most companies still try to get away with minimal security and slack on good software engineering practices, like code reviews, static analysis, and thorough security requirements. Using these processes tends to force simpler systems: iterative development with a clearly thought-out implementation per iteration, etc.
jpgvm|10 years ago
The only industries that have good software practices are those that are regulated heavily enough for it to matter: medical, aviation, military, etc.
Unfortunately the vast majority of software is not regulated in this way despite handling data that is potentially just as sensitive from a personal or even financial perspective.
At the end of the day, most companies are motivated to produce software quickly, and quality comes second unless it's causing a reduction in velocity. Even then, this is simply treated as debt that is paid down periodically to maintain velocity.
There are exceptions, such as banks, which rely on the quality of their software to avoid losing money, but we are talking in general here.
A brilliant example of what happens when you have sloppy security/bad incentives AND you underestimate the damage a breach would cause is the Ashley Madison situation. Many companies are in the same situation and have just been lucky enough to avoid a catastrophic data breach.
Sadly velocity is going to be valued over quality (which in turn means security) for the foreseeable future.