Every one of these complaints misses the salient failure of the analogy: do you expect your bridge or your gas line or your house's electrical layout (once it's installed) to be changed every month or quarter to handle new load, produce different results, or even change its feature set? No - these things are planned for and built to do exactly and only the thing they were first designed for. Heck, you can even throw certain software into that mix, like medical device firmware. The software industry is doing exactly what people want of it: becoming highly flexible to individual business and consumer needs in sophisticated, customized interactions.
Those of us who write it are trying as hard as we can to impose sensible and reusable structure on things, but the public wants constant innovation. Not the same "heat my house when I plug this in" demand they had 50 years ago, which would have given us the stability to focus on standards, safety, and efficiency. We're doing the best we can.
Hmmm, I think TFA misses the fact that software engineering isn't really engineering anymore. We don't develop code that gets deployed and stays in place (well, maybe in embedded systems, but even those are getting IoT'ed).
Modern software is a service: it gets updated all the time, it's "hired to do a job" rather than perform a specific function. The latter is finite, controllable and could potentially be certified as an engineering product. The former relies on professional codes (and yes liability as well) to make sure that the service is acceptable.
All that is to say that the proper metaphor isn't licensing as for civil engineering or telco networks but akin to licensing for professional services, like medicine, accounting or law.
Which means different tiers of service, different expectations, and, yes, much different compensation/service fees. But to get there we have to stop talking about 'engineering' as the correct framework. We're closer to lawyers or nurses...
I understand where you are coming from, but this is not completely true. We often build physical infrastructure & buildings that need to evolve and change over time. This sometimes leads to over-building and often over-engineering, but the biggest difference is that we don't usually design & build software to true engineering standards. We move quick and break stuff, fix it after release, waterfall is outdated and maligned, and nobody is held responsible for catastrophic failure.
Most of us are NOT engineers. Even fewer build software as engineers.
Er, no. The public wants software that “just works”. Everybody hates the new normal of eternal beta, or when their familiar and working software is changed to look or function differently, more often than not also removing existing conveniences and features.
Aren't the gas and electrical lines akin to some core capabilities of the system? I have yet to find a web project that doesn't need auth, eventing, logging, etc. I don't expect those to change how they work frequently, just like the guts of the building.
To keep with the metaphor, I DO expect to plug an appliance in, though. Or that a room could be used for another purpose. Obviously changing from an office to a commercial kitchen is a major change but if that's what somebody wants to spend their money to do...
So I guess my thought is, "software engineering" and "engineering" are very broad. I actually think software challenges are not so unique. My experience has been that many people lack intuition about the software in question. Most people live in a building of some kind, and if they asked for only Brawndo in the water pipes, it would be easy to convince them of how short-sighted that is (or they wouldn't even ask). With software I feel like there's still some expectation of magic and zero friction just because it doesn't take a jackhammer to move a wall.
As somebody who switched careers from construction to software development, I disagree with your criticism. Systems in a house are largely decoupled, easily extensible, and modular. It's a routine affair to add a room or second floor. You can replace all the plumbing in a house without touching the electrical. You can easily upgrade an electrical service and add more circuits and outlets to an old house. The nature of stud framing makes it relatively trivial to run ethernet cable through a 70 year old home, or replace ancient single pane windows with the latest triple pane offering. You can swap out your dumb lightswitch for the latest wifi connected smart home nonsense, all without disturbing anything else in the house.
I also disagree that "we're doing the best we can." We're building monstrously complex systems because it pads our resumes, not because it actually solves real problems.
> do you expect your bridge or your gas line or your house's electrical layout (once it's installed) to be changed every month or quarter to handle new load, produce different results, or even change its feature set?
More than that, the reason most tech companies don't have a salaried plumber is because most tech companies don't rely on a plumber to try to increase the revenue of their core business model.
Really nice to see comments like this, often HN feels like software engineers attacking other software engineers with zero empathy for the fact that they're in the exact same position.
> the public wants constant innovation
Seems less true to me than "VCs, entrepreneurs, middle management and MBAs want constant innovation", which is to say "capitalists want constant innovation". The constant churn is a sales-driven impulse to capture more of the market, and it's organizations throwing endless (and mostly pointless) change in order to move a few more units or renew a few more subscriptions. It's not the consumers who think software isn't finished, it's the people selling it.
Think about Office 97 and what you do today. Has anything added to MS Office since then mattered at a wide scale?
Most engineers don't work on static systems. They work supporting things like manufacturing plants, electrical plants, and working on infrastructure which degrades and requires repairs over time. Engineers aren't only involved in the creational part. To me, the software industry is looking more like professional engineering over time and less like a science research project.
People assume software is just like engineering a bridge, but it isn't. Software changes all the time: due to market needs, due to attacks from the outside or inside, due to legal changes, due to executives who want to be promoted, due to technological changes, due to etc., etc. Bridges do not change, as gravity and wind and temperature and chemistry and physics are in the real world, while our software exists in a world of our own making, generally built in myriad layers, often from multiple vendors, and it has to interoperate with myriad other worlds that are also continuously changing, over networks we may not have any control of.
Since I started in the early '80s, every generation has thought it could standardize things, license programming, make things from perfect reusable parts that work perfectly, and generally turn programming into recipes. Every generation fails.
In reality software is a mess because none of those things is possible: complexity is inherent in what we do and in what is expected of us, and there has never been any way to satisfy everything we are asked to do with some magical silver bullshit, err, bullet.
Even if you wanted software "Professional Engineers" to be a thing, how would you even do that, with hundreds of programming languages, operating systems, software environments, and industries, all with different needs, in an industry that changes every single day? Bridges have been built for thousands of years; software has changed radically and continuously since I started in 1981. As soon as you defined some standard to test against, it would be obsolete.
I could complain about software endlessly and I came here to do it, but the comments discussing the analogy breakdown between software and bridge engineering made me wonder - is bridge engineering really so staid and rote? I've seen some beautiful bridges built in the last few decades and it makes me wonder if they get to play around with design as well.
The author's analogy is bad. On the software side, he's talking mainly about security. On the other side, he's talking mainly about build quality. But those two things are not really the same. No homebuilder pays liability if a criminal breaks into your house and steals your stuff. That's security and has nothing to do with build quality. If someone steals your car, does the auto manufacturer pay liability? No.
Computer security is definitely a problem. But this argument by analogy simply doesn't work. Moreover, non-computer security is a problem too. (School shootings, anyone?) Anyway, talking about toilets and whether they work isn't helpful at all in this case.
On the one hand, the problem he discusses is entirely real, and he does not exaggerate the scale of the issue.
It is much like medicine in the time of the "four humours". The state of medicine then was a real problem, but if you had instituted licensing and professionalisation requirements at that point, you would largely have frozen a bad situation into place. The other industries with licensing have established a good, safe way of doing things. Software has no good, safe way of doing things to establish as code and teach to new practitioners. I do not exaggerate, either.
On the other hand, what he suggests as a solution would require that innovation be slowed by at least one, probably two, orders of magnitude. Any nation which did _not_ do this would quickly sprint ahead of those who did, and would quickly be able to overwhelm the capabilities of the nations who had professional licensing requirements.
The author makes a decent case that some sort of licensing should exist for some types of software development at some point in the future. But the devil is in the details, and the merit of the licensing process needs to be proven.
A lot of regulators are valuable, but we've also got a lot of them running around with red tape and power trips they didn't earn. We're extracting thousands of dollars from low-income minorities for hairstylist licenses that benefit no one. We've got engineering boards punishing people for publicly (and correctly) disputing the math of red-light timing. There's too much abuse.
Regulation isn't just one way. Regulators need to be held accountable too.
Can't agree more. I've been saying for years that we should have some professionalism added to the industry.
It doesn't mean that everyone who writes code must be licensed. But to determine if software is fit for purpose you must be. Liability is important and software doesn't have to threaten life and limb to benefit from it. Many engineering professions also protect property and business interests (along with life and limb).
What I hope such a system would provide is a means to prevent companies from cutting corners for the sake of profits and give engineers the right to say what is fit for purpose. Too many breaches that have cost economies too much money happen because IT is incentivized to release now and fix it later. Then hundreds of thousands of people's financial details are leaked or their pensions disappear. And the current system of liabilities doesn't protect the end users or dissuade the software industry from continuing on this destructive course.
We also don't account for externalities well. Consider how many years PoW crypto mining has been going on, growing, and forcing new coal plants to open to keep up with demand. People in rich countries don't care because it's not happening in their backyard and it is promising to make some of them rich. That happened because we have no social safeguards against bad technology harming the environment/society/etc. You can write a software service that exists to use up as much energy as possible and you will never face any consequences for the damage caused.
We barely slap people on the wrist for organized crime let alone corporate crime, negligence, etc.
We're not talking about your high school web project here. We don't have to prevent people from learning and building things on their own. I just think we need to prevent companies from rolling the dice with our future and let the folks who know what they're doing run the show.
I recently got professionally qualified as a software Engineer, with a capital E. I'm still the same regular programmer I ever was! Not sure a piece of paper really solves anything.
> What I hope such a system would provide is a means to prevent companies from cutting corners for the sake of profits and give engineers the right to say what is fit for purpose. Too many breaches that have cost economies too much money happen because IT is incentivized to release now and fix it later.
I find that, at companies that tend to beat market expectations, it's already like that. You can always tell whether they consider engineering a cost center or a core part of the business. It's no accident that the majority of the Fortune 5 CEOs are engineers...
The headline contradicts the conclusion. The software industry already offers exactly the professionally licensed engineers who are subject to the liabilities spoken of. In reality, it is the professional engineering bodies who have not given buyers a reason to want to pay extra for those people. I'm not sure this article improves on that situation; it only offers some fuzzy notion that bad things could happen to a business organization, much the same as what will happen if budgets are stretched paying for PEs.
"The pretence that corporations are necessary for the better government of the trade, is without any foundation. The real and effectual discipline which is exercised over a workman, is not that of his corporation, but that of his customers. It is the fear of losing their employment which restrains his frauds and corrects his negligence. An exclusive corporation necessarily weakens the force of this discipline. A particular set of workmen must then be employed, let them behave well or ill. It is upon this account that, in many large incorporated towns, no tolerable workmen are to be found, even in some of the most necessary trades. If you would have your work tolerably executed, it must be done in the suburbs, where the workmen, having no exclusive privilege, have nothing but their character to depend upon, and you must then smuggle it into the town as well as you can."
> The good news is the ransomware attack on Colonial Pipeline in May 2021 probably marks the beginning of the end. Comforting as that might sound, it tells us very little about how that ending will turn out.
I wonder what level of liability would be reasonable when considering the increasing sophistication of these types of hacks. Going back to the bridge analogy, we would hold the licensed engineer responsible if the bridge they built collapsed because of a thunderstorm, but not if the bridge collapsed because a terrorist or rival nation state bombed it. Which of those situations is more analogous to getting hit with ransomware or some other type of hack?
Say a piece of software from a small company is compromised, should the liability be different if the attacker is some script kiddie versus if it's some hacking group that likely has ties to a foreign intelligence service? Is it reasonable to expect every software company to be able to fend off even the most well-financed attack?
- On one hand, professional liability would indeed be a good thing for overall reliability of our software systems. A "code" for offering software products to citizens would be a good thing. The GDPR is a good step in the right direction for privacy concerns, but there are many other types of concerns that could be covered.
- On the other hand, most of the "real" problems in infrastructure software come from adversarial action. Most software systems run more or less fine by themselves, from a professional liability standpoint. Adversarial action is not usually covered by "professional liability". For example, nobody blames the architects of the Azovstal plant for the fact that its steel furnaces couldn't stand up to sustained artillery bombardment.
- Despite all the talk about how critical software systems are these days, 99% of the systems out there are the equivalent of a garden shed, and nobody would care if they fell over. Certainly the shed in my own garden would not survive a proper hurricane, but that is OK because a. we don't really get hurricanes here and b. the cost to completely rebuild it would be much lower than the cost of bringing it up to hurricane standards. Even Maersk, which is surely not a small target, managed to survive a full blown ransomware hit just fine. (By which I mean, they took a fairly large hit financially, but nobody died and no ships sank. By and large, all vessels even reached their destinations on time.) Their IT had to work a lot of overtime for sure, but "the system" is not as vulnerable as people like to claim.
So what to do? I would not mind some form of legally mandated checklist for any company offering services on the internet, but at the same time I don't think it would be very effective in countering the worst threats, and I also don't think it would be proportionate for most companies.
The shed's a strawman. How do we handle hurricane damage and other "acts of God"? Not in court, but with insurers. Do insurers currently insure properties with sheds? Yes. Do they insure shabby sheds against hurricanes? Maybe - it's up to them, and I'm sure they can figure out whether the odds would pay off for them.
If your shed exploded into millions of shards of glass when struck by high wind that would be a different story. Or if it spread toxic chemicals all over your neighborhood. Or if it caught fire easily and the fire could spread to other buildings. Who would have thought sheds would make such wonderful metaphors?
> The time is way overdue for IT engineers to be subject to professional liability, like almost every other engineering profession. Before you tell me that is impossible, please study how the very same thing happened with electricity, planes, cranes, trains, ships, automobiles, lifts, food processing, buildings, and, for that matter, driving a car.
How will that solve the problem? Even if we were able to agree on a professional code that people could read and sign on to, would it help?
Everything is built on sand. There are zero operating systems that can stand for a year without patching in the face of contact with the internet.
Blaming system administrators, programmers, or users isn't going to help fix the fundamental design flaw at the root of our software.
We need operating systems that are provably secure upon which to build the rest. Then we can apply the principle of least privilege and a series of security policies on top of that.
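As a tiny illustration of least privilege as conventional systems already practice it - this is a hedged sketch, POSIX-only, and the service account name is invented - a daemon can bind its privileged port and then irreversibly drop root before it touches any untrusted input:

    import os
    import pwd
    import socket

    def start_listener():
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.bind(("0.0.0.0", 80))        # needs root (or CAP_NET_BIND_SERVICE)
        sock.listen()

        unpriv = pwd.getpwnam("svc-web")  # hypothetical unprivileged account
        os.setgid(unpriv.pw_gid)          # drop the group first...
        os.setuid(unpriv.pw_uid)          # ...then the user; no way back to root
        return sock                       # all request handling runs unprivileged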
In the meantime, there is one fundamental tool which is underutilized: the data diode. [1] With such a device, it would be possible to monitor a system from the outside world (via the internet, etc.). A polling loop sends data out in a continuous manner (with forward error correction), and a matching server receives the data, checks it for errors, and then makes it available to the outside world.
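To make that concrete, here is a minimal sketch of the software half of such a setup. The physical diode itself (say, a transmit-only fiber link) is assumed, the address and framing are invented for illustration, and repetition plus a checksum stands in for real forward error correction:

    # Sender (inside the protected network): nothing can ever come back,
    # so every frame carries a sequence number and a CRC and is repeated.
    import json
    import socket
    import time
    import zlib

    DIODE_ADDR = ("192.0.2.10", 9000)  # hypothetical receiver beyond the diode
    REPEATS = 3                        # crude redundancy in place of real FEC

    def send_readings(get_reading):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        seq = 0
        while True:                    # the continuous polling loop
            payload = json.dumps({"seq": seq, "data": get_reading()}).encode()
            frame = payload + b"|" + str(zlib.crc32(payload)).encode()
            for _ in range(REPEATS):
                sock.sendto(frame, DIODE_ADDR)
            seq += 1
            time.sleep(1.0)

    # Receiver (outside): validate, de-duplicate, and publish.
    def receive_loop(port=9000):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", port))
        seen = set()                   # unbounded here; prune in practice
        while True:
            frame, _ = sock.recvfrom(65535)
            payload, _, crc = frame.rpartition(b"|")
            if not payload or not crc.isdigit() or zlib.crc32(payload) != int(crc):
                continue               # corrupted frame; a repeat will arrive
            msg = json.loads(payload)
            if msg["seq"] not in seen:
                seen.add(msg["seq"])
                print(msg)             # stand-in for making the data available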
You can also use data diodes to allow submission of information from the world, without the danger of exfiltration. Such a system might have helped prevent the OPM breach of 2015 if the database could only be appended to, and never queried via the network. [2]
1 - https://en.wikipedia.org/wiki/Unidirectional_network
2 - https://en.wikipedia.org/wiki/Office_of_Personnel_Management...
> There are zero operating systems that can stand for a year without patching in the face of contact with the internet.
Similarly, there are also zero buildings, bridges, aircraft, etc. that can realistically withstand attacks by humans, despite being built by engineers who have taken on the liability. A sufficiently powerful bomb will win every time. It seems a bit strange to want to hold software to a higher standard here.
> In the meantime, there is one fundamental tool which is underutilized: the data diode. [1] With such a device, it would be possible to monitor a system from the outside world (via the internet, etc.). A polling loop sends data out in a continuous manner (with forward error correction), and a matching server receives the data, checks it for errors, and then makes it available to the outside world.
Something like that was implemented at Amazon in the early days using a modified serial cable that prevented credit card information from ever exiting the payment system even if the web servers were compromised.
Traditional engineering can be done by unlicensed persons. It's just that the final design must be signed off by a licensed engineer. In the same way, open source software can be contributed to by unlicensed software engineers, but if the system will be used in a production product, it must be signed off by a licensed software engineer.
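One hedged sketch of how such a sign-off could be enforced mechanically: a deploy gate that refuses a release unless its manifest carries a valid signature from the responsible engineer's registered key. The file names and the idea of a licensing-board key registry are hypothetical; this assumes the third-party cryptography package:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def release_approved(manifest: bytes, signature: bytes, pubkey: bytes) -> bool:
        key = Ed25519PublicKey.from_public_bytes(pubkey)
        try:
            key.verify(signature, manifest)   # raises on a bad signature
            return True
        except InvalidSignature:
            return False

    if __name__ == "__main__":
        manifest = open("release-1.4.2.json", "rb").read()
        sig = open("release-1.4.2.sig", "rb").read()
        pub = open("engineer-pe-12345.pub", "rb").read()  # hypothetical registry entry
        if not release_approved(manifest, sig, pub):
            raise SystemExit("no valid engineer sign-off; refusing to deploy")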
> If any science fiction author, famous or obscure, had submitted a story where the plot was "modern IT is a bunch of crap that organized crime exploits for extortion," it would have gotten nowhere
Has Kamp ever read any SciFi published after the '70s? That's a pretty common trope in cyberpunk and subsequent works. Heck, it's a staple of Mr. Robot, hardly a niche production.
Standardization of threads started between private companies around the year 1800 and continued in fits and starts amid a bunch of "competing" private-company standards. Sick of the lack of a unified standard, William Sellers submitted a paper in 1864 proposing one standard based on an already popular form. Eventually the US government adopted it as a requirement in government contracts; it became the United States Standard thread, and later the Unified Thread Standard. The standard is maintained by the American Society of Mechanical Engineers and the American National Standards Institute.
Software has plenty of standard nuts and bolts of its own:
- instruction sets (e.g. x86, ARM)
- 8 bit bytes (though there are two byte orders)
- floating point formats and operations
- programming languages (Java, JavaScript, C++...)
- standard libraries (Java SE, libc, stl...)
- OS APIs (Win32, POSIX...)
- network protocols (HTTP, TLS, TCP/IP, BGP...)
- data formats (HTML, XML, JSON, MIME, protobufs...)
However, the design complexity of software, with potentially dozens of levels of abstraction and millions of lines of code, isn't greatly reduced by using these standard nuts and bolts.
We're currently in the stage where we create all of our own bespoke connectors - glue code. RESTful interfaces are an attempt at solving part of the problem, but they still require a lot of hand fiddling to make them work.
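For a sense of what that glue looks like in practice, here is a sketch of a typical hand-written connector between two REST services. Both endpoints and every field name are invented for illustration; the point is the one-off mapping work:

    # Typical bespoke glue: pull records from one service's REST API,
    # reshape them by hand, and push them into another's.
    import json
    import urllib.request

    SOURCE = "https://crm.example.com/api/v2/contacts"    # hypothetical
    SINK = "https://billing.example.com/api/customers"    # hypothetical

    def sync_contacts():
        with urllib.request.urlopen(SOURCE) as resp:
            contacts = json.load(resp)
        for c in contacts:
            # the bespoke part: names, nesting, and formats never line up
            customer = {
                "full_name": f"{c['firstName']} {c['lastName']}",
                "email": c["emailAddress"].lower(),
                "address": {"postcode": c["zip"]},
            }
            req = urllib.request.Request(
                SINK,
                data=json.dumps(customer).encode(),
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            urllib.request.urlopen(req)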
While mechanical connectors have existed for millennia, standardized connectors were a set of fairly recent innovations from the industrial revolution [1], with notable standardizations like unified thread as recent as post-WWII. So they're both new and took time for all of the standardizations to settle to what we have now. And the state of mechanical connectors is not static. There is constant innovation. Consider medical, military, and other applications with special demands.
Seen in that light, software is very rapidly standardizing. Like everyone else here, I feel your pain, though.
[1] https://www.nord-lock.com/insights/knowledge/2017/the-histor...
Those of us experienced in DevOps have been trying to set standards for repo/project structure, deployments, etc. for quite a while, but it seems like every developer has their own idea based on whatever they did the first time they read a blog post about "how to build X app", and then will use a different structure after they consume their next Medium article. Or, even better, they've been "doing it this way for 10 years", or "I heard Facebook/Amazon, etc. does X".
Because we're still stumbling on what is and isn't a "bad practice" while companies continue to gaslight the developer population and a few consultants and spokesmen have endless discussions on what should and shouldn't happen.
Ergo, there's no incentive to fix it, and every incentive to keep it broken.
Most software doesn’t kill people or cause any other type of financial/physical damage if it falls over. So regulating all software is a bad idea. However I 100% believe that all software that is critical to saving lives/avoiding damage should be formally proven correct using proof assistants. The people who are qualified to do this deserve to be licensed Software Engineers. People who can’t do this should not be called Software Engineers and should not be legally allowed to work on safety critical software.
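For a taste of what a proof assistant buys you, here is a toy Lean 4 example: the property below is checked for every possible input, not just the cases a test suite happens to sample. (Real safety-critical proofs target system-level properties, but the mechanism is the same.)

    -- Machine-checked: appending lists preserves total length.
    theorem append_length {α : Type} (xs ys : List α) :
        (xs ++ ys).length = xs.length + ys.length := by
      induction xs with
      | nil =>
        simp  -- [] ++ ys = ys, and 0 + n = n
      | cons x xs ih =>
        -- peel off one element, rewrite with the induction hypothesis,
        -- then close the remaining arithmetic goal
        simp only [List.cons_append, List.length_cons, ih]
        omega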
> The time is way overdue for IT engineers to be subject to professional liability, like almost every other engineering profession. Before you tell me that is impossible, please study how the very same thing happened with electricity, planes, cranes, trains, ships, automobiles, lifts, food processing, buildings, and, for that matter, driving a car.
But the people responsible for the Colonial Pipeline are not Software Engineers. SWEs are just workers. Why don't we start to hold the people accountable who extract wealth and give orders? Power and responsibility should go hand in hand.
SWEs _want_ to engineer, make things more performant and secure. They _want_ to say No to unnecessary features. This takes time, trust, money. What about the owners? The board? The executives? They make the decisions; they want fast growth, more features, lower expenses, software to be a commodity, and engineers to be interchangeable. Managers and sales people, who buy and sell software packages with fancy names on them to get a promotion, don't do the soul sucking work of integrating with a buzzword driven hodgepodge. Shouldn't they be held accountable first?
With that out of the way...
The example in the article is the Colonial Pipeline ransomware attack. A cyber _attack_. Are bridges, toilets and buildings generally built to withstand arbitrary hostile attacks?
But lack of understanding of how decisions in the IT industry are made, ridiculous examples, and bad analogies aside: I still think there is a good point hidden in there somewhere. Namely, that good engineering takes time, investment, auditing, standard processes, and a strong technological foundation. But we are not there yet. Nobody is laying the foundation to get us there. It's just iterations upon iterations, layers upon layers, features upon features.
Before we can even start to think about licensing SWEs, we need to start thinking _very_ long term. Much of the SWE effort is funneled into short term monetary gains, while fundamental technologies like OSs, common file formats, browsers, firmware and all that stuff are rarely reconsidered and just patched over continuously. The foundations are _inherently_ insecure and the mindset is wrong. Familiarity and productivity often take precedence over simplicity and robustness. Everyone thinks they are entitled to telemetry and other user data, and most software is closed source. How do you reason about black boxes? How do you trust them?
Also who will do the licensing? The people who tend to push for these things while shitting on other software practitioners don't inspire much confidence IMO.
I have so many questions...
brightball | 3 years ago
At the same time, there’s also no other industry that operates at a fraction of the speed of this one.
chrisseaton | 3 years ago
I thought there was no longer any exam for people to take to become professionally licensed in software engineering in the US?
spiffytech | 3 years ago
https://www.hillelwayne.com/post/are-we-really-engineers/
RcouF1uZ4gsC | 3 years ago
Before contributing you would need to upload your engineering certification credentials.
Good luck getting certified if you are under 18 or do not have a college degree.
Also, because of liability issues, employers may prevent you from contributing to open source.
throwaway787544 | 3 years ago
https://en.m.wikipedia.org/wiki/Screw_thread#History_of_stan...
Get the government to require a standard for all their software and anyone who wants a government contract will have to conform.
seclorum_wien | 3 years ago
Software is language. It is as slippery as all other human languages when you try to pin it down to absolutes.
flappyeagle | 3 years ago
Even security devices like locks and fences are trivially defeated with the right know-how and tools.
musicale | 3 years ago
Sounds like a good idea, right?
> Poul-Henning Kamp spent more than a decade as one of the primary developers of the FreeBSD operating system
How does professional liability work when FreeBSD security flaws result in power plants shutting down or gas lines exploding?