top | item 15440848

Is Uncle Bob serious?

155 points | rbanffy | 8 years ago | dev.to

171 comments

[+] magice|8 years ago|reply
The current state of the software-safety discussion resembles the state of the medical-safety discussion two or three decades ago (yeah, software is really, really behind the times).

Back then, too, thinking on medical safety was divided into two schools: the professionalism-oriented and the process-oriented. The former school argues more or less what Uncle Bob argues: blame the damned * who made the mistakes; be more careful, damn it.

But of course, that stupidity fell out of favor. After all, when mistakes kill, people get serious about it. After a while, serious people realized that blaming and clamoring for care backfires big time. That's when they applied, you know, science and statistics to safety.

So, tools were upgraded: better color-coded medicine boxes, for example, or checklists in surgery. But it's more than that. They figured out which trainings and processes have high impact and applied them rigorously. Nurses are taught (I am not kidding you) how to question doctors when weird things happen; identity verification (ever notice why nurses ask your birthday like a thousand times a day?) got extremely serious; etc.

My take: give it a few more years, and software, too, will probably follow the same path. We need more data, though.

[+] maxxxxx|8 years ago|reply
I don't think you can compare software to other disciplines like medicine, air transportation, or architecture. Those areas are well understood, pretty mature, and move pretty slowly. If we ran air transportation like software, somebody would already have self-flying airplanes in service. They would crash from time to time, though. I personally like the imperfection of software development and the freedom to imagine new things. If we want to be more mature, we also have to accept much slower development cycles, and innovation will slow down.
[+] sheepmullet|8 years ago|reply
> Back then, too, the thoughts on medical safety also were divided into 2 schools: the professionalism and the process oriented

The key difference is in the medical world safety has been a primary concern from day one.

I.e. There has always been a high level of professionalism.

That is not true in the software world.

Imagine a doctor saying, "It's 5 pm on a Friday and I'm meeting a friend in an hour, so I'll just do a rush job of this surgery and it will probably work out fine."

I've seen devs happily check in shoddy work just to be finished hundreds of times in my career.

[+] blub|8 years ago|reply
Aviation safety is also a domain to look up to.
[+] naasking|8 years ago|reply
> My take: give it a few more years, and software, too, probably will follow the same path.

I doubt it. The stakes are much lower than people's lives and health.

[+] walterstucco|8 years ago|reply
So basically work is safer when done by robots

Or if you spend a lot of resources in training people to robotize them

I prefer real robots

[+] planetjones|8 years ago|reply
This article by Uncle Bob has generated a lot of debate. I read it for the first time today and, I have to say, I am not impressed.

> too many programmers think that schedule pressure makes it OK to do a half-assed job

This is solely blaming the programmer. I have been on software projects where someone is basically told to deliver what they can by a given date and then move on to something else, leaving someone else to fix the bugs. This despite the programmer working very unsociable hours and trying their absolute hardest. What does Bob want the developer to do? Refuse and be sacked, when they have a family to feed? Yes, there are lots of lazy software developers in our industry who shouldn't be employed, but this kind of generalisation by Uncle Bob isn't helpful.

> Better REPLs are not the answer. Model Driven Engineering is not the answer. Tools and platforms are not the answer. Better languages are not the answer

This is just nonsense. They are not the only answer, but of course they are part of the answer.

> I stood before a sea of programmers a few days ago. I asked them the question I always ask: “How many of you write unit tests on a regular basis?” Not one in twenty raised their hands.

I would like him to provide evidence of who these people were and whether they have any relevance to the article, since the article's gist is about serious, critical software. Are teams delivering such software really not writing automated tests? I really doubt it.

Automated testing is great. Test Driven Development is a useful technique which I have in my arsenal. But Uncle Bob's obsessive focus on this can cause issues too. I once worked with a developer who had to write everything using TDD and apply EVERYTHING based on Bob's videos. It was too much. APIs weren't designed, data models weren't sensible and there was too much code (methods like isOpen, isNotOpen everywhere). It was clean code taken to the extreme. And not a good extreme.

I am not defending software developers as a whole. The levels of professionalism shown by some in our industry is at times scary. But a silly article like that which Uncle Bob originally wrote isn't helpful.

[+] EdSharkey|8 years ago|reply
As the only experts on the code, programmers need to learn to say 'no' when appropriate. When they stay silent like code monkeys, they deserve all the blame Uncle Bob and I can heap on them.
[+] SideburnsOfDoom|8 years ago|reply
I am in favour of high standards in software, but Mr Martin always seems to come up short of specific measures to get there. He does not offer much beyond the "man up and show some personal responsibility" school of discipline.

This doesn't seem to me to be the right way to get there. I would prefer approaches that are driven by data, experiment, and outcomes, i.e. what works. I would expect that ideas such as "blameless postmortems" after failures (which are inevitable), and encouraging openness and team safety, would have a better outcome than this clenched "just don't fuck up" stuff.

https://www.inc.com/leigh-buchanan/most-productive-teams-at-...

https://codeascraft.com/2012/05/22/blameless-postmortems/

[+] jwdunne|8 years ago|reply
My problem with Martin's article is that his solution isn't a catch-all. In fact, testing is aided by the very examples he dismisses.

The things he moans about actually reduce the runtime state set of a piece of code by moving checks to compile time. That reduces the amount you need to test.

With testing, you write code that ensures a certain set of states and, importantly, ensures against the complement of that set.

If your compiler can reduce the set of potential states, that means a few things:

1. Less testing code required. This is better because testing code is still code.

2. More errors caught, since it's easier to cover a reduced set of states.

3. You can focus your discipline on higher level problems. Discipline is great. It's even better when not focused on crap a compiler can pick up.

Tools are not the 'answer'. Tools are tools. If a tool helps solve problems, it should be used.
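To make the state-set point concrete, here is a minimal Python sketch (the names and domain are hypothetical): encoding states as an `Enum` with an explicit transition table makes the set of legal states closed and small, so both a type checker and your tests have far less ground to cover than with free-form strings and booleans.

```python
from enum import Enum


class OrderState(Enum):
    DRAFT = "draft"
    SUBMITTED = "submitted"
    SHIPPED = "shipped"


# Legal transitions are enumerated once. Anything outside this table is
# unrepresentable, so tests only need to cover this small, closed set.
_TRANSITIONS = {
    OrderState.DRAFT: {OrderState.SUBMITTED},
    OrderState.SUBMITTED: {OrderState.SHIPPED},
    OrderState.SHIPPED: set(),
}


def transition(current: OrderState, target: OrderState) -> OrderState:
    """Move to `target` if the transition table allows it, else fail loudly."""
    if target not in _TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.name} -> {target.name}")
    return target
```

With a checker like mypy, passing anything other than an `OrderState` is rejected before the code ever runs; the remaining runtime check guards only the transition logic itself.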

[+] tptacek|8 years ago|reply
Once again, because I never really get a clear answer for this question: why do we care? How is this not just another instance of "random dude wrong about something"? There are tens of millions of those.
[+] blub|8 years ago|reply
This person wrote a series of books about clean code which are sometimes recommended to junior programmers. Some people have a high opinion of him.

It's a good thing that his claims and experience are being questioned.

[+] tpush|8 years ago|reply
Because he is not random but influential, simple as that.
[+] pyrale|8 years ago|reply
Beyond criticizing Uncle Bob, this article shows what's being done to actually address the problem and puts forward lots of interesting material to read.
[+] maaaats|8 years ago|reply
Couldn't the same be said of yours or any other's comments here on HN?
[+] tempodox|8 years ago|reply
Anybody who calls himself Uncle <whatever> (outside of family context) can't expect to be taken seriously in my book. Maybe it's just me, but that seems like an appeal to authority right there in the name.
[+] latch|8 years ago|reply
Both better tools and better software discipline is needed.

Better tools could help a lot. It's hard to see how anyone disagrees with that. But, on the flip side, at my most pessimistic, I find that a massive percentage (say, 50%) of developers struggle to do the most basic things correctly. I can't fathom the types or scope of tools needed to solve a problem of that size (real AI that puts us all out of a job??).

So, while we wait for a thousand silver bullets, I agree with Uncle Bob: more automated tests, more pair programming, more code reviews. Whatever new tool they come up with, without basic competence, discipline and vigilance, things aren't going to change.

[+] le-mark|8 years ago|reply
This is what I don't understand about AI proponents and/or fear mongers. Human-level AI writing code? Wouldn't that just produce a lot of the same human-level bugs? Then people say a super AI will evolve more in a day than we have in a million years. Sure, maybe; it's not even known whether that's possible at this point.
[+] shadowmint|8 years ago|reply
We already have tools that let us do the 'right thing' if we spend enough time and money on it.

Another relevant quote from that paper, with regard to formal verification methods:

> Any new software technology in this field must address both the cost and time issues. The challenge here is daunting because a reduction of a few percent is not going to make much of an impact.

> Something like an order of magnitude is required.

I think that's pretty much spot on.

I think that's really the issue I see; we don't need more arbitrary tools that just add yet more complexity to already complex systems.

[+] humanrebar|8 years ago|reply
> ...if we spend enough time and money on it.

This is a much deeper thought than it's getting credit for.

The tools and discipline are all there. People just don't want to spend the money on them. Developers and their leaders don't get raises and promotions for meeting quality metrics. Product owners don't listen to people who produce exceptionally high-quality products (they listen to people who say 'yes' to unrealistic plans a lot, regardless of quality).

Why don't people want to spend the money? Sometimes they literally can't afford it. The suits are still iterating on their business plans and haven't factored in the true cost of developing the product they will ship (which, as all software professionals know, is different than the product they think they're making).

Sometimes people can afford the work necessary to produce a quality product but don't. Way too often I see developers give a reasonable estimate that gets stripped down because a (usually arbitrary) budget or deadline has already been set. Likewise, I see estimates get pared down because they seem too big. I also see (probably unintentional) budget shopping, where managers ask for estimates from a few places and then (surprise!) decide the smallest estimate is probably the accurate one.

So we can go on and on about tools versus discipline, but it all boils down to incentives and selection bias. We need to figure out how to communicate quality to stakeholders. Or, if that's too technically difficult, we at least need to figure out how to impart reputation to people who can provide subjective (but better) evaluations of quality for experts to rely on.

[+] k__|8 years ago|reply
I talked with a few mechanical engineers, a much older profession than software engineering. They have higher standards and whatnot, but I'm seriously happy that I don't work in that industry. Many I talked to even switched to SE because it's easier money and people in SE are more relaxed.
[+] _pmf_|8 years ago|reply
> They have higher standards and whatnot

Note that "higher standards" in civil engineering usually boils down to a) having standards at all, instead of fuzzy process-management frameworks; b) standards that often amount to "do this task/product with this regulatory-mandated large margin of error"; and c) building according to specification (i.e. having a reliable specification in the first place), which is exactly how reliable software is built (SIL, ASIL, aerospace-level redundancy). There is nothing magical about it, but it needs to be driven by the business and is not something that us lowly developers can just choose to do on a whim (because it increases costs by a factor of 10 to 100).

Embracing agile (as his ThoughtWorks contract requires him to do) while lamenting quality and lack of professionalism, as Mr Martin does, is extremely dishonest.

[+] Silhouette|8 years ago|reply
The points made in the article linked here are interesting, but I now recommend that anyone interested in improving the software industry simply steer clear of Bob Martin. Sometimes he says interesting or relevant things, but most of the time he just seems to be a professional troll these days. As far as I can tell, he has a high profile but little relevant experience or qualification to make all these grand pronouncements, or to justify insulting those of us who think other tools, techniques, or processes might be better. For anyone interested in the field of high-reliability software, time would be better spent studying books and papers written by people who have real experience and demonstrable results to support their arguments.
[+] jefe_|8 years ago|reply
It is very important to have QA people the engineers respect. What makes a QA person respectable? Strong understanding of the business logic. Clear communication skills. Strong awareness that they cannot build the system, but a strong understanding of how systems operate. They work as hard testing as engineers work building, and deliver their findings efficiently. They can accurately evaluate the magnitude of an issue. The analogy I think of is like an offensive line to a quarterback. The engineers get the headlines, but without a solid QA team that doesn't happen. If QA is doing a good job, make sure everybody knows it.
[+] EdSharkey|8 years ago|reply
Or, fire your QA team, who are insanely expensive manual test script running human robots.

Professional software developers have control of their code and don't shift the blame for their quality issues onto other teams.

[+] notacoward|8 years ago|reply
Part of being a professional is listening to the experts and, even more importantly, paying attention to the data. This applies to criticism of the software industry as much as to the industry itself. I think what rubbed a lot of people the wrong way about Uncle Bob's piece is that even as it called for greater professionalism none was evident in the article's own construction, and that seems rather hypocritical.
[+] agentultra|8 years ago|reply
My hunch is that tools like TLA+ and Lean[0] are bringing down the total cost of developing more reliable software.

And developing reliable software shouldn't be relegated to safety-critical applications. There are ways software failures can still cause significant harm even without putting human lives at risk: security errors allow bad actors to steal personally identifying information or funds, or otherwise disrupt services in ways that harm people. This costs people their livelihoods, affects insurance rates for everyone, etc., etc.

I think the software industry needs to be more accountable for the influence we're having over the lives of the public and in order to write more secure, safer systems we need better tools to check our specifications, constrain our implementations, and help us manage complexity.

[0] https://leanprover.github.io/
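For a flavor of what "checking a specification" with such a tool looks like, here is a deliberately trivial Lean 4 example: the property is stated as a theorem, and the compiler rejects the file unless the proof goes through.

```lean
-- A property stated as a theorem: adding zero on the right is the identity.
-- Lean checks the proof at compile time; `rfl` suffices because the equality
-- holds by definition of addition on Nat.
theorem add_zero_right (n : Nat) : n + 0 = n := rfl
```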

[+] tenpoundhammer|8 years ago|reply
"I know there are tons of programmers churning out low-quality code."

People repeat this kind of sentiment all the time, but is there anything to back it up? I know we've all run into random bits of code we considered "low quality". But that can often be attributed to code that was written a very long time ago, when programming practices were much different. At the time it was written, it was likely high-quality code. It's like saying that a gallbladder removal from 20 years ago was "low quality" because it wasn't done laparoscopically, when that technology was barely used at the time.

I'm starting to think that "tons of low-quality coders" is an industry myth to explain a variety of unrelated phenomena. The modern-day equivalent of the boogeyman. But let me know if I'm wrong.

[+] jacques_chester|8 years ago|reply
It is my continued theory that Bob Martin is a double agent, whose mission is to discredit TDD in the wider software community by using an infuriating, backlash-inducing writing style.
[+] blub|8 years ago|reply
Unfortunately he failed at that and people started to believe him.
[+] marcosdumay|8 years ago|reply
Is there anybody worth paying attention to out there pushing TDD?
[+] gaius|8 years ago|reply
Bob Martin famously tried to write a Sudoku solver using TDD; he gave up after a few blog posts...
[+] tempodox|8 years ago|reply
I largely agree with the article, with one caveat:

> ...we "just" have to specify the states and behaviors that are not safe and prevent the software from getting into those states.

In general, the number of ways things can go south tends to infinity, while the desired outcomes are easier to enumerate. I think it would be safer and more feasible in most cases to identify the states and behaviours that are valid than the other way round.
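A small Python sketch of that allowlist principle (the username rule is a made-up example): define what a valid input *is*, rather than trying to enumerate the open-ended set of invalid ones.

```python
import re

# Allowlist: a valid username is fully described by one pattern.
# The complement (every malformed or malicious variant) is an
# open-ended set we never have to enumerate.
_VALID_USERNAME = re.compile(r"[a-z][a-z0-9_]{2,15}")


def is_valid_username(name: str) -> bool:
    """Accept only strings matching the allowlist pattern in full."""
    return _VALID_USERNAME.fullmatch(name) is not None
```

Anything not explicitly allowed is rejected by construction, which is exactly the "enumerate the valid states" stance rather than the "enumerate the unsafe states" one.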

[+] cryptos|8 years ago|reply
Reading the paragraph about "illegal states" that should be avoided in software, I thought of "design by contract", which never really took off. Microsoft .NET has a really nice implementation called "code contracts", but it doesn't seem to be used that often. However, it would be a simple and powerful way to improve software quality.
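The design-by-contract idea can be sketched in a few lines of Python (a toy illustration of the concept, not the .NET Code Contracts API): a decorator checks a precondition on the arguments and a postcondition on the result.

```python
from functools import wraps


def contract(pre=None, post=None):
    """Minimal design-by-contract decorator: `pre` is checked against the
    call's arguments, `post` against the returned value."""
    def deco(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition failed in {fn.__name__}"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition failed in {fn.__name__}"
            return result
        return wrapper
    return deco


@contract(pre=lambda balance, amount: 0 < amount <= balance,
          post=lambda new_balance: new_balance >= 0)
def withdraw(balance: float, amount: float) -> float:
    """Illegal calls (overdrafts, non-positive amounts) fail at the boundary."""
    return balance - amount
```

The contracts document the legal states right next to the code and turn illegal calls into immediate, local failures instead of corrupted state downstream.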
[+] DanielBMarkham|8 years ago|reply
Background. When I first heard of Uncle Bob, I watched a couple of conference videos.

Man, the guy pissed me off. He seemed quite strident in this "software craftsmanship" schtick.

As it turns out, I had written an article on all the horrible ways companies implement Agile. It's gotten to the point that I cringe whenever I hear the word "Agile", and I consider myself an Agile Technical Coach. Orgs just really suck at trying to do better. Most of the time it ends up in a micromanagement death march.

Bob read this. It pissed him off.

So Bob and I met online by pissing one another off. We commented on each other's work, and over the years we've become friends. So I speak as a coder, a consumer, a friend, and a fellow talking head.

Bob means well, but with a large audience people tend to read into his work things that aren't there. This is the HN effect: with a large enough audience nothing you say or write will be immune from misunderstanding. He also tends to overstate his case from time-to-time, like we all do. Hyperbole is a useful rhetorical tool.

I restate his thesis in my own words as such: the number one problem in software today is hidden complexity, mutability, and state. When programmers enter a new domain, they naturally tend to "help" follow-on programmers by creating abstractions on top of whatever complexity they find. This increases, rather than decreases, the hidden-state-mutability-complexity problem. It gives the appearance of being useful but in fact can do more harm than good. Focusing on the tools instead of doing a good job gives this wonderful rosy picture of progress when in fact you're headed in the other direction.

It's not that tools are bad. It's that our natural inclination to add abstractions easily leads to code where it's more important than ever to thoroughly test exactly what the code does. If we focused on the testing part first, the tools part wouldn't be an issue. But instead we focus on tools and schedule pressure, and this leads to total crap. We buy the tools/framework because we believe schedule pressure forces us to work "at a higher level", but instead that same pressure, combined with the cognitive difficulties of adding yet more layers to the problem, leads to a worse state of affairs than if we had simply skipped the tools to begin with.

I'll never forget the shocking wakeup I got as a developer when I realized I am a market for people selling me stuff, and these people do not have the interests of my clients in mind. They only have to sell me, not provide value.

And yes, you can argue that there's a requirements problem, but whenever something goes wrong, isn't there always a requirements problem? Nobody ever asks for a system that's broken, so whenever a system is broken, you can say "But you never told me not to do X" and be correct. The fact that requirements are a problem is tautological.

> I stood before a sea of programmers a few days ago. I asked them the question I always ask: “How many of you write unit tests on a regular basis?” Not one in twenty raised their hands.

Bob's right. When you see results like this, stay away from as many tools as you can at all costs. You don't give hand grenades to infants, and programmers who aren't testing don't need faster ways to complexify the system in non-intuitive ways. The mistake this author makes is not realizing the reasons unit testing and TDD keep getting more important year-by-year. The mistake Bob makes is not diving down deep enough for some readers. "Craftsmanship" is a fine label, but there's a reason we need this stuff aside from just wanting to be professionals. If more folks understood the practical and pragmatic reasons for the zeal, there'd be less confusion.

[+] Caveman_Coder|8 years ago|reply
What responsibilities does an "Agile Technical Coach" have and what did you do for the business? (I'm curious because in most of the industries I've worked in, the teams have been extremely skeptical of Agile coaches/consultants)
[+] hellofunk|8 years ago|reply
Lots of talk here about better tools and better coding practices and better discipline, but what about better languages? Some languages make it impossible to make the kind of errors that other languages allow. And I think the languages will just keep getting safer and safer.
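As a small illustration of the kind of error a language (or its type checker) can rule out entirely, consider Python's `Optional` under a strict checker such as mypy (the functions here are hypothetical): the signature itself forces callers to handle the missing case before using the value.

```python
from typing import Optional


def find_user(users: dict[int, str], user_id: int) -> Optional[str]:
    """Returns None when absent. Under `mypy --strict`, callers must
    handle the None case before using the result, ruling out a whole
    class of null-dereference errors before the program runs."""
    return users.get(user_id)


def greeting(users: dict[int, str], user_id: int) -> str:
    name = find_user(users, user_id)
    if name is None:  # a strict checker rejects `name.upper()` without this guard
        return "hello, guest"
    return f"hello, {name.upper()}"
```

Languages like Rust, Swift, or Kotlin bake this into the core language; in Python it is opt-in via the checker, but the principle is the same: make the error unrepresentable rather than ask programmers to remember it.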
[+] sgift|8 years ago|reply
I'd sum languages up under "tools" for the purposes of this discussion.