top | item 20818537

No Silver Bullet (1986) [pdf]

110 points | tosh | 6 years ago | worrydream.com | reply

85 comments

[+] aedron|6 years ago|reply
The essay is remarkably prescient.

Brooks dared to predict that thirty years later, software would still be created by programmers sitting in front of editors, painstakingly typing up code branches for all of the scenarios a given program is supposed to handle. And their output would still mostly suck, because of the infinity of possible states, mostly due to variables and synchronicity. Complexity that cannot be abstracted away.

Given how starry-eyed we usually are about even the near future, that is bold.

The web, language advances, tooling - these are quality-of-life improvements. They don't 'solve' the inherent complexity of software development, the way it was promised by CASE tools (Brooks' likely target with his essay) and countless other 'business oriented' approaches. In fact those approaches have failed so many times that they have been largely abandoned, so in recent times, Brooks' essay might seem superfluous.

The one advance that might finally challenge the 'no silver bullet' rule is machine learning. Not yet, given that it is still an esoteric tool for a specialized class of problems, as part of traditional software systems. But with increasing computing power, I can imagine a future where machine learning can be set to work on broader tasks and start to look like magic self-directed software development.

[+] coldtea|6 years ago|reply
>The one advance that might finally challenge the 'no silver bullet' rule is machine learning. Not yet, given that it is still an esoteric tool for a specialized class of problems, as part of traditional software systems. But with increasing computing power, I can imagine a future where machine learning can be set to work on broader tasks and start to look like magic self-directed software development.

I'd stick with the 'no silver bullet' idea. I don't see machine learning going anywhere, and what passes as machine learning today is just a buzzword.

[+] crdoconnor|6 years ago|reply
>The one advance that might finally challenge the 'no silver bullet' rule is machine learning.

Definitely not. Machine learning is up there with 2012's nosql movement as one of the most overhyped silver bullets.

The improvements to programming over the last 30 years have been incremental more than anything, in spite of the hype cycles of various would-be silver bullets (OOP, functional programming, side-effect-free code, TDD, NoSQL, static types, etc.)

[+] otabdeveloper4|6 years ago|reply
> The one advance that might finally challenge the 'no silver bullet' rule is machine learning.

Machine learning can't even predict a simple binary proportion. No, it's not a silver bullet.

[+] pjmorris|6 years ago|reply
I think fuzzing, like afl, is an example of the kind of machine learning that might be useful to software engineering. If we could just figure out how to fuzz requirements along with input fields, we might get somewhere.
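(For readers unfamiliar with the idea: afl itself is coverage-guided and instrumented, but the core loop it shares with all fuzzers, mutate an input, run the target, record crashes, can be sketched in a few lines. The `parse` target below is a made-up buggy function, purely for illustration.)

```python
import random

def fuzz(target, seed: bytes, iterations: int = 2000):
    """Randomly mutate a seed input and collect inputs that crash the target."""
    rng = random.Random(0)  # fixed seed so runs are reproducible
    crashes = []
    for _ in range(iterations):
        data = bytearray(seed)
        # flip a handful of random bytes (a miniature version of afl's havoc stage)
        for _ in range(rng.randint(1, 4)):
            data[rng.randrange(len(data))] = rng.randrange(256)
        try:
            target(bytes(data))
        except Exception as exc:
            crashes.append((bytes(data), exc))
    return crashes

# hypothetical buggy parser: raises whenever a 0xFF byte sneaks in
def parse(data: bytes) -> int:
    if 0xFF in data:
        raise ValueError("unhandled byte 0xFF")
    return len(data)

found = fuzz(parse, b"hello world")
```

Real fuzzers replace the blind random mutation with feedback (which inputs reached new code paths), which is where the "machine learning" flavor comes in.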
[+] ken|6 years ago|reply
> Brooks dared to predict that thirty years later, software would still be created by programmers sitting in front of editors, painstakingly typing up code branches for all of the scenarios a given program is supposed to handle.

Did I miss this? The thesis says it's about what will happen "within a decade" of 1986 -- "as we look to the horizon of a decade hence". He makes no projections beyond that, that I see, and the timeline seems specifically chosen to suggest that further large improvements are still possible but will take time.

[+] lliamander|6 years ago|reply
Further prescience is in the areas that he predicted would be promising avenues for attacking essential complexity:

1. Build vs. Buy - I would say that open source has generally become the "COTS" solution that Brooks was looking for.

2. Requirements refinement and prototyping - agile development.

3. Incremental development - also agile, as well as various evolutionary architecture approaches.

4. Great designers - the establishment of technical career ladders at many companies.

Again, none of these things are silver bullets, but they have contributed to significant progress in our industry.

[+] breck|6 years ago|reply
> Brooks dared to predict

I love this. It's so rare nowadays to see scientists make bold predictions.

Disclosure: I work on Tree Notation (https://treenotation.org/).

Two years ago, in 2017, I predicted that Tree Notation would be a silver bullet: by 2027 we will have a 10x improvement in reliability, productivity, and simplicity, thanks to the Tree Notation ecosystem.

Two years and thousands of experiments and conversations later, I'm almost positive that will happen.

[+] marktangotango|6 years ago|reply
I was at a Big Dumb Corp in 2005 or so on a client dedicated team. Our group manager decided to have everyone on the team take turns leading the weekly staff meeting. Mostly because he was lazy and the meeting was a particular waste of time. When it was my turn I printed off copies of this essay for everyone, handed them out, spoke a few sentences suggesting everyone read it. Then ended the meeting. That did not go over well at all with my manager.
[+] lliamander|6 years ago|reply
How did it go over with your coworkers?
[+] DonaldFisk|6 years ago|reply
I wrote a short essay on this four years ago, here: http://www.fmjlang.co.uk/blog/NoSilverBullet.html

The gist of it is that although Brooks was correct when he wrote No Silver Bullet, since then there's been an enormous increase in accidental complexity and, if that is recognized and removed, an order of magnitude improvement is now possible.

[+] breck|6 years ago|reply
This is a very good point. The Tree Notation ecosystem I work on (https://treenotation.org) would only have been an incremental improvement 30 years ago. But now 1) cruft has accumulated, and 2) there are new opportunities to exploit Tree Notation thanks to machine learning (for program synthesis, among other things) and visual programming, so the benefits at this time could hit that 10x number.
[+] dang|6 years ago|reply
That is an interesting twist on the argument.
[+] msandford|6 years ago|reply
This kind of thing reminds me of a phrase from cycling: It never gets easier, you just go faster.

I feel like that's absolutely how software is. As we get better frameworks and libraries and everything, the job doesn't get easier. But what happens is that it's now possible to do something with a team of three that might have taken thirty before.

Just look at Python/django or Ruby/Rails vs doing all that yourself with C and cgi-bin. How big would the C project be before it started to become difficult to work on by virtue of its many lines of code?

I'm not saying these new frameworks solve any kind of difficult theoretical problems by the way. And it's still work to make a django or Rails site. But there are many startups that got going in the last decade with just a few people that during the dotcom bubble might have taken 30, 50, even 100 folks to try and turn into reality.
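(To make the comparison concrete: here's a hello-world HTTP handler in nothing but stdlib Python, a rough sketch, not anyone's production setup. The cgi-bin-era C equivalent meant hand-rolling header parsing, buffer management, and process handling before you wrote a line of application logic.)

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        # a complete, correct HTTP response in a few lines
        body = b"Hello, world"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# To run: HTTPServer(("127.0.0.1", 8000), Hello).serve_forever()
```

Django or Rails layer routing, ORMs, templating, and security defaults on top of this, which is exactly the "team of three instead of thirty" effect.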

[+] segmondy|6 years ago|reply
It does get easier too. Better frameworks and libraries have made the job so much easier! The challenge I see is that people don't know how to code. They know the keywords of the language, they know programming constructs, design patterns, algorithms, frameworks, but they don't know how to program. You can know all the ingredients, you might even know the recipe but that doesn't make you a good chef or mean you know how to cook.

Something that I have not seen taught anywhere, not in schools, not at work, not in a thousand online courses, is "HOW TO PROGRAM". This is why things have gotten hard for many people: they don't know how to program. But for those who do know how to program, things have actually gotten much easier.

[+] goatlover|6 years ago|reply
Unless they were using Lisp or Smalltalk.
[+] dkersten|6 years ago|reply
> "There is no single development, in either technology or management technique, which by itself promises even one order-of-magnitude improvement within a decade in productivity, in reliability, in simplicity."

I think it's important to remember that the statement is rather specific:

"no single development" and "which by itself" -- but there could be many developments, which, together, provide the order-of-magnitude improvement.

"order-of-magnitude improvement" -- but there could be smaller improvements.

"within a decade" -- the improvements may take longer.

I'm pointing this out because incremental improvements can and do occur, there's just no "silver bullet" to fast track the process.

[+] marcosdumay|6 years ago|reply
> but there could be many developments, which, together, provide the order-of-magnitude improvement.

He says very clearly, in the remainder of the text, that this is likely.

> the improvements may take longer.

I imagine the point of posting this on HN nowadays is that in three decades there hasn't been any. Something could still appear tomorrow, but I imagine it's very unlikely, for the same reasons given in the article.

[+] mpweiher|6 years ago|reply
In No Silver Bullet Reloaded [1], a 20 year retrospective, Brooks said:

"Of the candidates enumerated in “NSB”, object-oriented programming has made the biggest change, and it is [unlike almost every other proposed solution] a real attack on the inherent complexity itself."

And then goes on to say that the most promising approach remains reuse, particularly of COTS programs.

[1] https://www.researchgate.net/publication/221321794_No_silver...

[+] lliamander|6 years ago|reply
> And then goes on to say that the most promising approach remains reuse, particularly of COTS programs.

Indeed, although I think more beneficial than COTS has been the rise of open source (which could be seen as a variant of COTS). From open source languages with rich standard libraries to application frameworks to whole applications (databases, operating systems, etc).

The benefit from open source hasn't just been the lower TCO of 3rd party software, but also changes in the development process. Perhaps the single biggest process change has been distributed version control. Not only has Git significantly reduced the accidental complexity of collaborating on software projects, it has also become the primary mechanism for distributing open source code.

[+] carapace|6 years ago|reply
Er, I learned Prolog last summer and realized that about half of my professional career was wasted because I didn't learn it sooner. Dunno if Prolog counts as a Silver Bullet but those folks have slain a lot of werewolves.
[+] goto11|6 years ago|reply
The elephant in the room: We have no reliable way of measuring and comparing developer productivity.
[+] yanowitz|6 years ago|reply
A great essay. I suggest Out of the Tarpit as a later exploration (20 years after No Silver Bullet) with fantastic and challenging analysis. If reading the whole paper is too daunting, check out the summary (and subscribe to his regular email summaries of interesting papers) at: https://blog.acolyer.org/2015/03/20/out-of-the-tar-pit/
[+] pron|6 years ago|reply
The Tarpit paper, if taken as a response to Brooks, suffers from a similar problem to the one I discussed here: https://news.ycombinator.com/item?id=20829129

It tries to build a theory in an Aristotelian manner, i.e. not based on careful observation but mostly on rationalization (and maybe very partial, biased observations). The problem with rationalizations is that they can often be made to support any claim when the empirical picture isn't clear. An additional problem in this particular case is that when No Silver Bullet was published, the same kind of people (PL enthusiasts) made roughly the same arguments, but their predictions proved wrong, whereas Brooks's proved right. It's not the end of the story, but it does mean that their theory needs, at the very least, to be revised.

[+] breck|6 years ago|reply
If you liked this, years ago Fred Brooks recommended these books to me:

- DeMarco & Lister Peopleware

- 2007. Software engineering: Barry Boehm's lifetime contributions to software development, management and research. Ed. by Richard Selby.

- Hoffman, Daniel M.; Weiss David M. (Eds.): Software Fundamentals – Collected Papers by David L. Parnas, 2001, Addison-Wesley, ISBN 0-201-70369-6.

- And his: The Design of Design. Start with Part II.

[+] azeirah|6 years ago|reply
If the conceptual construct is made of concepts, and the power of high-level languages comes from being able to write software in concepts similar to the construct's concepts...

He states that the accidental complexity comes from mapping the conceptual construct to a real implementation.

But does that mean he's saying we already know the essential complexity of a given problem? That doesn't seem to be the case to me. The way we express the conceptual construct can also be improved, no?

[+] azeirah|6 years ago|reply
Oh, I hadn't finished reading the article when I posted this comment.

Addressing essential complexity is discussed in the last part of the article ;x

[+] i_s|6 years ago|reply
> There is no single development, in either technology or management technique, which by itself promises even one order-of-magnitude improvement within a decade in productivity, in reliability, in simplicity.

Let's say it's true. Why should we care? Why set the bar at 10x, downplaying smaller (1.5x, 2x, 3x, etc) improvements?

Person 1: Hey, check out this tool that makes creating small web sites 3 times faster.

Person 2: Hah only 3 times faster? Who cares?

[+] marcosdumay|6 years ago|reply
Because if there were a few techniques that each improved our efficiency 10 times, we should make it our top priority to find and adopt them; even a couple of those would be a gamechanger.

But 1.1x to 3x? There are a huge number of those. People do say "Only 3 times faster? Why care?" all the time, and go focus on some other development.

[+] ryanmarsh|6 years ago|reply
I love this paper and have used it repeatedly with clients over the years. I especially love the explanation of essential vs. accidental complexity (difficulties). A proto-thesis on yak shaving, if you will.
[+] icrbow|6 years ago|reply
[+] pron|6 years ago|reply
The funny thing is that that post doesn't address any of the theoretical arguments made by Brooks. While certainly less rigorous than physics, his argument is analogous to a discussion of the speed-of-light limit, and that "Yes Silver Bullet's" argument is analogous to, "but what if we use a different kind of fuel in our rockets?" and then claiming that that fuel is the silver bullet with neither empirical nor theoretical evidence to support that claim.

Indeed, quite a few PL enthusiasts called Brooks's predictions overly pessimistic based on similar non-arguments back in the '80s (he lists some of them in his followup, No Silver Bullet, Refired), but reality proved his predictions to be overly optimistic. So he was right and they were wrong. The arguments in Yes Silver Bullet had already been put to the test and failed. Is that reality definitive proof that there is little accidental complexity left? Of course not, but it does raise the bar for those who claim there's a lot of it left, certainly well beyond simply asserting that that is the case or considering it a reasonable working hypothesis.

[+] olau|6 years ago|reply
From that comment I was expecting to find an essay claiming to have the silver bullets because hand-waving. I was not disappointed. It even listed functional programming.

Yes, the tech landscape of today is vastly different from 30 years ago.

But to me the magical silver bullet is really an observation of recurring bullshit - a competent programmer would never realize the gains claimed.

[+] MaxBarraclough|6 years ago|reply
> Since Fred Brooks published his essay, I believe that we, contrary to his prediction, have witnessed several silver bullets.

Isn't the whole point of a 'silver bullet' that you only need one?