
What can be learned from studying long gone development practices?

159 points| Hell_World | 4 years ago |shape-of-code.coding-guidelines.com | reply

138 comments

[+] johnaspden|4 years ago|reply
I'm not sure you can call structured programming a fad!

An awful lot of languages (I'd say all commonly used ones) use if/then/else, do..while, for..next, and so on, but I can't remember the last time I saw a program with a complicated control flow done by gotos.

Such things were common in the 1970s, in fact the first professional program I ever read did it that way, and it took me ages to work out what was going on.

One particularly confusing technique was to goto a computed expression. Leads to all sorts of interesting bugs.

The whole point of structured programming was "Don't do that! You can do everything you want to do with a small set of restricted control structures, and the control flow is much easier to read".

I'd say that idea won so hard that we don't even notice it.
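The contrast described here can be sketched in Python (function names are illustrative, not from the comment). A goto-less language can only emulate the old style with an explicit label-dispatch loop, which makes the difference visible:

```python
def countdown_goto_style(n):
    """Emulates 1970s goto-based flow: an explicit 'program counter'
    jumping between labels, including a computed jump."""
    out = []
    label = 'test'
    while label != 'end':
        if label == 'test':
            # computed 'goto': the next label is derived from an expression
            label = 'body' if n > 0 else 'end'
        elif label == 'body':
            out.append(n)
            n -= 1
            label = 'test'
    return out

def countdown_structured(n):
    """The same logic with a single restricted control structure."""
    out = []
    while n > 0:
        out.append(n)
        n -= 1
    return out
```

Both produce the same result, but in the first version the control flow has to be reconstructed by tracing labels, which is exactly the reading burden structured programming removed.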

[+] 1vuio0pswjnm7|4 years ago|reply
"... I can't remember the last time I saw a program with a complicated control flow done by gotos."

Perhaps not "complicated" enough, but here's a fizzbuzz in Spitbol (a fast SNOBOL implementation). This is an unstructured, goto-based language for non-numeric computation developed at Bell Labs. I'd love to see a faster version, in the same number of characters, in some popular "structured" scripting language. I find I can write scripts in more aesthetically pleasing ways than with "structured" languages. But of course aesthetics is subjective: what looks good to you might not look good to me, and vice versa.

   ;var a = 0
   ;start a = a + 1
   ;break lt(a,101) :f(end)
   ;x01  y = remdr(a,3)
   ;x02  z = remdr(a,5)
   ;x03  x = y + z 
   ;x01a eq(x,0) :s(x06)
   ;x02a eq(y,0) :s(x04)
   ;x03a eq(z,0) :s(x05)f(x07)
   ;x04  output = 'fizz' :(start)
   ;x05  output = 'buzz' :(start)
   ;x06  output = 'fizzbuzz' :(start)
   ;x07  output = a :(start)
   ;end
Perhaps the unintended benefit of gotos is that it becomes foolish to try to construct overly complex control flow, whereas structured programming seems to encourage control-flow complexity. The use of the phrase "complicated control flow" in the parent comment is a great example: it suggests that, for the author, structured programming allows and arguably therefore encourages such complexity. But what benefit is served by creating "complicated control flow"? Why not aim for simpler control flow?
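For comparison, a sketch of the same logic in Python using only structured control flow (no claim about speed or character count, just the structured counterpart):

```python
def fizzbuzz(n=100):
    # Same logic as the Spitbol version, using if/elif instead of gotos.
    out = []
    for a in range(1, n + 1):
        if a % 15 == 0:
            out.append('fizzbuzz')
        elif a % 3 == 0:
            out.append('fizz')
        elif a % 5 == 0:
            out.append('buzz')
        else:
            out.append(str(a))
    return out
```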
[+] TeMPOraL|4 years ago|reply
> One particularly confusing technique was to goto a computed expression. Leads to all sorts of interesting bugs.

Huh. I suppose the modern-day equivalent is commonly done with higher-order functions - in particular, callbacks, CPS and returning functions. I've worked with people who find this confusing (particularly the last one - functions returning functions).
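A minimal Python sketch of the two patterns mentioned (names are illustrative): a function returning a function, and continuation-passing style, where the "return" is a callback.

```python
def make_adder(k):
    """Returns a new function -- the closure captures k."""
    def add(x):
        return x + k
    return add

def apply_cps(x, y, then):
    """Continuation-passing style: instead of returning x + y,
    pass it to the continuation 'then'."""
    then(x + y)

add5 = make_adder(5)  # add5 is itself a function
```

The confusion the comment describes usually comes from `make_adder(5)(3)`: the first call produces a function, and only the second call produces a number.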

[+] lmm|4 years ago|reply
State machines and actors are occasionally cited as good ideas here, and they seem like much the same kind of unstructured programming IME.
[+] kevin_thibedeau|4 years ago|reply
> I can't remember the last time I saw a program with a complicated control flow done by gotos.

I've run across this in C demo code for a chip designed within the last 10 years. It was all goto soup in a god function.

[+] jay_kyburz|4 years ago|reply
I've been learning Lua, and its loops have a break but no continue. You can use a goto to jump to the end of the loop body instead. I tried it but didn't like it. Might just be prejudice. :)
[+] walshemj|4 years ago|reply
Originally, Fortran only had arithmetic IFs.

And I agree that by the '70s, spaghetti code with gotos was on the way out - though some early GWBASIC code had some gnarly practices.

[+] tsss|4 years ago|reply
There are things beyond for-loops and gotos, namely recursion schemes. It's been a long, long time since I last wrote a for-loop, and I don't miss them.
[+] swiley|4 years ago|reply
Well puppet doesn't have loops. I hope that's not because the "fad" is ending.
[+] anyfoo|4 years ago|reply
The article says:

> Today, structured programming appears remarkably simplistic, great for writing tiny programs (it has an academic pedigree), but not for anything larger than a thousand lines.

Curious. I rather think that structured programming became so ultra-pervasive in any high level programming language younger than 50 years old or so[1], that we've lost the extra name for it. Practically every programming language that isn't either pure functional (rarer than it seems), low level (i.e. assembly), or very very domain specific (e.g. SQL[2]) is a structural programming language.

Examples for "structural programming languages" are C, C++, C#, Java, JavaScript, python, Go, Swift, Scala, PHP, perl, Rust, D, ... You get the idea. If a general purpose language does not follow it, it's something notable, like Haskell.

So rather, I think this is an example of something that was so utterly successful that it became absorbed into the general fabric of mainstream programming. It's true that we don't draw as many flowcharts as we used to, because we got more comfortable with everyday programming, but they are still how we think about code a lot, and for complex processes that we want to visualize we still do draw them.

[1] Not a scientific estimate. Substitute "not very very old" if you like.

[2] Funnily that stands for "Structured Query Language", but not necessarily for significant reason: https://en.wikipedia.org/wiki/SQL#History

[+] mamcx|4 years ago|reply
"structured programming" is not the use of flow-control but a way to design programs.

Even if most language have structured flow-control, a lot of people NOT do structured programming.

This is how I remember it back in the day and how I do it (curiously, much better on Rust, where it match better how was done on Pascal! ie: Not OOP, big on structs, functions, enums) and is a mix of:

- Define the major components of the app and put it on "modules" (aka their own file or folder if truly large)

- Define the major structures according to the domain (aka: make tables in SQL). This is what POCOs or plain Rust structs are used for today.

- Make procedures that operate on the above with a clear in/out discipline. Also, document that (this one, I always forget!)

So, it is not that far from functional, immutable programming, except that:

- A lot was mutable and passed by reference. How disciplined you were in defining the flow was a big part of what made this tractable or not.

- You didn't have generics or classes, so you stuck to plain data, plain vectors or lists, and duplicated stuff (algorithms) here and there (no way to abstract "map/filter/fold" that I remember).

- You had to be disciplined with the naming of "modules", "POCOs", and functions. Names were also limited in length (like file names only being 8 chars), causing the rise of Hungarian notation (which was badly misunderstood/misused!).

- All your pipelines were eager (no chance of iterators and such), so you repeated "loops" everywhere.

- Everything was more self-contained, which is great... as long as you don't forget the discipline.

"The discipline" was the key to making this actually "structured programming".

The MAJOR point is how you STRUCTURED the program, not the use of structured control flow!

[+] dhosek|4 years ago|reply
TeX was Knuth's first “real” program in a few years. His comments on structured programming are a bit eyebrow-raising. He had high praise for it and said that it allowed him to write the whole program without having to test small parts of it (the book with the exact quote is upstairs and I'm too lazy to go get it).

That said, I'm pretty sure that TeX is more than a thousand lines of code.

[+] timoth3y|4 years ago|reply
The real value of studying outdated development methodologies is that it teaches you how to think about new problems. It will almost never give you a plug-and-play answer for a problem you are facing, but it will give you a new and useful way of looking at things.

Brooks wrote The Mythical Man Month in the 1970s about his experience in the 1960s and it is still extremely relevant today.

When I was starting my software development career in the mid 1990s and trying to understand how to manage the process, one of the things I did was write to NASA and request a copy of their Manager's Handbook on Software Development.

This was not because I wanted to run things like NASA. That would have been horribly inappropriate for a startup dev team. However, I wanted to understand an extreme: a process where spec writing, testing, and on-time delivery were prioritized.

I never used anything directly from that NASA handbook, but I learned a lot.

Almost 30 years later, I still have that book. It's one of my little treasures.

[+] yourapostasy|4 years ago|reply
For those wondering, the original handbook [1] is available online today. There is also an interesting paper on the process-improvement lessons learned applying much of what was expressed in the handbook [2]. I believe the paper is of more immediate TL;DR use today for those who do not have the time to digest the handbook and internalize its lessons.

[1] https://everythingcomputerscience.com/books/nasa-manage.pdf

[2] http://www.cs.umd.edu/projects/SoftEng/ESEG/papers/83.88.pdf

[+] urthor|4 years ago|reply
The analogy of programmers as general surgeons is so timeless it's downright creepy.

Nothing has changed in Brooks's book because it's a book about human beings doing creative work, and humans haven't changed at all.

[+] kqr|4 years ago|reply
I disagree strongly with the author's assessment of the state of software development a few decades ago.

By the late 1960s, we had realised software development was hard. We put some good brains on the problem. Discussions between leading experts brought up several very important points that we struggle with to this day:

- Naming things,

- Low coupling and high cohesion,

- Communication between developers,

- Communication with customers,

- Evolving prototypes,

- Avoiding the planning fallacy,

- Estimation,

- Support and understanding from upper management,

- and much, much more.

These are problems we struggle with today, but they're also the problems identified by and worked on by the experts of the late '60s into the mid-'70s. We can learn a lot from what they discovered. (I sure have -- and I keep learning more!)

In fact, if I'm being a bit uncharitable, only three things truly seem to have changed since the late '60s:

- We have faster computers.

- We have virtual computers.

- We have higher level languages.

- We have version control.

Other than those four things, all advances in how to build software I've seen the last few decades are, in some sense, a rehash of what they found out early on.

[+] b0afc375b5|4 years ago|reply
> only three things truly seem to have changed since the late '60s:

> - We have faster computers.

> - We have virtual computers.

> - We have higher level languages.

> - We have version control.

> Other than those four things

I see off-by-one errors haven't changed since the '60s.

[+] urthor|4 years ago|reply
Honestly there is a bit more than that.

Mass development with large, open-ended access to code is actually a surprisingly recent idea. Plus, of course, Stack Overflow has significantly changed the approach.

Concurrent programming is also significantly different from its 1960s iteration. COBOL and Fortran did not even attempt such a thing; the machinery for handling all those asyncs was not around.

Many, many other architectural details have also been implemented that were specifically designed not to be exposed, to give classically trained programmers the illusion that things under the hood have not changed. However, they have, significantly. Branch prediction leaps out at me, but also many other aspects of evaluation, memory, and processor design.

There is more than you describe that has changed.

If you would say that compilers and parsers have not changed, that would be extremely accurate. But there's a bit more to the world on either side of the compiler.

[+] fulafel|4 years ago|reply
Virtualization came around in 1972, so it was not far away either!
[+] gwbas1c|4 years ago|reply
> I think the best management technique for successfully developing a software system in the 1970s and 1980s (and perhaps in the following decades), is based on being lucky enough to have a few very capable people, and then providing them with what is needed to get the job done while maintaining the fiction to upper management that the agreed bureaucratic plan is being followed.

Surprisingly, very little has changed.

[+] googamooga|4 years ago|reply
This management practice is called “programmers' anarchy”. It worked very well in my professional life too.
[+] charlysl|4 years ago|reply
I am a big fan of Jackson Structured Programming (the other JSP), from the '70s, and have used it many times. When it is the right match for a given problem, the design and implementation will work the first time, every time.

And that problem is where the input can be modelled as a stream of structured events.

This may sound too abstract, but it includes, for instance, pretty much any file. The value lies more in that it makes it trivial to design a file format that is easy to parse and process.

For me the value lies in that, in the old days, techniques were developed to systematically solve programming problems that will always be relevant - in this case, processing a stream of events.

It is very well explained in the following MIT OCW lecture (it uses Java, but is actually language agnostic):

"Designing stream processors

Stream processing programs; grammars vs. machines; JSP method of program derivation; regular grammars and expressions"

https://ocw.mit.edu/courses/electrical-engineering-and-compu...

To practice it, you could try "Project 1: Multipart data transfer" in the link below:

https://ocw.mit.edu/courses/electrical-engineering-and-compu...
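A rough Python sketch of the JSP idea (names are illustrative, not from the course): in classic control-break processing, the nesting of the loops mirrors the grammar of the input - here "file = group*, group = record+" over records sorted by key.

```python
def group_totals(records):
    """Sum values per key from a stream of (key, value) records
    sorted by key. The loop structure mirrors the input grammar."""
    totals = []
    it = iter(records)
    rec = next(it, None)
    while rec is not None:                        # file = group*
        key, total = rec[0], 0
        while rec is not None and rec[0] == key:  # group = record+
            total += rec[1]
            rec = next(it, None)
        totals.append((key, total))
    return totals
```

Because the program's shape is derived from the data's shape, there is little room for the design to go wrong - which is the "works first time" property the comment describes.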

[+] vincent-manis|4 years ago|reply
"Structured programming" is actually used to refer to two different things: reducing or eliminating the use of gotos, and top-down programming. The former is universally understood to be generally a good thing (though there are still some good uses for gotos); the latter was the fad. The 1970s were the time of HIPO (hierarchy/input/process/output) charts. An early-1980s research project I was acquainted with at the time had an ironclad rule: each procedure must be in its own source file. At least one well-known book in the 1980s gave definitions of software-quality metrics under which object-oriented designs scored badly.

Pretty much every SE methodology has been a fad when it claimed to be the One True Way, and has offered useful ways of solving a certain kind of problem.

[+] flir|4 years ago|reply
Top-down/procedural fits really nicely with the unix command line philosophy. Loose coupling and pipes. You can understand why it was the de facto One True Way before GUIs.
[+] regularfry|4 years ago|reply
> each procedure must be in its own source file.

Funnily enough, this is how GNU libc is organised.

[+] galaxyLogic|4 years ago|reply
It is possible to produce software of great quality if you are willing to spend a lot of money on it. And if you want to build a big system, good quality is a necessity. Else you will discover, like IBM did early on, that fixing one bug can on average create 1.2 new bugs, or something like that.

The issue is that often the quality is good enough to have something that works, and then the buyer/payer does not want to spend more. The problem with that is it makes maintenance and future adaptations expensive, and can turn the system into one where fixing one bug creates 1.2 new bugs.

Therefore I think it is wise to err on the side of caution and spend more on software quality than seems to be needed immediately. It is a bit like the early days of bridge-building, when bridges had to be built stronger than necessary because it was not possible to calculate the safety margins accurately.

[+] azta6521|4 years ago|reply
Yes and no. I too am amazed by a world that longs for so many software developers. I mean, do they all know that all that software has to be maintained? That software is one of the most expensive human artifacts today, and that software rots like hell?
[+] mathattack|4 years ago|reply
The overwhelming attitude in a pre-agile world with a shortage of programmers was:

1) Create a division of labor so that tasks could be pushed to the cheapest person available.

2) Create a waterfall with documented buy in each step of the way.

3) Train an army of smart people for 6-8 weeks and cut them loose in specialized roles. (Either a process person writing requirements, a tech person coding them, a tester confirming the code does what’s needed, or a change person teaching users.)

4) Design and conceptual integrity of the whole were frequently missing.

5) The system buckled when requirements were wrong or changed.

6) Terms like “software factory” were used. When they’re used today, they reflect this mindset.

7) The best programmers were trapped in a system that wouldn’t promote them beyond a certain point because they weren’t generating revenue or managing a large group of people.

8) The sum of this was an industry notorious for overdue and over budget projects.

8.5) People prayed for their overdue projects to get cancelled before their portion would be blamed.

Software development isn’t perfect today but we are in a much better place.

[+] silisili|4 years ago|reply
I think the main problem with us engineers is that we always want to feel smarter than everyone else. Including our predecessors. Which leads to an ever-changing revolving door of doing things just to be different.
[+] aetherspawn|4 years ago|reply
> I think the best management technique for successfully developing a software system in the 1970s and 1980s (and perhaps in the following decades), is based on being lucky enough to have a few very capable people, and then providing them with what is needed to get the job done

> There is one technique for producing a software system that rarely gets mentioned: keep paying for development until something good enough is delivered.

Highlights.

[+] mianos|4 years ago|reply
I honestly think this is still true. Good people getting stuff done is still hidden, now behind some wagile (waterfall agile) in larger companies where the incompetent have been promoted to one step above their ability (the 'Peter Principle'), and the people who do the real work keep doing it.
[+] EbenFlutt|4 years ago|reply
I use Spitbol every day. A problem with it is that it allows godawful code like that example. Here's fizzbuzz in Spitbol:

   loop  a = lt(a,100) a + 1 :f(end)
         output = eq(remdr(a,15),0) 'fizzbuzz' :s(loop)
         output = eq(remdr(a,3),0) 'fizz' :s(loop)
         output = eq(remdr(a,5),0) 'buzz' :s(loop)
         output = a :(loop)
   end

Even shorter:

   loop  a = lt(a,100) a + 1 :f(end)
         output = (eq(remdr(a,15),0) 'fizzbuzz',(eq(remdr(a,3),0) 'fizz', (eq(remdr(a,5),0) 'buzz')),a) :(loop)
   end

[+] DrNuke|4 years ago|reply
The (still) relevant (best) practices from the past are all encapsulated into the standard libraries; that’s how CS pays its dues to the big minds of its past?
[+] canadev|4 years ago|reply
Lots of development practices come back again. E.g. techniques for optimizing code for computers sometimes make a resurgence for mobile phones a decade later.
[+] marcodiego|4 years ago|reply
You'll probably learn more by trying to understand why these practices are long gone.
[+] gwbas1c|4 years ago|reply
Or rediscover techniques forgotten by the latest fad!
[+] forgotmypw17|4 years ago|reply
I learned to write resilient HTML by aiming to support (and thus researching coding for) all browsers 1995 and up.
[+] biglost|4 years ago|reply
Asterisk dialplans :’(
[+] pravus|4 years ago|reply
My last job was working for a voip provider that used Asterisk for its SIP handling. I always referred to dialplan as "assembly for phone calls". Double :'(
[+] ptsneves|4 years ago|reply
Very shallow article. Tl;dr: none.

I would say that the mythical man month concept has stayed and is still talked about today, even though the book itself is also hopelessly outdated, except maybe in IBM-style enterprise software.

[+] mavelikara|4 years ago|reply
> even though the book itself is also hopelessly outdated

What parts did you find dated? I haven't read it in the last few years, but I still get reminded of concepts like Second System Effect every once in a while.

[+] ardit33|4 years ago|reply
It was actually an interesting read.

TLDR: Whatever software-building methodology/process you are using now is probably crap, made up by people who don't really know any better and are just making things up as they go.

E.g., he mentions 'structured programming' was a fad at the time. Now it is Agile and its offshoots.

The best way to deliver software is to have a handful of capable people, pay them well, and get out of their way. The key to getting paid well as a dev is to be on a project that is important - life or death for the company.

I think the most interesting part of the article was the early division of jobs while building software. It didn't make sense, but it was adopted due to the corporate culture of the time.

[+] Lapsa|4 years ago|reply
stopped reading at "when I wrote my book"