A lot of what Blow says is not entirely accurate. For example, he presents a simple picture of declining software quality over time, but anyone who was around at the time knows that both desktop OSes and desktop applications (including web browsers) were certainly much more crashy, and probably more buggy in general, than they are now. Quality has likely started to decline again over the past decade, but it's still not remotely back to where it was. It's hard not to suspect that Blow passes over this because it tends to contradict his "higher-level languages and more infrastructure → declining quality" argument. Section 7.4, "Programming Environments Matter" http://philip.greenspun.com/research/tr1408/lessons-learned.... of Phil Greenspun's (apparently 1993) SITE CONTROLLER dissertation https://dspace.mit.edu/handle/1721.1/7048 makes the same "we don't expect software to work any more" lament that Blow delivers at 22:17 https://youtu.be/ZSRHeXYDLko?t=1337 :
> Another reason the "horde of C hackers" approach has worked remarkably well is that, although the resultant software is rife with bugs, most users have no experience with anything better. When an MBA's Macintosh Quadra crashes due to a C programmer's error in Microsoft Excel, he doesn't say "I remember back in 1978 that my VAX 11/780, with only one tenth the processing power of this machine, had memory protection between processes so that a bug in one program couldn't corrupt the operating system or other applications." Rather, he is likely to say, "Well, it is still easier than using pencil and paper."
but places the blame on a switch to lower-level languages and runtime systems. The improvements on the desktop over roughly the '00s seem attributable (I'm not an expert) to the mainstreaming and continued development of the WinNT and OS X platforms, increasing use of memory-managed languages and/or more recent versions of C++ in applications, and the adoption of online crash-reporting infrastructure (though probably also the increasing use of increasingly effective error-detection tools, which I assume Blow is fine with, as they don't create a runtime dependency). So it certainly seems that Greenspun is more correct than Blow here, which is not to say that adding more layers of infrastructure has always been an unqualified good.
Also, Blow's talk has a very '90s focus on crashes, error messages, and the like, but many of the worst regressions in software over the last 10 or 20 years don't manifest as crashes or other straightforward bugs at all; and when they do manifest as bugs, the bugginess is often intertwined with architectural issues in a way that makes a bug-hunting mentality relatively ineffective. For example, the pinnacle of WYSIWYG rich-text editing was probably around Word 4 for Macintosh, which was a slightly awkward but workable mating of stylesheets to the WYSIWYG UI. Unfortunately it was something of a local optimum: further progress on the problem largely requires serious developer thought and/or further user education. So everyone more or less decided to pretend instead that rich text is a solved problem, and things have largely been gently regressing since then. Which is probably part of the deep background to the Gmail rich-text jank Blow complains about at 23:47 https://youtu.be/ZSRHeXYDLko?t=1427 . “We can not solve our problems with the same level of thinking that created them”, as Lincoln said. ;)
leoc|5 years ago