Stone cathedrals (the ones still standing, anyhow) were built by generations of skilled craftspeople guided by a self-perpetuating hierarchy. The hierarchy had access to, and the craftspeople had the benefit of, ways of extracting wealth from the economy. The availability of unlimited resources allowed them to cover for plenty of mistakes.
Sure, the cathedral structures look elegant and eternal. That's because time favors the stable ones. The other ones fell down long ago. It's the same reason that the orbits of planets are seemingly so well-ordered: everything that wasn't in such an orbit has crashed into something else and disappeared. (Look at the surface of the moon for evidence of that.)
Music and literature work like that. We think the age of Mozart was particularly wonderful, because we have Mozart's music. But there was plenty of other music in that era, most of it bellowed by bad musicians in taverns. That's not so different from today's music scene.
Why shouldn't software work like that? Why isn't Dr. Kamp's lament morally the same as the lament of the guy with the cabin in the woods? I'm talking about the guy who finds himself with more and more neighbors, and grouses that "nobody respects the wilderness any more."
We all respect elegant lasting structures (fork / exec / stdin / stdout is one such structure). A creative field with huge output will generate more elegant lasting structures. We just can't tell which ones will survive from where we sit in the middle of it all.
> We think the age of Mozart was particularly wonderful, because we have Mozart's music. But there was plenty of other music in that era, most of it bellowed by bad musicians in taverns. That's not so different from today's music scene.
That's easily disproven-- even musicologists tasked with cataloging Mozart's output erroneously attributed symphonies and other pieces to him that were written by lesser known composers.
The reason attribution is such a hard problem is that it's non-trivial to separate the stylistic characteristics of late 18th-century courtly and sacred music from the characteristics of Mozart's music we wish to claim were inspired by his genius. (There's a great paper, written I believe by Rifkin, on Josquin scholarship having been circular in just this way.)
What your royal "we" finds "particularly wonderful" about the "age of Mozart" is the style, not the particular composer. And there is plenty of well-written, beautiful symphonic, choral, and chamber music written by all kinds of composers of that period.
The modern world does not strive to have a composer in each and every town who can competently write compelling tonal music with the constraints of late 18th century form, texture, and counterpoint. As misguided as it may be, the longing of your royal "we" for the age of Mozart is a valid longing, regardless of what drinking songs bad musicians were playing in biergartens.
It's possible to go too far with this kind of thinking. Yes, the pyramid still in pristine condition is the best-built one, with all the shoddy pyramids long collapsed. That does encourage a bias. This does not mean any "golden era" thinking is necessarily fallacious, just that we need to watch out for a particular fallacy.
Between about 1967-73 a lot of good rock albums were recorded. A lot of shite too, mostly forgotten about. The filter of time applies equally to the next 7 years, but most people's record collections feature more "golden age albums."
> Music and literature work like that. We think the age of Mozart was particularly wonderful, because we have Mozart's music. But there was plenty of other music in that era, most of it bellowed by bad musicians in taverns. That's not so different from today's music scene.
Yep, it's the good old nostalgia filter in action. You see the good stuff that survives the passage of time, but ignore the crap that gets rapidly forgotten:

http://tvtropes.org/pmwiki/pmwiki.php/Main/NostalgiaFilter
Your argument does make sense, but I'm now wondering if there was a significant number of cathedrals that collapsed? I don't remember ever hearing of such a thing happening without external factors like fire or war.
"Backward compatibility" is holding back creativity a fair bit, I would think. You can build two completely different cathedrals, both surviving the ages. But straying from the fork/exec pattern would make you completely incompatible with the current software landscape, and leave you with no users, and no lasting impact, no matter how "elegant" your solution is.
Having worked with plenty of software developed by the priesthood in the era of the cathedral, might I observe that it was mostly fractally terrible. For every UNIX there were so many terrible pieces of software, and UNIX itself is pretty fractally horrible. (See “The UNIX-Haters Handbook” for thorough criticism of UNIX as it was when Raymond published his book.)
There's a single reason software doesn't work like that:

Bits don't fall apart.

If you can have a perfect copy of something, you can easily keep it in existence forever.

If not, your artifact will sooner or later decay. It prevails only through conscious action, and is thereby subject to the incremental evolution of thought, which copies are not.
fork is the exact example of a structure that is only elegant from the outside but entirely unworkable otherwise. It keeps getting fixed and patched and worked around because at this point in time this enormous blunder has blinded too many.
A big problem with his thesis is that I see just as much ugly crap in proprietary codebases where there is a clear structure of who is responsible for what.
The problem is often that the economics does not favor the perfect. I'm working on a project for a client now where I know there are lots of ugly warts in the backend because we've been building abstractions in parallel with building the rest of the system and have not always had the time to go back and revise the old code.
I would love to go back and sort that out. But to the client it is an extra expense now, both in money and, more importantly, in time spent now, vs. paying for it in increased maintenance costs for people to - hopefully - clean it up over time later.
But there's also often the other cost of learning and determining if a shared piece of code is appropriate.
E.g. there's about half a dozen Ruby gems for handling ANSI escape codes, but for a recent project I ended up writing my own, because none of them seemed to cover what I needed, and it's not clear they should. A lot of code duplication happens because the cost of trying to avoid it often far outweighs the cost of maintaining code you know is doing exactly what you want it to.
I do agree with his hate for the autotools family, though.
>A big problem with his thesis is that I see just as much ugly crap in proprietary codebases
One of the reasons I suggested that folks take a look at previous threads[1] is that phkamp clarifies his definitions of "cathedral != closed source / proprietary / commercial" and "bazaar != open source".
It's not obvious in the acmqueue essay but phkamp wants "cathedral" to be synonymous with "coherent architectural design". As a result, his example of having a bunch of open-source hackers add duplicate dependencies such as a TIFF library when the final build of Firefox doesn't even read TIFF files is not "cathedral" and therefore not "quality".
The acmqueue article is not written very clearly because his "cathedral / bazaar" metaphors make people associate "cathedral" with entities like Microsoft, which he didn't intend, and not with pre-1980s commercial UNIX, which he did intend. Obviously, he reused "cathedral/bazaar" deliberately, since the essay is a refutation of Eric S. Raymond's metaphors, but recycling the same analogy sent multiple people down a road of imprecise arguments. Indeed, one of the comments in the 5-year-old thread is an ex-Apple employee (jballanc) categorizing iOS development as "bazaar" while phkamp classifies it as "cathedral". Talk about torturing metaphors until people talk right past each other!

[1] no need to read all ~300 comments, just Ctrl+F search for "phkamp": https://news.ycombinator.com/item?id=4407188
My observation has been that getting the first 90% of features/quality is pretty cheap, and then, as you try to close in on 100% for whatever your domain is, the cost of improving the software climbs toward an asymptote somewhere short of 100%.
What people will pay for software seems to mostly depend on the cost of failure. 90%-working software is almost always a better value proposition than 100%-working software, unless your software failing means someone dies, or you place a trade that is off by $100 million.
In industries where the costs of failure are low (consumer, most internet, etc) the quality of software that the market will pay for is very low, and so the quality of the software being produced is very low.
Interesting ... a system that starts out as understandable/understood can degrade into one that still works, but is not understood (obviously ... we see it every day).
At that point, work (in the Physics sense) is required to lift it back to the outer energy level -- that of both working and being understood.
In many cases, a reasonable option will be to leave it at the degraded level and continue to utilize it (after all, it still "works").
This all seems obvious to me for the first time, as I look at it from this perspective. But it has never dawned on me before, probably because every inch of my soul longs for the beauty of the highest level of the hierarchy -- "works and is understood".
My experience as well, most closed source code I have to work with has an objectively worse quality than most open source code I choose to work with. Open source projects usually don't need to compromise quality to meet a deadline. This is critical for infrastructure and middleware, less important for user-facing products (thus it makes sense to develop the former in the open, and the latter under the typical restrictions of commercially developed software).
> A big problem with his thesis is that I see just as much ugly crap in proprietary codebases where there is a clear structure of who is responsible for what.
I don't think the contrast he's drawing is bazaar/open source vs. architected/closed source. It's bazaar vs. strongly opinionated and/or clearly-led open source, e.g. Varnish, qmail, Rails, SQLite, etc.
A great recent example is the ugly Rube Goldberg machine under the OSX GUI that led to the recent root bug. Look under the hood and it's really a mess.
Windows is even worse. It's amazing it works at all. Linux is generally cleaner in the auth department but has many other abominations.
The only clean software is either software nobody uses or software whose scope is rigorously limited. The latter is very very hard to defend when people are using it. Everyone wants it to do some new thing or be compatible with some other system.
"there is a clear structure of who is responsible for what"
In a large org it is never clear, who is responsible for what... ;P
And the result is sometimes even more code duplication, workarounds and bad examples of Conway's law in operation, than in free software.
I once worked in a project where I had the time to really make it perfect. Then I realised, when you reach that point, you find it's really not perfect. The world has changed, you've learned more etc. An incomplete solution that is used by many colleagues or customers is worth much more.
> ...there is no escaping that the entire dot-com era was a disaster for IT/CS in general and for software quality and Unix in particular.
The dotcom era was almost 20 years ago. The Netscape IPO (1995) is now closer to the IBM PC (1981) than it is to 2017. Yes, a lot of terrible engineering happened during the dotcom boom, and most of it was proprietary code running on old-school Unix servers.
If anything, Linux was a pretty solid alternative in those days. Sure, the GUI was a bad joke, and driver support was hit or miss. But if you chose your server hardware well, then Linux would give you years of uptime. Windows fell over several times a week. (Microsoft didn't get truly serious about quality and security until sometime around XP SP2, when they finally got sick of malware and buggy drivers.)
So in 2017, what does a "bazaar"-style project look like?
One modern example of a "bazaar" is Rust. (If you dislike Rust, you can insert plenty of other examples here.) All major design decisions are made through RFCs, after community feedback. There's no benevolent dictator, but rather a core team. Several key libraries like "serde" are actually maintained by separate teams. The compiler is released every 6 weeks. And yet, I have absolutely zero complaints about Rust's QA. There are plenty of test suites, and no code gets merged to master until those tests pass. To prevent regressions, somebody downloads all public Rust packages from crates.io, compiles them, runs their test suites, and checks for regressions. And so at work, we can upgrade our Rust compilers without fear.
Now, to be fair, Rust is not for everybody. Not everybody wants "a better C++ minus the foot guns," or enjoys references and generic types. But if you do want those things, Rust is a good example of a community-driven project which delivers reliable, well-designed software.
And I could point out 20 more modern open source projects with solid design and solid QA. We know how to do this now.
And those inexperienced dotcom-era graduates now have houses in the suburbs. They're planning how to pay for their kids' college educations, and more than a few of them have some grey hair.
I'm not sure if I'd consider Rust to be a perfect example of this style of design. Most language-level decisions seemed to be mostly dictated by what anyone was actually able to implement in the compiler, a lot of which was a nigh-incomprehensible mess that only a few people ever touched.
> Yes, a lot of terrible engineering happened during the dotcom boom, and most of it was proprietary code running on old-school Unix servers.
That's not true. Linux was already very popular in 1998-2000, with Red Hat and SuSE distros everywhere. Server software was usually written against the CGI API, in C, Perl, or PHP. There were also Windows NT 4 and 2000 servers running classic ASP, with web software written in VBScript or JScript (yes, classic ASP supported JavaScript/JScript on the server). Well-known top websites of the dotcom era like eBay and MySpace ran on Windows servers back then; Windows was a lot more popular on servers than it is nowadays. And old-school UNIX wasn't such a popular choice: name me five well-known dotcom-era startups that ran on UNIX (in 1998 to 2001) instead of Linux or Windows.
>> Today's Unix/Posix-like operating systems, even including IBM's z/OS mainframe version, as seen with 1980 eyes are identical;
I just thought I'd clarify this for those among us fortunate enough not to have been acquainted with z/OS: the author does not mean that z/OS is a Unix. Rather, he is referring to the fact that there is a Unix "mode" included in z/OS, Unix System Services, that can be started from inside the z/OS command line, TSO, to drop you to a Unix shell.
It's a full Unix OS, too, complete with, well, for example, ed. So you can experience the supernatural horror of getting stuck inside ed, inside the z/OS command line, on a mainframe. Which is a bit like being in hell and having a nightmare about being in hell and having nightmares.
Similar to DNA that has evolved to create living organisms. Each evolutionary step was selected because it worked, but the result is in no way optimized to be understood by human brains.
There's a hierarchy of systems:
* non-working, random, chaotic
* working, but not understood/understandable
* working, and clearly understood by at least 1 human brain
Clearly, each step in the hierarchy is less primitive than its predecessors. Also, since human brains are the greatest intelligence we know of, the 3rd bullet point seems like the pinnacle.
That is the sorry reality of the bazaar Raymond praised in his book: a pile of old festering hacks, endlessly copied and pasted by a clueless generation of IT "professionals" who wouldn't recognize sound IT architecture if you hit them over the head with it. It is hard to believe today, but under this embarrassing mess lies the ruins of the beautiful cathedral of Unix, deservedly famous for its simplicity of design, its economy of features, and its elegance of execution. (Sic transit gloria mundi, etc.)
I can say that, for the last seven years, I ran our little company as a Cathedral and not as a Bazaar.
We didn't import many packages in PHP. We wrote our own code and made sure it integrated well with everything else.
We didn't run package managers and didn't freeze versions. We wrote all our own stuff in the app layer.
I can tell you, it's been... interesting. We know at the end of the day we are responsible for fixing stuff if it breaks. Not waiting for someone else to approve our pull request.
Since Raymond's book, something major happened: DISTRIBUTED VERSION CONTROL SYSTEMS such as git became mainstream.
So now you can build cathedrals IN THE BAZAAR!
Anyone who wants to maintain package X can fork it and maintain it. Maintainers gossip their improvements in a sort of blockchain consensus protocol. If a maintainer falls behind, the package continues to be maintained by others.
What I would like to see is MORE emphasis on content-addressable streams of data with history, instead of on the domains / urls from which they are fetched.
Blame the “how” all you want but the primary blame is in the “who”.
The industry has grown much larger and more quickly than the talent base. If the NBA were to suddenly add 200 teams, the average quality of each team would go down.
How many “pretty good” programmers do you know now sitting in Director level or CTO positions? How many programmers with good aptitude find themselves being promoted prematurely as an organization struggles to keep that talent on staff?
The quality of the average software team has been greatly diluted, the industry shows too little regard for experience, and hackers end up running the show on so many software teams. Naturally, the software looks like it was coded by hackers as a result.
One of the things I like about open source is its fecundity. When proprietary software ruled, we went to two or three companies to get most of our tools and libraries. Some were very good, some were terrible. We often had to wait years for features and bug fixes. Some key pieces seemed to never go away even though no one liked them. For instance, IE6.
In the open source world, bug fixes often come much more quickly. If fixes don't appear, we at least have the source and can try to fix it -- though sometimes that's a high bar.
If there is an obvious need for a tool or library, someone implements it and throws it up on GitHub. Sometimes the implementation is great, sometimes it is little more than a starting point for our own solutions. But at least we have the source to act as a starting point.
Quality is important, and open source often encourages good quality, but as the writer points out it does not always do so. Sometimes it acts more like a neural net AI algorithm that keeps failing and failing until one day it succeeds.
What open source does encourage is rapid iteration as developers all over the world look for solutions to known problems. It's a messy solution, but it works surprisingly well. It also helps spread knowledge of what the nuts and bolts of good software look like.
If you're quoting numbers in percentage terms, and that term is more than 100%, there's probably a better way to express it, in this case, "100 times" (or 100X if you prefer).
The BSDs are known for having a more vertical management structure than other open source projects, so I guess the author is contrasting FreeBSD ports with the project itself. It would have been a stronger argument if it had been made explicitly.
OK, BSD ports have complex dependencies. That is optional; OpenBSD ports have relatively few dependencies. Is this a good thing, or does it lead to more duplication? Packaging is a hard problem. What does the development model have to do with the result? The Bazaar idea is about developing the software in the first place, not what happens to it afterwards.
Bazaar is what happens when you design by committee. Sometimes it just so happens that the project is small, the only committee members are rocket scientists, and the project is about building a new rocket and painting it black. Having worked on a project like that, I can assure you it is an absolute beauty. The code those guys shipped has been running in production on hundreds of millions of systems for nearly twenty years. It just works.

In the other cases, however, it is a project to build a new car with a new kind of square wheels, because the committee is a band of people in which some members have never seen a car, some think it should go on the tracks of Japan, some just like coloring spreadsheets, and some wandered in directly from a bar after happy hour. Oh, and don't forget that guy who did something in the security field for the US Army, who keeps insisting that the wheels should be designed to withstand an IED.

Most bazaar projects are the second kind, not the first.
What the author failed to mention was why any of this is bad. Yes, it's a mess, and I don't like it either. I like carefully planning things as well, and it's a personal source of frustration working for startup companies. But it's a mistake to assume that something well crafted and (in a sense centrally) planned actually interfaces well with the market. Maybe it does, but that argument actually needs to be made.
I too was that kind of engineer for years, one who had the desire to be a cathedral architect instead of a dusty, worn-out bazaar trader. But the truth is, the bazaar is more successful. It needs less system, less skill to join (more skill to master, though), less luck.
Because there are 1,200 libraries that do basically the same thing with slight variations, if something is good in 500 of them, a few of those will survive even a catastrophe like a bubble bursting.

And the really funny part is that it is also more scalable. Without architecture and oversight, many more of the integrations between these little, inexperienced silos fail. But because so many more can start, a few will still make it, because they were lucky enough to do the right experiment at the right time.

Surviving in such a jungle is of course harder, which is why we engineers, looking for a rather calm life, hate it so much. We want to focus on the code, not on survival. That's why we twist the facts and act as if the Bazaar were less successful, when everything actually shows that it really is more successful.
If you are like me, and really want to become successful, even if it means leaving your assumptions behind and accepting that you might've been wrong, then look out for people who survive quite well in this jungle, and learn from them. It's learning by doing though, since what you write into a book today might already be irrelevant tomorrow. And there are books with the more abstract concepts out there, but without feeling and doing it, you can't even understand them (but will think you do anyways).
Real life feels more like learning Vim than learning Microsoft Word, but in a similar fashion it also offers rewards that are unexpected and come in sizes one wouldn't even dream about.
Check out SQLite [0]. Scroll down to the "How It All Fits Together" section, which points to several docs explaining its different parts, architecture, etc.
libmpv has very well documented C interface [1], although I haven't had any occasion to use it. I haven't dug through the project source either, so I'm uncertain of its quality level.
Fuchsia [2] looks very promising. Not everything is documented yet, but many of the parts which have docs are great. It's a bit of a rabbit hole though; when I first encountered it I spent a long time reading through their docs and poking around.
This isn't a case of young whippersnappers not knowing what a cathedral looks like. This is a case of the system getting so large that the combinatorics are no longer surmountable.
If I'm just starting on a program and I realize that I zigged where I should have zagged, I just go through a handful of small files and make them zag. I believe Knuth did the same thing and rewrote TeX after his first attempt.
Once the system gets larger, updating every boundary and interface to something sane falls somewhere between costly and insane. Don't get me wrong, I'm all for a re-do, it's just that nobody in the private sector is going to foot that bill until we're in crisis mode, and the wizards who remember what the mistakes were aren't getting any younger...
When I read The Cathedral and the Bazaar I was genuinely surprised that it is not titled "The Cathedral _with_ a Bazaar". Both have a role, e.g. the Linux kernel is a cathedral while GNU software is the bazaar; neither of them would make much sense without the other.
> fork is the exact example of a structure that is only elegant from the outside but entirely unworkable otherwise.

Even back then CreateProcess() would have worked just fine.
[+] [-] cel1ne|8 years ago|reply
Bits don‘t fall apart.
If you can have a perfect copy of something you can easily keep it in existence forever.
If not, your artifact will sooner or later decay. It will only prevail through conscious action, being subject to incremental evolution of thought. Which copies are not.
[+] [-] revelation|8 years ago|reply
[+] [-] BuuQu9hu|8 years ago|reply
[deleted]
[+] [-] vidarh|8 years ago|reply
The problem is often that the economics does not favor the perfect. I'm working on a project for a client now where I know there are lots of ugly warts in the backend because we've been building abstractions in parallel with building the rest of the system and have not always had the time to go back and revise the old code.
I would love to go back and sort that out. But to the client it is an extra expense now, both in terms of money, but more importantly in time spent now, vs. paying for it increased maintenance costs for people to - hopefully - clean it up over time later.
But there's also often the other cost of learning and determining if a shared piece of code is appropriate.
E.g. there's about half a dozen Ruby gems for handling ANSI escape codes, but for a recent project I ended up writing my own, because none of them seemed to cover what I needed, and it's not clear they should. A lot of code duplication happens because the cost of trying to avoid it often far outweighs the cost of maintaining code you know is doing exactly what you want it to.
I do agree with his hate for the autotools family, though.
[+] [-] jasode|8 years ago|reply
One of the reasons I suggested that folks take a look at previous threads[1] is that phkamp clarifies his definitions of "cathedral != closed source / proprietary / commercial" and "bazaar != open source".
It's not obvious in the acmqueue essay but phkamp wants "cathedral" to be synonymous with "coherent architectural design". As a result, his example of having a bunch of open-source hackers add duplicate dependencies such as a TIFF library when the final build of Firefox doesn't even read tiff files is not "cathedral" and therefore not "quality".
The acmqueue article is not written very clearly because he uses metaphors "cathedral / bazaar" which makes people associate "cathedral" with entities like Microsoft which he didn't intend and not associate "cathedral" with pre-1980s commercial UNIX which he did intend. Obviously, he intentionally reused "cathedral/bazaar" because it was a refutation of Eric S Raymond's metaphors but nevertheless, recycling the same analogy sent multiple people down a road of imprecise arguments. Indeed, one of the comments in the 5-year old thread is an ex-Apple employee (jballanc) categorizing iOS development as "bazaar" but phkamp classifies it "cathedral". Talk about torturing metaphors until people talk right past each other!
[1] no need to read all ~300 comments, just Ctrl+f search for "phkamp": https://news.ycombinator.com/item?id=4407188
[+] [-] saas_co_de|8 years ago|reply
My observation has been that getting the first 90% of features/quality is pretty cheap, and then, as you try to close in on 100% for whatever your domain is, the cost of improving the software climbs asymptotically toward a ceiling somewhere short of 100%.
What people will pay for software seems to mostly depend on the cost of failure. 90% working software is almost always a better value proposition than 100% working software, unless your software failing means someone dies, or you place a trade that is off by $100 million.
In industries where the costs of failure are low (consumer, most internet, etc) the quality of software that the market will pay for is very low, and so the quality of the software being produced is very low.
[+] [-] charlieflowers|8 years ago|reply
At that point, work (in the Physics sense) is required to lift it back to the outer energy level -- that of both working and being understood.
In many cases, a reasonable option will be to leave it at the degraded level and continue to utilize it (after all, it still "works").
This all seems obvious to me for the first time, as I look at it from this perspective. But it has never dawned on me before, probably because every inch of my soul longs for the beauty of the highest level of the hierarchy -- "works and is understood".
[+] [-] flohofwoe|8 years ago|reply
[+] [-] mattbee|8 years ago|reply
I don't think the contrast he's drawing is bazaar/open source vs. architected/closed source. It's versus strongly opinionated & or clearly-led open source, e.g. varnish, qmail, Rails, sqlite etc.
[+] [-] api|8 years ago|reply
Windows is even worse. It's amazing it works at all. Linux is generally cleaner in the auth department but has many other abominations.
The only clean software is either software nobody uses or software whose scope is rigorously limited. The latter is very very hard to defend when people are using it. Everyone wants it to do some new thing or be compatible with some other system.
[+] [-] szemet|8 years ago|reply
In a large org it is never clear who is responsible for what... ;P And the result is sometimes even more code duplication, more workarounds, and worse examples of Conway's law in operation than in free software.
https://en.m.wikipedia.org/wiki/Conway%27s_law
[+] [-] erikb|8 years ago|reply
[+] [-] jasode|8 years ago|reply
Reading phkamp's comments in those threads adds more context to the ACM essay.
[+] [-] bshanks|8 years ago|reply
Too many comments to list here in:
https://news.ycombinator.com/item?id=4407188
Three comments in https://news.ycombinator.com/item?id=12251323 :
https://news.ycombinator.com/item?id=12253059 https://news.ycombinator.com/item?id=12253175 https://news.ycombinator.com/item?id=12253027
Two comments in https://news.ycombinator.com/item?id=8812724 :
https://news.ycombinator.com/item?id=8814319 https://news.ycombinator.com/item?id=8813888
[+] [-] ekidd|8 years ago|reply
> ...there is no escaping that the entire dot-com era was a disaster for IT/CS in general and for software quality and Unix in particular.
The dotcom era was almost 20 years ago. The Netscape IPO (1995) is now closer to the IBM PC (1981) than it is to 2017. Yes, a lot of terrible engineering happened during the dotcom boom, and most of it was proprietary code running on old-school Unix servers.
If anything, Linux was a pretty solid alternative in those days. Sure, the GUI was a bad joke, and driver support was hit or miss. But if you chose your server hardware well, then Linux would give you years of uptime. Windows fell over several times a week. (Microsoft didn't get truly serious about quality and security until sometime around XP SP2, when they finally got sick of malware and buggy drivers.)
So in 2017, what does a "bazaar"-style project look like?
One modern example of a "bazaar" is Rust. (If you dislike Rust, you can insert plenty of other examples here.) All major design decisions are made through RFCs, after community feedback. There's no benevolent dictator, but rather a core team. Several key libraries like "serde" are actually maintained by separate teams. The compiler is released every 6 weeks. And yet, I have absolutely zero complaints about Rust's QA. There are plenty of test suites, and no code gets merged to master until those tests pass. To prevent regressions, somebody downloads all public Rust packages from crates.io, compiles them, runs their test suites, and checks for regressions. And so at work, we can upgrade our Rust compilers without fear.
Now, to be fair, Rust is not for everybody. Not everybody wants "a better C++ minus the foot guns," or enjoys references and generic types. But if you do want those things, Rust is a good example of a community-driven project which delivers reliable, well-designed software.
And I could point out 20 more modern open source projects with solid design and solid QA. We know how to do this now.
And those inexperienced dotcom-era graduates now have houses in the suburbs. They're planning how to pay for their kids' college educations, and more than a few of them have some grey hair.
[+] [-] erikbye|8 years ago|reply
Then you should point them out, if they are of a standard you think people should learn from.
[+] [-] Tobba_|8 years ago|reply
[+] [-] kayoone|8 years ago|reply
[+] [-] frik|8 years ago|reply
That's not true. Linux was already very popular in 1998-2000, with RedHat and SuSE distros everywhere. Server software was usually written against the CGI API, in C, Perl or PHP. There were also Windows NT 4 and 2000 servers running classic ASP, with web software written in VBScript or JScript (yes, classic ASP supported JavaScript/JScript on the server). Well-known top websites of the dotcom era like eBay and MySpace ran on Windows servers back then; Windows was a lot more popular on servers than it is nowadays. And old-school UNIX wasn't such a popular choice either - name me five well-known dotcom-era startups that ran on UNIX (in 1998 to 2001) instead of Linux or Windows.
@downvoter: care to explain?
[+] [-] YeGoblynQueenne|8 years ago|reply
I just thought I'd clarify this for those among us fortunate enough never to have been acquainted with z/OS: the author does not mean that z/OS is a Unix. Rather, he is referring to the fact that z/OS includes a Unix "mode", Unix System Services, which can be started from inside the z/OS command line, TSO, to drop you into a Unix shell.
It's a full Unix OS, too, complete with, well, for example, ed. So you can experience the supernatural horror of getting stuck inside ed, inside the z/OS command line, on a mainframe. Which is a bit like being in hell and having a nightmare about being in hell and having nightmares.
[+] [-] charlieflowers|8 years ago|reply
There's a hierarchy of systems:
* non-working, random, chaotic
* working, but not understood/understandable
* working, and clearly understood by at least 1 human brain
Clearly, each step in the hierarchy is less primitive than its predecessors. Also, since human brains are the greatest intelligence we know of, the 3rd bullet point seems like the pinnacle.
[+] [-] shady-lady|8 years ago|reply
Not really. "Selected" is the wrong word - "survived" is more appropriate. Evolution isn't a set of preordained occurrences.
It's a random mutation which happened to survive - and that can be due to many reasons, only one of which is "it worked".
[+] [-] EGreg|8 years ago|reply
I can say that, for the last seven years, I ran our little company as a Cathedral and not as a Bazaar.
We didn't import many packages in PHP. We wrote our own code and made sure it integrated well with everything else.
We didn't run package managers and didn't freeze versions. We wrote all our own stuff in the app layer.
I can tell you, it's been... interesting. We know at the end of the day we are responsible for fixing stuff if it breaks. Not waiting for someone else to approve our pull request.
Since Raymond's book, something major happened: DISTRIBUTED VERSION CONTROL SYSTEMS such as git became mainstream.
So now you can build cathedrals IN THE BAZAAR!
Anyone who wants to maintain package X can fork it and maintain it. Maintainers gossip their improvements in a sort of blockchain consensus protocol. If a maintainer falls behind, the package continues to be maintained by others.
What I would like to see is MORE emphasis on content-addressable streams of data with history, instead of on the domains / urls from which they are fetched.
In other words, more IPFS and less HTTP.
That is the best of both worlds, and I describe it here: https://github.com/Qbix/architecture/wiki/Internet-2.0
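To make the content-addressing idea concrete: under such a scheme the identifier of a piece of data is derived from its bytes, so any host can serve it and the fetcher can verify it locally, with no trust in the URL or domain. A minimal sketch (plain SHA-256 standing in for IPFS's multihash-based CIDs; the `put`/`get` helpers are hypothetical, not any real IPFS API):

```ruby
require "digest"

# Content addressing in miniature: the "address" of a blob is a hash
# of its bytes. Real IPFS uses multihash-prefixed CIDs; plain SHA-256
# keeps the idea visible.
def content_address(data)
  Digest::SHA256.hexdigest(data)
end

# A Hash stands in for the network of peers holding content.
def put(store, data)
  addr = content_address(data)
  store[addr] = data
  addr
end

def get(store, addr)
  data = store.fetch(addr)
  # No trust in the host is needed: data must hash back to its address.
  raise "corrupt or forged content" unless content_address(data) == addr
  data
end

store = {}
addr = put(store, "hello, bazaar")
puts addr                 # same bytes always yield the same address
puts get(store, addr)     # prints "hello, bazaar"
```

The point for the cathedral-in-the-bazaar argument: once data is named by its content rather than its location, any fork's maintainer can serve the same history, and consumers can't be silently handed something different.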
[+] [-] trentnix|8 years ago|reply
The industry has grown much larger and more quickly than the talent base. If the NBA were to suddenly add 200 teams, the average quality of each team would go down.
How many “pretty good” programmers do you know now sitting in Director level or CTO positions? How many programmers with good aptitude find themselves being promoted prematurely as an organization struggles to keep that talent on staff?
The quality of the average software team has been greatly diluted, the industry shows too little regard for experience, and hackers end up running the show on so many software teams. Naturally, the software looks like it was coded by hackers as a result.
[+] [-] marcus_holmes|8 years ago|reply
[+] [-] pvg|8 years ago|reply
https://hn.algolia.com/?query=A%20Generation%20Lost%20in%20t...
[+] [-] ccalvert|8 years ago|reply
In the open source world, bug fixes often come much more quickly. If fixes don't appear, we at least have the source and can try to fix it -- though sometimes that's a high bar.
If there is an obvious need for a tool or library, someone implements it and throws it up on GitHub. Sometimes the implementation is great, sometimes it is little more than a starting point for our own solutions. But at least we have the source to act as a starting point.
Quality is important, and open source often encourages good quality, but as the writer points out it does not always do so. Sometimes it acts more like a neural net AI algorithm that keeps failing and failing until one day it succeeds.
What open source does encourage is rapid iteration, as developers all over the world look for solutions to known problems. It's a messy process, but it works surprisingly well. It also helps spread knowledge of what the nuts and bolts of good software look like.
[+] [-] weego|8 years ago|reply
[+] [-] parasubvert|8 years ago|reply
[+] [-] scoot|8 years ago|reply
If you're quoting a number in percentage terms and it comes out above 100%, there's probably a better way to express it - in this case, "100 times" (or 100x if you prefer).
[+] [-] wruza|8 years ago|reply
[+] [-] upofadown|8 years ago|reply
OK, BSD ports have complex dependencies. That is optional; OpenBSD ports have relatively few dependencies. Is this a good thing, or does it lead to more duplication? Packaging is a hard problem. What does the development model have to do with the result? The bazaar idea is about developing the software in the first place, not about what happens to it afterwards.
[+] [-] carlsborg|8 years ago|reply
[+] [-] notyourday|8 years ago|reply
In the other cases, however, it is a project to build a new car with a new kind of square wheels, because the committee is a band of people, some of whom have never seen a car, some of whom think it should run on the train tracks of Japan, some of whom just like coloring spreadsheets, and some of whom wandered in directly from a bar after happy hour. Oh, and don't forget that guy who did something in a security field of the US Army, who keeps insisting that the wheels should be designed to withstand an IED.
Most bazaar projects are the second kind, not the first.
[+] [-] AnIdiotOnTheNet|8 years ago|reply
[+] [-] orblivion|8 years ago|reply
Caveat: I haven't read Raymond's book myself.
[+] [-] erikb|8 years ago|reply
Because there are 1200 libraries that do basically the same thing with slight variations, if something good exists in 500 of them, a few of them will survive even a catastrophe like a bubble bursting.
And the really funny thing is that it is also more scalable. Without architecture and oversight, many more of the integrations between these little, inexperienced silos fail. But because so many more can start, again a few will make it, because they were lucky enough to do the right experiment at the right time.
Surviving in such a jungle is of course harder, which is why we engineers, looking for a rather calm life, hate it so much. We want to focus on the code, not on survival. That's why we twist the facts and act as if the bazaar were less successful, when everything actually shows that it is more successful.
If you are like me, and really want to become successful, even if it means leaving your assumptions behind and accepting that you might have been wrong, then look for people who survive quite well in this jungle, and learn from them. It's learning by doing, though, since what you write into a book today might already be irrelevant tomorrow. There are books with the more abstract concepts out there, but without feeling and doing it you can't really understand them (though you'll think you do anyway).
Real life feels more like learning Vim than learning Microsoft Word, but in a similar fashion it also offers rewards that are unexpected and come in sizes one wouldn't even dream about.
[+] [-] jl6|8 years ago|reply
[+] [-] TheAceOfHearts|8 years ago|reply
libmpv has a very well documented C interface [1], although I haven't had any occasion to use it. I haven't dug through the project source either, so I'm uncertain of its quality.
Fuchsia [2] looks very promising. Not everything is documented yet, but many of the parts which have docs are great. It's a bit of a rabbit hole though; when I first encountered it I spent a long time reading through their docs and poking around.
[0] https://www.sqlite.org/src/doc/trunk/README.md
[1] https://github.com/mpv-player/mpv/blob/master/libmpv/client....
[2] https://fuchsia.googlesource.com
[+] [-] jstewartmobile|8 years ago|reply
If I'm just starting on a program and I realize that I zigged where I should have zagged, I just go through a handful of small files and make them zag. I believe Knuth did the same thing and rewrote TeX after his first attempt.
Once the system gets larger, updating every boundary and interface to something sane falls somewhere between costly and insane. Don't get me wrong, I'm all for a re-do, it's just that nobody in the private sector is going to foot that bill until we're in crisis mode, and the wizards who remember what the mistakes were aren't getting any younger...
[+] [-] patkai|8 years ago|reply