OK, I'll make a couple of general observations here.
First: It would be a big help for this discussion if we could have the informal convention that people who were employed in an IT job before 1990 mark their posts. I think it would show a quite clear divergence of attitude.
Second: It's very obvious that a lot of you have never been anywhere near the kind of software project posited in the "cathedral" meme; instead you project into the word whatever you have heard or feel or particularly hate. That's not very helpful, given that there is an entire book defining the concept (I believe it's available online on ESR's homepage; how about you read it?)
Third: No, I'm not of the "either you are with us, or you are against us" persuasion. The bazaar is here to stay, but having everybody in need of transportation buy the necessary spare parts to build a car is insane.
Fourth: Related to point two really: A lot of you seem to have little actual ambition of making things better, I guess that is what happens if you grow up in a bazaar and never even experience a cathedral. I pity you.
First: Have you ever looked at a plot of people for/against marriage rights for gays vs age? I suspect that the clear divergence does exist, but may not mean what you think it means...
Second: I worked on OS X. In fact, I worked on Snow Leopard, and the start of Lion. What's interesting about that is that Snow Leopard was the last version of OS X developed according to the "Cathedral" model. Also, while the Snow Leopard cathedral was being built, iOS was being developed firmly using the bazaar model...
Third: You know, I wonder if you've ever been to a bazaar before? I live in Turkey, where the bazaar is a way of life (and where one of the largest, oldest bazaars in the world is located). I've never found "spare parts" at a bazaar. What I have found is some of the highest quality jewelry, tapestries, rugs, and other hand-made goods you'll find anywhere.
Fourth: I think you're conflating "good quality"=>Cathedral and "poor quality"=>Bazaar. The only thing that distinguishes the Cathedral and the Bazaar is whether or not there is one single individual in whose head the only valid vision of the completed project exists. You might do well to read up a bit on the history of the Kapalıçarşı. Throughout its history there were guilds to enforce authenticity and all manner of quality control mechanisms. It is possible to have a Bazaar and a very high quality product.
Pretty much all your arguments about the Cathedral model being superior fail in the face of the most common cathedrals known to man: Microsoft Windows and Office.
Their inscrutable beauty is buried under tons of libraries nobody will ever touch for fear of breaking 20 years of development efforts, exactly like what happens in the Unix world. Their move to the 64-bit world was painfully slower than what their fellow merchants accomplished in the Unix bazaar. They still provide compatibility layers for programs built with technologies that have been thought of as extinct, like monks still praying to the gods of ancient Greece. Whenever they went for the "total reuse" mantra, they built terrible and insecure specifications (DCOM) that still saddle us 20 years later. And let's not even talk about portability, which is anathema: to each his own Cathedral and his own Faith, touch ye not any unbelievers!
So yeah, making mistakes and keeping the cruft around is something every long-running IT project can experience. Unixes are, arguably, the longest-running of them all, so they naturally tend to show it more. Besides, there's a whole new world of applications to be built out there; if we were rewriting libtool every three months we'd move even more slowly than we do now.
<ad-hominem>Oh, and I wish I could say your rant is unbefitting of professionals of your age, but I'm afraid it's actually quite matching the grumpy-old-man stereotype you're clearly striving for.</ad-hominem> Hey look, I can do ad-hominem too, and I was born in the late 70s!
Ok, pre-1990 person here, and the piece resonated quite strongly with me. But I note that there is a third axis which isn't well covered, which is 'volunteer' vs 'paid'.
It is important to note the distinction between FreeBSD's package system and, say, Debian's apt. In FreeBSD I can build some package from source, while in Debian I can apt-get install a package; because the Debian packages are prebuilt, it just comes over in several chunks and doesn't need to build. (Yes, you can pull prebuilt packages for FreeBSD too.) But my point is that the packaging system of FreeBSD, as used, conflates building the packages and using the packages.
So if I write a perl script that goes through the source code and changes all the calls to fstat() to match your configuration, then to build I need perl even though you do not need perl (as an example). But to run, I don't care whether you have perl or not.
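A minimal sketch of that build-time-only dependency (the fstat64() rename and the file names are made up for illustration):

```shell
# Hypothetical build step: the build machine needs perl, the target
# machine does not. Rewrite fstat() calls to a (made-up) fstat64()
# variant chosen by configuration, before compiling.
mkdir -p src build
printf 'int r = fstat(fd, &st);\n' > src/io.c
perl -pe 's/\bfstat\(/fstat64(/g' src/io.c > build/io.c
```

The compiled binary that ships from `build/` never runs perl; only the build itself does.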
But let's get back to the volunteer/paid thing again. People who volunteer rarely volunteer to clean the shit out of the horse stalls; no, they volunteer to ride in the races and build a new horse. So you end up with a lot of stuff lying around that you don't want.
Sadly for operating systems, and the original rant is really about operating systems, there really isn't a cathedral/bazaar model; it's more of a democracy/feudalism kind of thing. Nobody 'owns' the final user experience for the OS in FreeBSD/Linux discussions.
I think a lot of people, especially around here, eschew college education in the computer sciences because you can get a job without it and you can build a website without it, but I really think the decline of formal education in computer programming concepts has led to a lack of cathedral thinking.
Of course, I'm biased, because I have such a degree. However, when I compare what I build to what is built by a business school grad who thinks anyone can learn computers, so everyone should be taught business, I really cringe at the loss of formality in the industry.
Needing to build a computer processor from AND and OR gates really drives the concepts home. Building an operating system in C++ really drives the concepts home and creates a framework for thinking about computer based problem solving that's lost in many of the systems I look at today.
Started in IT in 1980. I'm 50 this year. First FOSS contributions were in Nov 87 (patches to gcc/gdb/emacs for convex machines). This pre-dates any contribution by ESR, and it enrages him when I point it out. :-)
My company ships quite a bit of FreeBSD (as pfSense). Could have gone linux, but linux is a mess (much worse than ports.)
I think OpenBSD is a mistake, at best it belongs as a group focused on security inside the netbsd project, but of course, Theo got kicked out of netbsd, thus: OpenBSD.
I'm also the guy who appointed Russ Nelson the "patron saint of bike shedding." Just FYI. :-) (None of Eric Raymond, Russ Nelson or Theo de Raadt like me much.)
The first time I read of 'Cathedral & Bazaar' I thought ESR was illustrating and contrasting the BSD vs. linux development models. Only later did I understand that he was pointing fingers at GNU/FSF, not BSD.
To reply to point three, which is the only one not containing an ad hominem...
There exist several larger OSS projects, such as Apache, Boost, the kernel, etc., which accept contributions but are also curated. Thus they represent a sort of hybrid between cathedral and bazaar. People who use these projects know they are getting some (varying) standard of quality.
I think these sorts of projects -- often shepherded by some kind of noncommercial Foundation or Organization -- are the best way to get a mix of openness and quality going forward.
Why don't you talk about a few current "Cathedrals" and how they're different from "Bazaars"? That might help the discussion too.
The only example that jumps out at me is the original Unix, and I think you'll agree that comparing software from 30 years ago that does vastly less than ... pretty much anything out there these days is not an entirely fair, nor useful comparison.
> Fourth: Related to point two really: A lot of you seem to have little actual ambition of making things better, I guess that is what happens if you grow up in a bazaar and never even experience a cathedral. I pity you.
I think people do want to make things better, but it's happening in a much more decentralized way, with small teams taking small, safe steps on tools, libraries & frameworks.
Media, on-line banking, 3d printing, accessible hardware hacking, e-commerce sites, scientific/engineering software. The majority of these things are made by small teams plugging the best libraries and platforms out there together, where they already solve part of their software problem. Android, Cloud, Linux etc. It would be silly to say these were without ambition and don't contribute to the community.
(The average software dev's Impact Factor may be diluted by the increasing number of people working with computers, but computers are so globally useful it's inevitable. If good things are still getting made then who cares?)
Instead of giving us pity and RTFM, it would help if you clearly specify what you think are the defining and distinguishing features of the cathedral and the bazaar models. There seems to be a lot of confusion around that.
In this comment I will equate "Rug Market" with "Ready, Fire, Aim" and "Cathedral" with "4 Year Plan".
Firstly, the "Rug Market" beats the "Cathedral" when you haven't formulated the problem properly, and so you have bad specs.
Secondly, the "Rug Market" beats the "Cathedral" when bad software is more profitable than good software. Google for "Worse Is Better" and "The Innovator's Dilemma".
As someone who grew up as a programmer with assembly language, C, and C++, learning the good practices needed to make a million-line C++ application work and be maintainable:
I've been equally disgusted by the evolution of programming in the last few years. HTML - which is, to a first approximation, always invalid. JS - which needs jQuery to make it mostly-but-not-completely cross-compatible among browsers. CSS, which hides so many hacks on top of each other, and which even needs a reset to be compatible. And now - distributed systems with pieces in PHP, Python, Objective C, Dalvik-Java, etc... sustained by awesome-but-hackish fixes like Varnish and FastCGI and Nginx, where everything seems to be put together with duct tape.
But I'm slowly getting to the next stage, and having to admit - no, actually, admitting - that if it has spread like fire, there must be something to it, no matter how much it hurts our taste.
And if you look closely, it's very similar to biological evolution. Our own genetic code and body plans are full of ancient pieces that are not needed any more (or at all - male nipples, anyone?), but it seems that it was more economical to "patch" things than to fix them properly. Or at least, it didn't hinder the current designs enough that they wouldn't succeed over the alternatives evolution probably also tried.
And now Google translates text statistically, without even trying to understand things.
One fear I have is that we will be able to create a working AI in a few years... and due to the way we do it, we may even not understand how it works.
Nitpick: jQuery exists to make the HTML DOM manageable, not JavaScript the language. (There are libraries targeted at JavaScript the language, but you could make an argument that that's the point of all libraries for all languages...)
I find jQuery more analogous to tools like configure and autoconf. (in that they aren't actually needed for "modern", standards compliant browsers.) As an example: there are already many lightweight drop-in replacements that assume a sane browser to begin with.
CSS is remarkably hack-free and the resets are just to remove the default styling that browsers have built in. Obviously browsers need builtin styles otherwise all the old pre-css pages would stop working.
And why are you calling programming languages and web servers hackish? How is nginx hackish and apache not? How is Objective C hackish and C++ not? Why stop inventing new languages at assembly, c or c++?
Other than that I generally agree with the rest of your comment. I think if we ever get to some kind of technological singularity we'll almost certainly, and just about by definition, not understand how it works. And true/strong AI would definitely be a singularity. Even if we understood version 1, we would likely not be able to understand whatever it dreams up 5 seconds after we overclock it. And we will - Moore's law and all that. ;)
>And now Google translates text statistically, without even trying to understand things.
This comment seems to mostly be, "I'm going to complain about software disciplines and lines of CS research that I don't understand because their practical approaches don't conform to my finicky aesthetics."
If you want to formulate Strong AI to solve NLP as a problem, go for it. The rest of us are happy to use Google as it is.
This article reads like an old timer feeling left behind by the current rate of progress who thinks that the problem is really that the rest of the world is doing it all wrong.
He's probably right that lots of software could be designed better, but I think he's wrong that that's of paramount importance. We need lots of software these days and we simply don't have the resources to build it to Kamp's standards. Also, we've learned that our requirements change so fast that his beautiful design would quickly be twisted into the pile of hacks that he hates.
He notes how long it takes to compile the software that runs on his work machine---how long would it take to compile the software on a windows machine? (Or whatever he would claim is the standard-bearer of his cause).
He also attacks open-source software. The truth is, we have an amazing amount of free and open source software available. Some of it may be flawed in design or usability, but it enables us to solve so many problems (and look at/modify the software when needed). I don't think that all software needs to be "free", but I do think free software has made us all much richer. I have a hard time seeing all this value that the Bazaar has created as inferior to slow-moving, centralized, big, up-front design development.
I wonder if you bothered to check who wrote that article before you started speculating about things you could have found out with a few Google searches?
You seem to assume that cathedrals are "slow-moving, centralized, big up-front" designs. Where did you get that idea?
Ever looked into how the USA put a man on the moon?
Maybe you should. Also: read Brooks's book, if you can.
Some software is extensively designed up front, like embedded avionics programs. The resources are made available because folks like the FAA demand it.
Why don't we employ rigorous software engineering principles to, say, iOS games? It would be a waste. It could be done, but in practice, people just don't care as much if a game on their phone crashes as they do if an airplane they're riding in crashes.
There's a wide spectrum of software out there, needing varying levels of robustness. There's not a lot that can be said about software development as a whole along these lines; inadequate design for an online banking site might be excessive design for a llama costume competition voting site.
Architects don't build cathedrals anymore, either. Neither do most software architects. Yet, look what they have built: an incredibly diverse collection of structures of all shapes and sizes, all working together to form an imperfect yet efficient system. Surely its success is due in part to its flexibility and imperfection, and the fact that they are no longer over-engineered and inflexible behemoths made from stone.
If you try too hard to design a rigid structure around anything, it can come back and bite you in the ass. Part of the reason why this seemingly disorganized and haphazard collection of buildings makes a working and thriving metropolis is because of its diversity and resilience to change and progress. If you attempt to over-engineer something of such complexity, you might just end up with Pyongyang instead of New York. Instead, we have a competing ecosystem of libraries and options with the good ones theoretically rising to the top. Better communities and communication of these values make this process work even better, just as in the greater economy. For all its complexity, this process works surprisingly well.
For all the explaining Kamp did in this article, the one thing he failed to account for was the resounding success of the current model. He did use a surprisingly apt metaphor, however. Cathedrals are no more; their old place as the center of society had to be explained by a series of myths and lies that placed ideals above reality. The thriving Bazaar easily replaced them at the center of any thriving modern city, based on the simple truth that the dynamic edge of reality was ever-changing, and that quality based on that reality would indeed be more successful.
Could it be better? Of course. But it is reality and truth that will move us forward—not mythology. Good riddance to cathedrals.
Since 2001, we've started in on the XP/Agile movement, which, think what you will of it, test-drives and version-controls relentlessly, and fosters a constant dialog about what quality is and how to achieve it.
Furthermore, some marvelous tools have been written in the last decade; I can hardly see how the author can complain about version-control systems, as Git and Mercurial--both miles better than what was available in 2001--are not least among them.
Other quality-enhancing things that have happened since 2001: continuous integration, build pipelines, Selenium and friends, behavior-driven development, REST architecture triumphant over the cathedral-like SOAP and XMLRPC, JSON over XML, lightweight messaging, the adoption of small, focused open-source tool over "quality-focused" large vendor bullshit-enterprise tools.
To the extent that UNIX sucks now, which it doesn't, it's because the hackers who work on it now have lagged the industry--not the other way around. So go find some other group of kids to shoo off your lawn.
You mean: "In 2001 we reinvented XP/Agile because we couldn't be bothered to read the old literature to see if somebody else had done something like that before"?
I'm another grumpy old man, much like I imagine the OP. I have mixed feelings about this topic.
I found the autoconf comments amusing, we have supported a pretty broad range of platforms, from arm to 390's, IRIX to SCO, as well as Linux, Windows, MacOS, and our configure script is 157 lines of shell. Autoconf had its place but I think it was much more useful in the past and now it is baggage. The fact that it is still used (abused?) as much as it is sort of speaks to phk's points.
I think the jury is still out on whether the bazaar approach is better or not. It sure is messier but it also seems to adapt faster. I worked at cathedral places like Sun, and while I continue to emulate their approach I also question whether that approach is as nimble as the bazaar approach.
I've voted for the older style of more careful development and I think it has worked pretty well for us, we can support our products in the market place and support them well. The cost of that is we move more slowly, we tend to have the "right" answer but it takes us a while to get there. Bazaar approaches tend to come up with answers of varying quality faster.
It's really not clear to me which way is better. I'd be interested in hearing from someone / some company that is supporting a lot of picky enterprise customers with some sort of infrastructure product (database, source management, maybe bug tracking) and making a success of it with a bazaar approach. Seems like it would be tough but maybe I'm missing some insight.
The odd part about this “get off my lawn” article is that PHK has already shown how to fix the problem: Varnish is both a very good tool and one which has gotten attention and compliments for rejecting obsolete convention (e.g. relying on the VM, requiring a C compiler to be installed on a server, etc.). I would love to see autoconf massively simplified or outright avoided for most projects and the best way to do that would be to start providing good examples of how unnecessary it is even for major projects.
The way to address oversights in the bazaar model isn't to cram everyone back into the cathedral but to build support for change by showing where something is clearly better.
One area where PHK might see this is the way Linux has flown past FreeBSD - not due to endlessly-debated questions of kernel superiority but rather because Linux distributions like Debian provided a clearly superior experience for the overall system by rejecting the decades of accumulated hacks (e.g. the ports system, monolithic config files, etc.) which had been the status quo for years and building better tools to reduce management overhead.
>Linux has flown past FreeBSD . . . because Linux distributions like Debian provided a clearly superior experience for the overall system by rejecting the decades of accumulated hacks
Debian contains tons of accumulated hacks that I wish would be rejected.
Linux has flown past BSD for reasons other than technical superiority, and more to do with the politics and ecosystem of software, especially related to lawsuits, uncertainty, network effects and momentum.
A very thought-provoking article. (Been programming for pay since 1966 here.)
A lot of the commentary seems to be confusing the waterfall model with the idea of the cathedral.
Another way to make this point is to note how few programmers these days read Dijkstra, or Knuth's TAOCP. (Knuth would have us believe that even he doesn't read it. See Coders At Work.) Among other things, Dijkstra taught understanding the entire program before setting down one line of code. Contrast this with TDD (which, believe it or not, some take to mean Test Driven Design).
Lately I have been in the Application Security business, and nowhere has the issue highlighted by the article been more obvious.
Edit: Mark Williams Company, inventor of Coherent OS, was not a paint company. It started out as Mark Williams Chemical Company, manufacturing Dr. Enuff, a vitamin supplement.
Yes, the bazaar is like evolution: messy, inefficient, and slow. Lots of bad ideas are tried; a lot of them stick around for as long as they provide more value than they subtract; and progress takes a long time. Just as the human body has components that are useless today (e.g., the coccyx), evolving software ecosystems always carry a lot of useless baggage. That's how evolution works.
But evolution copes better than intelligent, top-down design with the evolving constraints of a market landscape that is constantly shifting.
Lambasting libtool for providing a consistent experience across star-NIX is, imo, not the wisest move for a FreeBSDer.
Article: This is a horribly bad idea, already much criticized back in the 1980s when it appeared, as it allows source code to pretend to be portable behind the veneer of the configure script, rather than actually having the quality of portability to begin with. It is a travesty that the configure idea survived.
Good high-minded notions here. But configure, with its standardized parameters for how to do stuff, is near irreplaceable at this point. Certainly a more stripped-down version, one not built on M4, would be wise, but libtool/autoconf itself is used too broadly and too familiarly by developers and upstream maintainers: in spite of so much of it being indeed old, deprecated, no-longer-useful cruft, the best we can hope for is duplicating a wide amount of the existing functionality in a cleaner manner.
But at what cost would reimplementation come? How many weird build targets would break unnoticed for months or years?
The place where we escape these painful histories is where we leave the old systems programming languages behind. Node's npm I'd call out as a shining beacon of sanity, enabled by the best, most popular code distribution format ever conceived: source distribution, coupled with a well-defined, not-completely-totally-batshit algorithm for looking for said sources when a program runs and at runtime goes off to find its dependencies:
http://nodejs.org/docs/latest/api/modules.html#modules_all_t...
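Roughly, the lookup walk that page describes can be sketched in shell (a hypothetical rendering for illustration, not Node's actual code):

```shell
# Roughly how Node finds a dependency: starting from the requiring
# file's directory, check ./node_modules for the package, then the
# parent directory's node_modules, and so on up to the filesystem root.
find_module() {
  dir=$1; name=$2
  while :; do
    if [ -d "$dir/node_modules/$name" ]; then
      echo "$dir/node_modules/$name"
      return 0
    fi
    [ "$dir" = "/" ] && return 1   # reached the root: not found
    dir=$(dirname "$dir")
  done
}
```

So a module nested deep inside a project still finds a dependency installed once at the project root, which is what makes source distribution workable.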
I think FreeBSD, and the Linux distributions do try to cater to too many different people, and quality and coherence suffers a lot from this. I think we can get past this though. The culture of testing and good code is on the ascendant again in many quarters. You need more people to understand build, packaging and distribution better, sure. You also need autotools to die, as the use cases for it are mainly dead. You can generally write portable code to the systems that matter if you want to now, and it just works.
A lot of the problems are due to poor integration between languages, so for example the JVM people have reimplemented almost everything, as have the C++ people.
"Later the configure scripts became more ambitious, and as an almost predictable application of the Peter Principle, rather than standardize Unix to eliminate the need for them, somebody wrote a program, autoconf, to write the configure scripts."
I'm not sure I understand how the Peter Principle applies here? Autoconf seems a rational solution in the economic sense: to standardise Unix, you need to have lots of influence to effect buy-in, whereas to write autoconf, you need a big dose of hacking talent. If you have the latter and not the former...
I had a hard time understanding it as well. I interpreted it as the Peter Principle also applying to software rising to the level of its incompetence, not just people.
Right, libtool is the same - the people that wrote it weren't in a position to demand that all the UNIX-likes out there standardise their ld flags, so they routed around the problem instead.
Autoconf is an easy target for these kind of rants, but you know what? It does its job, and it does it very well. The ratio of autoconf to non-autoconf programs on my system is probably 10:1, but the ratio of build problems is something like 1:20.
If anyone ever managed to write a genuinely better build system, the bazaar would let it rise to the top; the gradual rise of e.g. cmake is testament to this. Trying to impose one solution top-down, e.g. the LSB standardization of RPM, has a far worse track record than letting the bazaar do its thing.
When OpenBSD replaced GNU libtool with a home-grown perl version, it was so much faster that I believe it literally cut days of machine time off a full ports build. For smaller packages, with tiny C files, running libtool.sh takes longer than running the compiler does. The majority of build time for some of those packages is still running configure, testing for things like <stdio.h>, for which the package provides no workaround when missing. The OpenBSD project alone has spent years of machine time running configure and libtool.
As for doing its job well, the failure mode of configure "you can't build this" is abysmal. Just give me a fucking Makefile, I'll fix it myself. I love packages that come with Makefiles that don't work. I pop "-I/usr/local/include" into CFLAGS, run make again, and boom. Done. Trying to do the same with configure? Forget about it. --with-include-dir or whatever doesn't work, because it's really running some bogus test in the background which expects /bin/bash to exist and so on and so forth.
I hear your statement about build problems as soon as someone does not use autoconf quite a bit.
However, in the past I’ve had the opposite experience: Trying to port software such as Apache, PHP or bacula to UNIX systems such as SGI IRIX, I always ended up writing a simple Makefile to compile the software instead of putting up with the multitude of autotools-fixing that would have been required. I reported one or two clear issues upstream and they have been fixed, but until the relevant fixes arrive at the projects (especially PHP came with an old version of autotools), some time will pass.
As a counter-example, take i3-wm: it ships with a GNU Makefile (OK, multiple Makefiles in subdirectories, one for each tool) and compiles on Linux, Mac OS X, FreeBSD, OpenBSD, NetBSD. Now let’s have a look at each of these:
I would argue that porting i3-wm to another platform is easy because you can understand the Makefiles and it’s very clear what they do.
As a conclusion, I just wanted to show you that there are counter-examples to both: situations where autotools really does not do a good job and situations where you can deliver good Makefiles without using autotools at all.
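As an illustration of the kind of hand-written Makefile being argued for, a minimal one might look like this (a hypothetical project, nothing i3-specific):

```make
# A minimal, readable Makefile: every variable can be overridden on
# the command line (e.g. `make CFLAGS="-I/usr/local/include -O2"`),
# which is exactly the fix-it-yourself workflow porters want.
CC      ?= cc
CFLAGS  ?= -O2 -Wall
PREFIX  ?= /usr/local

mytool: mytool.c
	$(CC) $(CFLAGS) $(LDFLAGS) -o $@ mytool.c

install: mytool
	install -m 755 mytool $(DESTDIR)$(PREFIX)/bin/mytool

clean:
	rm -f mytool

.PHONY: install clean
```

When something like this breaks on a new platform, the fix is usually one visible variable away, rather than buried in generated configure machinery.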
It doesn't change the fact that it's terrible and that better alternatives exist (for instance, CMake), thereby making it yet more cruft from the past.
Yeah, but cmake (with roots to GE corporate development and large government contracts) isn't exemplary of bazaar development. If anything it underscores the inferiority of the bazaar.
I write this as a thankful and proud user of Varnish, which =I assume= phkamp would place as a cathedral type of software. I agree with: quality requiring someone being responsible, the importance of code reuse, and the observation of a full 10 years of dumb copy/paste (though I believe in copy/paste). Eric Raymond's distinction of Cathedral/Bazaar is somewhat a useless dichotomy for the problems mentioned. Here are 2 examples of how Varnish could do better (independent of being developed as bazaar or cathedral):
1) Varnish does not support SSL, which complicates deployment architectures; part of the blame goes to the lack of a good SSL implementation to copy from, or from which to write an SSL proxy: https://www.varnish-cache.org/docs/trunk/phk/ssl.html I find this an example of why neither bazaar- nor cathedral-based approaches have yielded something "acceptable" in the past 15 years to copy and paste from.
2) Varnish VCL, as a DSL, looks like an obscure M4 macro language and its functions are not well planned; another guy (from the bazaar) would design some more coherent language.
=> To my mind the culprit is the scarcity of monetization opportunities for infrastructure components, which pushes talent (candidates for "being responsible") elsewhere, to where the money is. I mean, nginx/varnish developers should earn 100s of millions, according to their merit. People are making billions by using ruby, python, rails, sinatra, django, debian, varnish, nginx, openssl, and hundreds of libraries built with libtool... but their everyday creators don't get enough reward back to feel responsible. Passion and hobby, benevolence or PR are the only drivers for most infrastructure and library development. Some developers lose their faith within a man's lifespan, fight back or quit in full despair.
A restatement of the old dichotomy: the rebel alliance of young, creative, anarchist hackers who are more interested in fun versus the empire of old, methodical, hierarchical business-programmers who bean-count every line of documentation and analysis.
I doubt the dichotomy is between the cathedral and the bazaar.
I hadn't realized that autoconf used M4. M4 is amazingly hard to work with, mainly because it uses ` and ' as delimiters, making it very hard to even read unless you have syntax highlighting set up to show those quotes as different characters.
It's a matter of perspectives. The article author's is that of taste; the bazaar folks' (including web startup folks) is that of practicality. To illustrate using the same example from the piece: so what if a bunch of crypto code is copy/pasted? It's all out in the open - if anyone ever comes up with an actual problem with the copy/pasted code, how hard can it be to fix?
People are confusing two different axes along which software can be classified.
Software License metric: [F] Free and Open Source, [P] Proprietary/Closed source
This metric can also be modeled as a continuous variable rather than a discrete variable. But let us stick to two values for simplicity.
Development model metric: ranges from extremely [C] Cathedral-type, .........., to extremely [B] Anarchism/Bazaar-type
FOSS proponents don't care about development model as long as it's FOSS.
Let [x][y] denote the Software License metric (x) and Development model metric (y) of a software project.
Observations:
1. [P][C] is the combination that FOSS proponents hate the most.
2. [x][B], where x ∈ {F, P}, is less peer-reviewed (anyone can commit anything), so there is less accountability/responsibility; it is highly decentralized, so there is no guarantee of quality.
2.1. [P][B] sounds like a contradiction!
2.2. [F][B] Poul-Henning Kamp seems to have problems with this kind of setups.
3. [F][y] where y -> B (i.e. closer to [B] than to [C]). Mostly the same as [F][C], except that it is mildly better.
4. [F][C] In this setup, the cathedral authority is the bottleneck in improving the project.
5. [F][y] where y -> C (i.e. closer to [C] than to [B]). This kind of setup "works like a charm!" See the overall success of GNU/Linux in the industry! There are some people who act as maintainers of Linux, but come on, you too can become one! The more popular such software is, the more thoroughly it is reviewed. "Given enough eyeballs, all bugs are shallow."
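For concreteness, the grid above is small enough to enumerate mechanically. A throwaway sketch (Python, purely illustrative; the names and labels are mine, not the commenter's):

```python
from itertools import product

# The two axes from the scheme above. The development-model axis is really
# a continuum from [C] to [B]; two discrete values keep it simple.
LICENSES = {"F": "free/open source", "P": "proprietary/closed source"}
MODELS = {"C": "cathedral", "B": "bazaar"}

def grid():
    """Enumerate every [x][y] combination in the notation."""
    return [f"[{x}][{y}]" for x, y in product(LICENSES, MODELS)]
```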
This comment thread is a bit depressing, but ignoring that.
One of the bits about the piece that has me scratching my head is whether the mess that is dependency management in OSS operating systems (and OSS software distribution models generally) matters _enough_. While I very much share the author's reaction of "can't we do better?", it also feels that fixing that mess isn't so much a design exercise as a social one, because OSS isn't really a bazaar but a constellation of connected, more-or-less-cathedralish bazaars. And I don't know that this mess is deeply problematic (although the author does find some egregious issues).
Design happens at a specific scale, and some scales reward investment in design more than others. Designing a chair that can be mass-produced is more effective than designing a room, which can't. One could argue that designing a self-contained piece of software (e.g. the Python runtime) matters more than designing a deliberately open system (e.g. the ecosystem of Python libraries), and that the alternative (random competition, forking, etc.) is Good Enough.
[for carbon dating: had my first IT job in the early 80s, as a teenager]
[+] [-] toyg|13 years ago|reply
Their inscrutable beauty is buried under tons of libraries nobody will ever touch for fear of breaking 20 years of development efforts, exactly like what happens in the Unix world. Their move to the 64-bit world was painfully slower than what their fellow merchants accomplished in the Unix bazaar. They still provide compatibility layers for programs built with technologies that have been thought of as extinct, like monks still praying to the gods of ancient Greece. Whenever they went for the "total reuse" mantra, they built terrible and insecure specifications (DCOM) that still saddle us 20 years later. And let's not even talk about portability, which is anathema: to each his own Cathedral and his own Faith, touch ye not any unbelievers!
So yeah, making mistakes and keeping the cruft around is something every long-running IT project can experience. Unixes are, arguably, the longest-running of them all, so they naturally tend to show it most. Besides, there's a whole new world of applications to be built out there; if we were rewriting libtool every three months we'd move even more slowly than we do now.
<ad-hominem>Oh, and I wish I could say your rant is unbefitting of professionals of your age, but I'm afraid it's actually quite matching the grumpy-old-man stereotype you're clearly striving for.</ad-hominem> Hey look, I can do ad-hominem too, and I was born in the late 70s!
[+] [-] ChuckMcM|13 years ago|reply
It is important to note the distinction between FreeBSD's package system and, say, Debian's apt. In FreeBSD I can build a package from source; in Debian I can apt-get install a package, and because Debian packages are prebuilt it just comes over in several chunks and doesn't need to build. (Yes, you can pull prebuilt packages for FreeBSD too.) But my point is that the packaging system of FreeBSD, as used, conflates building the packages and using the packages.
So if, as an example, I write a perl script that goes through the source code and changes all the calls to fstat() to match your configuration, then I need perl to build, even though you do not need perl to run the result.
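(FreeBSD's ports framework does draw this line, via separate BUILD_DEPENDS and RUN_DEPENDS lists in a port's Makefile.) A toy sketch of the distinction being made here, with made-up package names:

```python
# Toy model of the build-vs-run split: the tools needed to produce a
# binary need not appear among its runtime dependencies at all.
# Package names below are illustrative only.
EXAMPLE_PORT = {
    "build_depends": {"perl", "make", "cc"},  # needed only while building
    "run_depends": {"libc"},                  # needed on the user's machine
}

def depends(stage, port):
    """Return the dependency set for stage 'build' or 'run'."""
    return port[f"{stage}_depends"]
```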
But let's get back to the volunteer/paid thing again. People who volunteer rarely volunteer to clean the shit out of the horse stalls; no, they volunteer to ride in the races and build a new horse. So you end up with a lot of stuff around that you don't want.
Sadly for operating systems (and the original rant is really about operating systems) there really isn't a cathedral/bazaar model; it's more of a democracy/feudalism kind of thing. Nobody 'owns' the final user experience for the OS in FreeBSD/Linux discussions.
[+] [-] chimi|13 years ago|reply
Of course, I'm biased, because I have such a degree. However, when I compare what I build to what is built by a business school grad who thinks anyone can learn computers, so everyone should be taught business instead, I really cringe at the loss of formality in the industry.
Needing to build a computer processor from AND and OR gates really drives the concepts home. Building an operating system in C++ really drives the concepts home and creates a framework for thinking about computer based problem solving that's lost in many of the systems I look at today.
[+] [-] gonzo|13 years ago|reply
My company ships quite a bit of FreeBSD (as pfSense). Could have gone linux, but linux is a mess (much worse than ports.)
I think OpenBSD is a mistake, at best it belongs as a group focused on security inside the netbsd project, but of course, Theo got kicked out of netbsd, thus: OpenBSD.
I'm also the guy who appointed Russ Nelson the "patron saint of bike shedding." Just FYI. :-) (None of Eric Raymond, Russ Nelson or Theo de Raadt like me much.)
The first time I read 'The Cathedral and the Bazaar' I thought ESR was illustrating and contrasting the BSD vs. Linux development models. Only later did I understand that he was pointing fingers at GNU/FSF, not BSD.
[+] [-] xaa|13 years ago|reply
There exist several larger OSS projects, such as Apache, Boost, and the Linux kernel, which accept contributions but are also curated. Thus they represent a sort of hybrid between cathedral and bazaar. People who use these projects know they are getting some (varying) standard of quality.
I think these sorts of projects -- often shepherded by some kind of noncommercial Foundation or Organization -- are the best way to get a mix of openness and quality going forward.
[+] [-] davidw|13 years ago|reply
The only example that jumps out at me is the original Unix, and I think you'll agree that comparing software from 30 years ago that does vastly less than ... pretty much anything out there these days is not an entirely fair, nor useful comparison.
[+] [-] jwm|13 years ago|reply
I think people do want to make things better, but it's happening in a much more decentralized way, with small teams taking small safe steps on tools, libraries and frameworks.
Media, on-line banking, 3D printing, accessible hardware hacking, e-commerce sites, scientific/engineering software: the majority of these things are made by small teams plugging the best libraries and platforms out there together, where those already solve part of their software problem. Android, cloud, Linux, etc. It would be silly to say these were without ambition and don't contribute to the community.
(The average software dev's impact factor may be diluted by the increasing number of people working with computers, but computers are so globally useful it's inevitable. If good things are still getting made, then who cares?)
[+] [-] revorad|13 years ago|reply
[+] [-] ucee054|13 years ago|reply
In this comment I will equate "Rug Market" with "Ready, Fire, Aim" and "Cathedral" with "4 Year Plan".
Firstly, the "Rug Market" beats the "Cathedral" when you haven't formulated the problem properly, and so you have bad specs.
Secondly, the "Rug Market" beats the "Cathedral" when bad software is more profitable than good software. Google for "Worse Is Better" and "The Innovator's Dilemma".
Sad but true.
[+] [-] jng|13 years ago|reply
I've been equally disgusted by the evolution of programming in the last few years. HTML - which is, to a first approximation, always invalid. JS - which needs jQuery to make it mostly-but-not-completely cross-compatible among browsers. CSS, which hides so many hacks stacked on top of each other, and which even needs a reset to be compatible. And now - distributed systems with pieces in PHP, Python, Objective-C, Dalvik-Java, etc... sustained by awesome-but-hackish fixes like Varnish and FastCGI and Nginx, where everything seems to be put together with duct tape.
But I'm slowly getting to the next stage, and having to admit - no, actually, admitting - that if it has spread like fire, there must be something to it, no matter how much it hurts our taste.
And if you look well into it, it's very similar to biological evolution. Our own genetic codes and body plans are full of ancient pieces that are not needed any more (or at all, male nipples anyone), but it seems that it was more economical to "patch" things than to fix things properly. Or at least, it didn't hinder the current designs enough that they wouldn't succeed over the alternatives evolution probably also tried.
And now Google translates text statistically, without even trying to understand things.
One fear I have is that we will be able to create a working AI in a few years... and due to the way we do it, we may even not understand how it works.
[+] [-] lerouxb|13 years ago|reply
I find jQuery more analogous to tools like configure and autoconf, in that it isn't actually needed for "modern", standards-compliant browsers. As an example: there are already many lightweight drop-in replacements that assume a sane browser to begin with.
CSS is remarkably hack-free, and the resets are just to remove the default styling that browsers have built in. Obviously browsers need built-in styles, otherwise all the old pre-CSS pages would stop working.
And why are you calling programming languages and web servers hackish? How is nginx hackish and Apache not? How is Objective-C hackish and C++ not? Why stop inventing new languages at assembly, C, or C++?
Other than that I generally agree with the rest of your comment. I think if we ever get to some kind of technological singularity we'll almost certainly, just about by definition, not understand how it works. And true/strong AI would definitely be a singularity. Even if we understood version 1, we would likely not be able to understand whatever it dreams up 5 seconds after we overclock it. And we will - Moore's law and all that. ;)
[+] [-] heretohelp|13 years ago|reply
This comment seems to mostly be, "I'm going to complain about software disciplines and lines of CS research that I don't understand because their practical approaches don't conform to my finicky aesthetics."
If you want to formulate Strong AI to solve NLP as a problem, go for it. The rest of us are happy to use Google as it is.
[+] [-] chwahoo|13 years ago|reply
He's probably right that lots of software could be designed better, but I think he's wrong that that's of paramount importance. We need lots of software these days and we simply don't have the resources to build it to Kamp's standards. Also, we've learned that our requirements change so fast that his beautiful design would quickly be twisted into the pile of hacks that he hates.
He notes how long it takes to compile the software that runs on his work machine; how long would it take to compile the software on a Windows machine? (Or whatever he would claim is the standard-bearer of his cause.)
He also attacks open-source software. The truth is, we have an amazing amount of free and open source software available. Some of it may be flawed in design or usability, but it enables us to solve so many problems (and look at/modify the software when needed). I don't think that all software needs to be "free", but I do think free software has made us all much richer. I have a hard time seeing all this value that the Bazaar has created as inferior to slow-moving, centralized, big, up-front design development.
[+] [-] phkamp|13 years ago|reply
You seem to assume that cathedrals are "slow-moving, centralized, big up-front" designs, where did you get that idea ?
Ever looked into how USA put a man on the moon ?
Maybe you should. Also: read Brooks's book, if you can.
[+] [-] tjr|13 years ago|reply
Why don't we employ rigorous software engineering principles to, say, iOS games? It would be a waste. It could be done, but in practice, people just don't care as much if a game on their phone crashes as they do if an airplane they're riding in crashes.
There's a wide spectrum of software out there, needing varying levels of robustness. There's not a lot that can be said about software development as a whole along these lines; inadequate design for an online banking site might be excessive design for a llama costume competition voting site.
[+] [-] calinet6|13 years ago|reply
If you try too hard to design a rigid structure around anything, it can come back and bite you in the ass. Part of the reason why this seemingly disorganized and haphazard collection of buildings makes a working and thriving metropolis is because of its diversity and resilience to change and progress. If you attempt to over-engineer something of such complexity, you might just end up with Pyongyang instead of New York. Instead, we have a competing ecosystem of libraries and options with the good ones theoretically rising to the top. Better communities and communication of these values make this process work even better, just as in the greater economy. For all its complexity, this process works surprisingly well.
For all the explaining Kamp did in this article, the one thing he failed to account for was the resounding success of the current model. He did use a surprisingly apt metaphor, however. Cathedrals are no more; their old place as the center of society had to be explained by a series of myths and lies that placed ideals above reality. The thriving Bazaar easily replaced them at the center of any thriving modern city, based on the simple truth that the dynamic edge of reality was ever-changing, and that quality based on that reality would indeed be more successful.
Could it be better? Of course. But it is reality and truth that will move us forward—not mythology. Good riddance to cathedrals.
[+] [-] phkamp|13 years ago|reply
Have you never noticed their surprisingly coherent APIs and wondered why that is so different from, say, UNIX?
See my other comment about knowing what a cathedral is to begin with.
[+] [-] tedunangst|13 years ago|reply
Citation needed. :) What metric did you use to measure the efficiency of the software bazaar?
[+] [-] DanBC|13 years ago|reply
Efficient?
[+] [-] bguthrie|13 years ago|reply
Since 2001, we've started in on the XP/Agile movement, which, think what you will of it, test-drives and version-controls relentlessly, and fosters a constant dialog about what quality is and how to achieve it.
Furthermore, some marvelous tools have been written in the last decade; I can hardly see how the author can complain about version-control systems, as Git and Mercurial--both miles better than what was available in 2001--are not least among them.
Other quality-enhancing things that have happened since 2001: continuous integration, build pipelines, Selenium and friends, behavior-driven development, REST architecture triumphant over the cathedral-like SOAP and XMLRPC, JSON over XML, lightweight messaging, the adoption of small, focused open-source tools over "quality-focused" large-vendor bullshit-enterprise tools.
To the extent that UNIX sucks now, which it doesn't, it's because the hackers who work on it now have lagged the industry--not the other way around. So go find some other group of kids to shoo off your lawn.
[+] [-] phkamp|13 years ago|reply
[+] [-] luckydude|13 years ago|reply
I found the autoconf comments amusing, we have supported a pretty broad range of platforms, from arm to 390's, IRIX to SCO, as well as Linux, Windows, MacOS, and our configure script is 157 lines of shell. Autoconf had its place but I think it was much more useful in the past and now it is baggage. The fact that it is still used (abused?) as much as it is sort of speaks to phk's points.
I think the jury is still out on whether the bazaar approach is better or not. It sure is messier but it also seems to adapt faster. I worked at cathedral places like Sun, and while I continue to emulate their approach I also question whether that approach is as nimble as the bazaar approach.
I've voted for the older style of more careful development and I think it has worked pretty well for us, we can support our products in the market place and support them well. The cost of that is we move more slowly, we tend to have the "right" answer but it takes us a while to get there. Bazaar approaches tend to come up with answers of varying quality faster.
It's really not clear to me which way is better. I'd be interested in hearing from someone / some company that is supporting a lot of picky enterprise customers with some sort of infrastructure product (database, source management, maybe bug tracking) and making a success of it with a bazaar approach. Seems like it would be tough but maybe I'm missing some insight.
[+] [-] acdha|13 years ago|reply
The way to address oversights in the bazaar model isn't to cram everyone back into the cathedral but to build support for change by showing where something is clearly better.
One area where PHK might see this is the way Linux has flown past FreeBSD - not due to endlessly-debated questions of kernel superiority but rather because Linux distributions like Debian provided a clearly superior experience for the overall system by rejecting the decades of accumulated hacks (i.e. the ports system, monolithic config files, etc.) which had been the status quo for years, and building better tools to reduce management overhead.
[+] [-] hollerith|13 years ago|reply
Debian contains tons of accumulated hacks that I wish would be rejected.
[+] [-] flatline3|13 years ago|reply
See also: MySQL vs. PostgreSQL.
[+] [-] wglb|13 years ago|reply
A lot of the commentary seems to be confusing the waterfall model with the idea of the cathedral.
Another way to make this point is to note how few programmers these days read Dijkstra, or Knuth's TAOCP. (Knuth would have us believe that even he doesn't read it. See Coders At Work.) Among other things, Dijkstra taught understanding the entire program before setting down one line of code. Contrast this with TDD (which, believe it or not, some take to mean Test Driven Design).
Lately I have been in the Application Security business, and nowhere has the issue highlighted by the article been more obvious.
Edit: Mark Williams Company, inventor of Coherent OS, was not a paint company. It started out as Mark Williams Chemical Company, manufacturing Dr. Enuff, a vitamin supplement.
[+] [-] wonderzombie|13 years ago|reply
[+] [-] cs702|13 years ago|reply
But Evolution copes better than intelligent, top-down design with the evolving constraints of a market landscape that is constantly shifting.
[+] [-] rektide|13 years ago|reply
Article: This is a horribly bad idea, already much criticized back in the 1980s when it appeared, as it allows source code to pretend to be portable behind the veneer of the configure script, rather than actually having the quality of portability to begin with. It is a travesty that the configure idea survived.
Good high-minded notions here. But configure, with its standardized parameters for how to do things, is near irreplaceable at this point. Certainly a more stripped-down version, one not built on M4, would be wise, but libtool/autoconf itself is used too broadly and too familiarly by developers and upstream maintainers: in spite of so much of it being old, deprecated, no-longer-useful cruft, the best we can hope for is duplicating a wide amount of the existing functionality in a cleaner manner.
But at what cost would reimplementation come? How many weird build targets would go broken, unnoticed, for months or years?
The place where we escape these painful histories is where we leave the old systems programming languages behind. Node's npm I'd call out as a shining beacon of sanity, enabled by the best, most popular code distribution format ever conceived: source distribution, coupled with a well-defined, not-completely-totally-batshit algorithm for looking up said sources when a program runs and at runtime goes off to find its dependencies: http://nodejs.org/docs/latest/api/modules.html#modules_all_t...
[+] [-] justincormack|13 years ago|reply
A lot of the problems are due to poor integration between languages, so for example the JVM people have reimplemented almost everything, as have the C++ people.
[+] [-] yungchin|13 years ago|reply
I'm not sure I understand how the Peter Principle applies here? Autoconf seems a rational solution in the economic sense: to standardise Unix, you need to have lots of influence to effect buy-in, whereas to write autoconf, you need a big dose of hacking talent. If you have the latter and not the former...
[+] [-] engtech|13 years ago|reply
[+] [-] caf|13 years ago|reply
[+] [-] lmm|13 years ago|reply
If anyone ever managed to write a genuinely better build system, the bazaar would let it rise to the top; the gradual rise of e.g. cmake is testament to this. Trying to impose one solution top-down, e.g. the LSB standardization of RPM, has a far worse track record than letting the bazaar do its thing.
[+] [-] tedunangst|13 years ago|reply
As for doing its job well, the failure mode of configure "you can't build this" is abysmal. Just give me a fucking Makefile, I'll fix it myself. I love packages that come with Makefiles that don't work. I pop "-I/usr/local/include" into CFLAGS, run make again, and boom. Done. Trying to do the same with configure? Forget about it. --with-include-dir or whatever doesn't work, because it's really running some bogus test in the background which expects /bin/bash to exist and so on and so forth.
[+] [-] secure|13 years ago|reply
However, in the past I’ve had the opposite experience: Trying to port software such as Apache, PHP or bacula to UNIX systems such as SGI IRIX, I always ended up writing a simple Makefile to compile the software instead of putting up with the multitude of autotools-fixing that would have been required. I reported one or two clear issues upstream and they have been fixed, but until the relevant fixes arrive at the projects (especially PHP came with an old version of autotools), some time will pass.
As a counter-example, take i3-wm: it ships with a GNU Makefile (OK, multiple Makefiles in subdirectories, one for each tool) and compiles on Linux, Mac OS X, FreeBSD, OpenBSD, NetBSD. Now let’s have a look at each of these:
• NetBSD: No patches required to the makefiles: http://cvsweb.se.netbsd.org/cgi-bin/bsdweb.cgi/wip/i3/patche...
• FreeBSD: No patches required to the makefiles: http://www.freebsd.org/cgi/cvsweb.cgi/ports/x11-wm/i3/files/ (they do their usual change to /usr/local in the ports Makefile)
• OpenBSD: several patches because OpenBSD lacks SHM support, doesn’t want (?) to use pkg-config at compile-time and quite a few bugfix backports: http://www.openbsd.org/cgi-bin/cvsweb/ports/x11/i3/patches/
I would argue that porting i3-wm to another platform is easy because you can understand the Makefiles and it’s very clear what they do.
As a conclusion, I just wanted to show you that there are counter-examples to both: situations where autotools really does not do a good job and situations where you can deliver good Makefiles without using autotools at all.
[+] [-] __alexs|13 years ago|reply
The irony in all of this of course is that autoconf was created to solve the portability issues between platforms that came from the Cathedral model.
[+] [-] norswap|13 years ago|reply
[+] [-] fluidcruft|13 years ago|reply
[+] [-] zvrba|13 years ago|reply
I guess you're running Linux, right? I'd be VERY interested to see this ratio if you ran Solaris.
[+] [-] pnathan|13 years ago|reply
[+] [-] diminish|13 years ago|reply
1) Varnish does not support SSL, which complicates deployment architectures; part of the blame goes to the lack of a good SSL implementation to copy from, or with which to write an SSL proxy: https://www.varnish-cache.org/docs/trunk/phk/ssl.html I find this an example of how neither bazaar- nor cathedral-based approaches have yielded something "acceptable" to copy and paste from in the past 15 years.
2) Varnish VCL, as a DSL, looks like an obscure M4 macro language, and the functions are not well planned; another guy (from the bazaar) could design a more coherent language.
=> To my mind the culprit is the scarcity of monetization opportunities for infrastructure components, which pushes talent (candidates for "being responsible") elsewhere, to where the money is. I mean, nginx/varnish developers should earn hundreds of millions, according to their merit. People are making billions by using ruby, python, rails, sinatra, django, debian, varnish, nginx, openssl, and hundreds of libraries... but their everyday creators don't get enough reward back to feel responsible. Passion and hobby, benevolence or PR are the only drivers for most infrastructure and library development. Some developers lose their faith within a man's lifespan, fight back, or quit in full despair.
[+] [-] anvandare|13 years ago|reply
[+] [-] artagnon|13 years ago|reply
All software projects have a list of gatekeepers or maintainers, who decide which changes should go in and which ones shouldn't. Some of them have a long list of people with commit access (like Subversion), while others have a handful of maintainers (like Linux). Some of them have grand roadmaps and bug trackers (like Firefox), while others have no agenda or bug trackers (like Git). Some projects have many dependencies, use libtool and a lot of auto-magic (like Subversion), while others are still adamant about not having a configure script, maintaining a handwritten Makefile, and having 1~2 dependencies (like Git).
From what I've noticed, as a general rule of thumb, the projects with a few strict maintainers are better off than those that give away commit access easily or have lazy maintainers. It seems obvious when I put it like that.
The way I understand it, the cathedral model is about carefully planning out the project and assigning various people chunks of the total work. The bazaar model is open to accepting random contributions from a wider audience, and runs on a looser agenda. It's just that the cathedral model works better for certain kinds of software (games, office suites, professional audio/video suites), while the bazaar model works better for other kinds (web servers, version control systems, development toolchains). When it comes to an operating system, having a curator decide APIs and standards (OS X) certainly wins over a mashup of billions of packages (Debian).
engtech|13 years ago
Here's an example from the web:
blinkingled|13 years ago
The fact of the matter is that the world doesn't have a critical mass of people with agreeing tastes and fundamentals to get them to come together and build something moderately complex.
What we do have however is a world full of reasonable people who just want to build something and make it work so they can solve some problem of theirs. There are also people who will happily reuse what the folks before them built - improving it in the process.
If we were to impose a high barrier to entry, requiring people to code and design systems in the absolute best possible way, we would effectively be saying that only a few get to build software for the whole world. That'd be a net loss IMHO.
The point about bad design is worth arguing when software doesn't do what you want it to do, i.e., fails in practice. But that is mitigated by the fact that each individual stall in the bazaar does enforce some design, some testing, some sort of sanity check to ensure it at least works most of the time.
Is it a sad situation? Yes, we would be much better off with every programmer being perfect. But so long as that is impractical, the bazaar alternative works at creating tons of fixable, mostly usable, continuously improved software, so I can hardly call it a lost cause.
meta-coder|13 years ago
Software License metric: [F] Free and Open Source, [P] Proprietary/Closed source
This metric can also be modeled as a continuous variable rather than a discrete variable. But let us stick to two values for simplicity.
Development model metric: ranges from extremely [C] Cathedral-type to extremely [B] Anarchism/Bazaar-type.
FOSS proponents don't care about development model as long as it's FOSS.
Let [x][y] denote the Software License metric (x) and Development model metric (y) of a software project.
Observations:
1. [P][C] is the combination that FOSS proponents hate the most.
2. [x][B], where x ∈ {F, P}, is less peer-reviewed (anyone can commit anything), so there is less accountability/responsibility; it is highly decentralized, so there is no guarantee of quality.
2.1. [P][B] sounds like a contradiction!
2.2. [F][B] Poul-Henning Kamp seems to have problems with this kind of setups.
3. [F][y] where y -> B (i.e., closer to [B] than to [C]). Mostly the same as [F][C], except that it is mildly better.
4. [F][C] In this setup, the cathedral authority is the bottleneck in improving the project.
5. [F][y] where y -> C (i.e., closer to [C] than to [B]). This kind of setup "works like a charm!" See the overall success of GNU/Linux in the industry! There are some people who act as maintainers of Linux, but come on, you too can become one! The more popular such software is, the more thoroughly it is reviewed. "Given enough eyeballs, all bugs are shallow."
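The taxonomy above can be sketched as a tiny enumeration. This is a hypothetical illustration only; the dictionary values paraphrase the observations in the comment, and the simplification of the development-model metric to two discrete values follows the comment's own convention:

```python
from itertools import product

# Two metrics, both simplified to discrete values as in the comment above.
LICENSE = ["F", "P"]  # [F] Free and Open Source, [P] Proprietary/Closed source
MODEL = ["C", "B"]    # [C] Cathedral-type, [B] Anarchism/Bazaar-type

# Paraphrased observations keyed by (license, model).
OBSERVATIONS = {
    ("P", "C"): "the combination FOSS proponents hate the most",
    ("P", "B"): "sounds like a contradiction",
    ("F", "B"): "less peer review, no guarantee of quality",
    ("F", "C"): "the cathedral authority is the bottleneck",
}

# Enumerate all four [x][y] combinations with their observation.
for lic, model in product(LICENSE, MODEL):
    print(f"[{lic}][{model}]: {OBSERVATIONS[(lic, model)]}")
```

Treating y as continuous (observations 3 and 5) would turn `MODEL` into a position on a [C]..[B] axis rather than two labels, which is exactly the refinement the comment sets aside "for simplicity."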
daa|13 years ago
One of the bits of the piece that has me scratching my head is whether the mess that is dependency management in OSS operating systems (and OSS software distribution models generally) matters _enough_. While I very much share the author's reaction of "can't we do better?", it also feels that fixing that mess isn't so much a design exercise as a social one, because OSS isn't really one bazaar but a constellation of connected, more-or-less-cathedralish bazaars. And I don't know that this mess is deeply problematic (although the author does find some egregious issues).
Design happens at a specific scale, and some scales reward investment in design more than others. Designing a chair that can be mass-produced is more effective than designing a room, which can't. One could argue that designing a self-contained piece of software (e.g. the Python runtime) matters more than designing a deliberately open system (e.g. the ecosystem of Python libraries), and that the alternative (random competition, forking, etc.) is Good Enough.
[for carbon dating: had my first IT job in the early 80s, as a teenager]