This reminds me of the 'Make vs. X' trend some years ago, when Make would be bashed for being slow, not extensible, having weird syntax, and being incompatible between implementations. So we got alternatives like Aegis [1], SCons [2], A-A-P [3], or Jam [4] instead, which were much faster and flashier on paper.
But guess what: Make is still kicking, GNU Make has gained a bunch of new goodies (Guile scripting and loadable modules, to name a few), and all those alternatives are pretty much dead now. What I see now is that we are repeating the same mistakes, only with flashier tools (node, python3, rust).
Although I always preferred Jam [4], I'm pretty happy with GNU Make now. It's not perfect, but it does the job well, and if I ever hit some weird platform, I can always 'extend' myself to Autotools [5]. Funny thing: I'm even using Make to run Ansible scripts and compile Java/Clojure code, and it works like a charm.
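To illustrate, a sketch of the kind of Makefile this means; the target, playbook, and project names here are hypothetical, not from any real project:

```make
# Hypothetical targets: Make is just orchestrating other tools.
.PHONY: deploy uberjar

# Run an Ansible playbook as if it were any other build step.
deploy:
	ansible-playbook -i inventory.ini site.yml

# Build a Clojure uberjar with Leiningen.
uberjar:
	lein uberjar
```

Since Make only cares about targets, prerequisites, and commands, it is indifferent to whether the command is a compiler or a deployment tool.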
I suspect a large amount of the issue is sheer availability.
There's an odd proclivity among developers of build tools to assume that their favorite runtime (be it Python for SCons, and so on) is already available or trivial to install nicely. And that's just... not true.
The build tool itself is the one place in the software-building ecosystem where we have the least room for manual dependency handling and manual setup, because there's nowhere left to punt the hard part.
Make isn't better at this either, but it's been so widely packaged and is so nearly default-available (it's remarkably hard to get a working desktop system without some transitive dependency having pulled in make for you) that it de facto gets a free pass on this.
My hypothesis is that these newer build tools would have a much better shot at adoption if they were well-packaged enough that a single tarball (with zero external dependencies) or a single-line install script (think "gradlew", though even that cheats by assuming an existing JVM) could bootstrap them. Most don't seem to have invested that effort. And it shows.
I'm interpreting your comment as implying that it's somehow a waste of time to develop alternatives to an incumbent that just won't evolve. I'm not sure whether that's the point you were trying to make, but I think it's important to call out that perhaps Make improved precisely because it was challenged. And whether or not this was the case for Make, it has certainly been the case for other technologies.
Yes, agreed. I went through SCons, Jam, CMake, and a few others. I’ve made projects fully with autotools and manually maintained IDE projects. Make compares well against most of these other tools, especially if you are using GNU Make.
However, I don't use Make as much these days and tend to use one of two replacements: either a Ninja build file generated by a Python script, or Bazel. Ninja solves the problems I have writing makefiles (rebuilding when arguments change, rules with multiple outputs, creating output directories) and gets out of the way. Bazel brings a much higher level of complexity, but for complex, multi-language projects, not using Bazel no longer seems like a viable option to me.
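The generator approach can be tiny. Here's a minimal sketch of such a Python script emitting a build.ninja; the source list, flags, and output layout are invented examples. Because the generator rewrites the file, a change to the flags changes the recorded command lines, and Ninja rebuilds anything whose command changed, which is the "rebuild when args change" behavior plain Make lacks:

```python
#!/usr/bin/env python3
"""Sketch: generate build.ninja from a Python script.

The sources, flags, and out/ layout below are made-up examples."""
import os

SOURCES = ["src/main.c", "src/util.c"]

def emit_ninja(sources, out="build.ninja"):
    lines = [
        "cflags = -O2 -Wall",
        "rule cc",
        "  command = cc $cflags -c $in -o $out",
        "rule link",
        "  command = cc $in -o $out",
    ]
    objs = []
    for src in sources:
        # One object file per source, dropped into out/.
        obj = "out/" + os.path.splitext(os.path.basename(src))[0] + ".o"
        objs.append(obj)
        lines.append(f"build {obj}: cc {src}")
    lines.append("build out/app: link " + " ".join(objs))
    with open(out, "w") as f:
        f.write("\n".join(lines) + "\n")

if __name__ == "__main__":
    emit_ninja(SOURCES)
```

Run the script whenever the build description changes, then let `ninja` do the incremental work.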
I totally agree with you. Despite not having much experience with the tool, I recently used Make and M4 in a project to build Docker images for multiple architectures [1], and it worked really well.
I found the Build Systems a la Carte¹ paper, and NDM's companion write-up², to be a good comparison of build systems in a more general manner.
Caveat: it may be skewed, having been written by heavy Haskell hitters, including an author of Shake. I'll note that I didn't notice any bias, but maybe only because it aligned with mine ;)
What I've seen of Tup is good, but it doesn't add enough to convince me to switch away from make, which is already everywhere.
CMake and similar tools don't attract me because they're not language-independent. The best thing about make (and Tup too) is that it lets you express dependencies, and the ways to satisfy them, in terms of other tools.
Now, what make is not good at is being ./configure. I'd like to see a more elegant autoconf-like tool that detects things about the environment and generates config.{h,mk}.
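As a sketch of what such a tool could look like: the feature names and probes below are invented examples, not any real project's checks, but a short script suffices to generate config.h and config.mk:

```python
#!/usr/bin/env python3
"""Toy autoconf-ish probe: detect a couple of things about the
environment and emit config.h and config.mk. The feature names
and probes are invented examples."""
import shutil

def probe():
    # Each probe maps a feature-test macro to a boolean.
    return {
        "HAVE_CC": shutil.which("cc") is not None,
        "HAVE_PKG_CONFIG": shutil.which("pkg-config") is not None,
    }

def write_config(features):
    # C half: preprocessor macros for the source code.
    with open("config.h", "w") as h:
        h.write("/* generated; do not edit */\n")
        for name, present in sorted(features.items()):
            h.write(f"#define {name} 1\n" if present
                    else f"/* #undef {name} */\n")
    # Make half: variables for the Makefile to test.
    with open("config.mk", "w") as mk:
        mk.write("# generated; do not edit\n")
        for name, present in sorted(features.items()):
            mk.write(f"{name} := {1 if present else 0}\n")

if __name__ == "__main__":
    write_config(probe())
```

The Makefile then just does `include config.mk`, and the C sources `#include "config.h"`, which is the same split autoconf uses.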
CMake actually is language-independent, but it so happens that only C, C++, assembly, and Fortran are first-class citizens. Java and Ada are second class. For anything else, nobody has written the support; you might find a package that locates the compiler and dependencies, perhaps.
I’ve used tup in a previous project. I was expecting issues that I couldn’t predict before getting my hands dirty, but it was actually a very smooth transition.
For complex C and C++ projects, I can recommend tup. I've also used it for LaTeX documents.
premake¹ feels like an elegant autoconf alternative to me and can be combined with various build systems, including make. The syntax is far nicer than CMake's too, but frankly that bar isn't very high.
Sadly, it isn't used by all that many projects², which holds me back from wanting to depend on it. I have played with it in a few personal projects, though, and have been impressed.
My two favorite make alternatives made HN on two consecutive days.
I like redo because with make you need to know two languages, while with redo you need just one; the interface is simple, and you get automatic dependencies on build rules for free. Also, the implementation is much simpler than make's, and yet I haven't run into a feature I miss from make.
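For reference, a redo rule really is just a shell script; a hypothetical default.o.do might look like this (simplified, without header-dependency scanning):

```sh
# default.o.do -- hypothetical redo rule: build foo.o from foo.c.
# redo passes: $1 = target, $2 = target minus extension, $3 = temp output.
redo-ifchange "$2.c"   # declare the dependency; redo records it
cc -c "$2.c" -o "$3"   # write to the temp file; redo renames it on success
```

The "one language" point is that both the rule logic and the recipe are plain shell, and because the .do file itself is a dependency, editing a rule rebuilds its targets.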
I like tup because it is very opinionated and forces you to write your build scripts in a reasonable manner, while also being very fast and very correct. It might be possible to write a tup file that incorrectly handles dependencies, but you'd have to work hard to do so.
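A hypothetical Tupfile shows the opinionated style: every rule declares its inputs and outputs up front, and tup checks the command's actual file accesses against the declaration:

```
# Tupfile (hypothetical): compile every .c file, then link the objects.
: foreach *.c |> cc -c %f -o %o |> %B.o
: *.o |> cc %f -o %o |> app
```

If a command reads a file it didn't declare, or writes one it didn't promise, tup flags it, which is what makes accidentally-wrong dependency graphs hard to write.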
Is anyone aware of an npm library that wraps tup to provide automatic dependency-graph construction from objects changing on the filesystem?
I could use a JavaScript API that automatically detects the filesystem-change dependency graph associated with each of a given set of JavaScript function calls... perhaps with the simplifying assumption that all function parameters are serializable, or maybe even that the tracked functions are only allowed to operate on strings representing filesystem paths...
I'd love for the functionality implemented within tup, running a command and automatically doing the low-level kernel work necessary to track all the filesystem objects read and written, to be abstracted into an application-level library...
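A very rough userspace approximation of that idea, shown here in Python rather than JavaScript: wrap the language's file-open primitive and record which paths a block of code touches. Unlike tup's OS-level tracing, this misses subprocesses and anything that bypasses `open()`, so it is only a sketch of the concept:

```python
"""Crude application-level sketch of tup-style access tracing:
wrap builtins.open and record which paths a block of code reads
or writes. This misses subprocesses and os.* calls, unlike tup's
real kernel-level tracking."""
import builtins
from contextlib import contextmanager

@contextmanager
def track_file_access():
    accesses = {"read": set(), "written": set()}
    real_open = builtins.open

    def spy_open(path, mode="r", *args, **kwargs):
        # Classify the access by mode, then delegate to the real open.
        bucket = "written" if any(c in mode for c in "wax+") else "read"
        accesses[bucket].add(str(path))
        return real_open(path, mode, *args, **kwargs)

    builtins.open = spy_open
    try:
        yield accesses
    finally:
        builtins.open = real_open

# Usage: derive a task's file-level dependencies from what it touched.
with track_file_access() as acc:
    with open("input.txt", "w") as f:
        f.write("hello")
    open("input.txt").read()
# acc["written"] and acc["read"] both now contain "input.txt".
```

A real library would trace at the syscall level (as tup and strace do) so that arbitrary commands, not just in-process calls, are captured.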
That was my take as well, and I honestly don't get the downvotes.
More importantly, in large projects the bulk of the computational budget is spent actually compiling source files. It's hard to believe that picking which file needs to be compiled next takes more time than actually compiling it.
I'm all for good build systems, and I have used make quite a bit, but the problem with moving to a better build system is that makefiles are so convoluted and hard to reason about that nobody wants to be blamed for breaking the build by migrating.
I wish there were a testing system for builds: specify that updating this should change that, and check (to begin with) the timestamps of the created files.
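Such a harness could be small. This sketch (the command and file names are invented) runs a build, touches an input, re-runs the build, and asserts that exactly the expected outputs changed, using timestamps as suggested:

```python
"""Sketch of a build-regression check: run a build, snapshot output
mtimes, touch an input, re-run, and verify that exactly the expected
outputs were rebuilt. The build command is whatever your project uses
(e.g. ["make"]); nothing here is make-specific."""
import os
import subprocess
import time

def mtimes(paths):
    # Snapshot modification times (ns) of whichever outputs exist.
    return {p: os.stat(p).st_mtime_ns for p in paths if os.path.exists(p)}

def check_rebuild(build_cmd, touched_input, expected_changed, outputs):
    subprocess.run(build_cmd, check=True)      # full build
    before = mtimes(outputs)
    time.sleep(0.05)                           # keep mtimes distinguishable
    os.utime(touched_input)                    # simulate editing the input
    subprocess.run(build_cmd, check=True)      # incremental rebuild
    after = mtimes(outputs)
    changed = {p for p in outputs if before.get(p) != after.get(p)}
    assert changed == set(expected_changed), (changed, expected_changed)
```

A fuller version would also assert that *unrelated* outputs were left alone, which is exactly the over-rebuilding bug timestamps catch cheaply.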
That's just a start. Tup is much less flexible and forces certain conventions on the build system.
It is also much more rigid about introducing dependencies during the build.
dig1 | 7 years ago:
[1] http://aegis.sourceforge.net/
[2] https://scons.org/
[3] http://www.a-a-p.org/
[4] https://en.wikipedia.org/wiki/Perforce_Jam
[5] https://en.wikipedia.org/wiki/GNU_Build_System
hectorm | 7 years ago:
[1] https://github.com/hectorm/hblock-resolver/blob/master/Makef...
JNRowe | 7 years ago:
1. https://www.microsoft.com/en-us/research/publication/build-s...
2. http://neilmitchell.blogspot.com/2018/07/inside-paper-build-...
AstralStorm | 7 years ago:
It is very much possible to add other languages.
JNRowe | 7 years ago:
1. https://premake.github.io/
2. https://github.com/premake/premake-core/wiki/Who-Uses-Premak...
geezerjay | 7 years ago:
Nowadays Makefiles are largely autogenerated by the build system. How many people are actually editing makefiles by hand in non-pet projects?