I find these posts somewhat amusing. We've got people who (rightfully) question the tools they use and look for alternatives. They then discover Make and have some kind of zen Unix moment that they want to share with the world.
If what you are doing in your flavor-of-the-month build tool translates to a roughly equivalent number of lines in Make, then yes, you should probably look at using Make. But the thing is, Make is stupid, it doesn't know a lot. Sometimes that is a good thing, sometimes it is not.
I've written about this before on HN: I mostly program in C++ and when I build my stuff I want a build tool that understands things like header dependencies, optimization levels, shared libraries etc. It's a bonus if my build files are portable.
My point is that these alternative tools often strive to raise the abstraction level and the reason people use them isn't necessarily because they haven't discovered Make.
It reminds me of the jQuery cycle: use jQuery for everything -> decide that depending on frameworks is lame -> use "vanilla JS" for everything -> realize this requires polyfills and various other inconvenient, inelegant things -> either go back to using jQuery, or gain a much deeper understanding as to why everyone uses it.
Amusing indeed. (Functional) Reactive Programming [1] [2], anyone? That's the same thing we learned, during application development, to be profitable and of real use. And it seems that build systems are converging toward the same lesson, but slowly.

[1] http://en.wikipedia.org/wiki/Reactive_programming

[2] http://en.wikipedia.org/wiki/Functional_reactive_programming
This might go a bit off topic, but I have to bring it up, since 9 out of 10 make tutorials on the internet make the same horrific mistake you just did, and 11 out of 10 code bases out in the wild do as well.
In your make file example the .o files are just depending on the .cpp files, not the header files they include, the header files those included header files include and the files they include etc etc. This means nothing will be recompiled/relinked if a constant in a header file changes for example! Changed function signatures will give you cryptic linker errors with the standard solution "just try make clean first".
To solve this you can either manually update the makefile every time any file changes its set of includes, which almost defeats the purpose of having an automatic build system, or you can use automatic dependency generation by invoking your compiler with a special flag (-MMD for GCC), and suddenly make isn't as simple as you laid it out to be. In conclusion, your build tool must be aware of all the inclusion rules that your compiler (preprocessor) has, or be given that information somehow. Maybe it's better to just use something designed for your particular toolchain that can come bundled with this knowledge?
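For concreteness, here is one common shape of that idiom, a hedged sketch assuming GCC/Clang and hypothetical file names: -MMD makes the compiler emit a .d makefile fragment per object, and -include pulls those fragments in so header edits trigger rebuilds.

```make
SRCS := $(wildcard *.cpp)
OBJS := $(SRCS:.cpp=.o)
DEPS := $(OBJS:.o=.d)

hello: $(OBJS)
	g++ -o $@ $^

# -MMD writes a .d file alongside each object; -MP adds phony
# targets for headers so deleted headers don't break the build.
%.o: %.cpp
	g++ -MMD -MP -c -o $@ $<

# Pull in the generated dependency fragments (ignored if absent).
-include $(DEPS)
```

Older setups achieve the same effect by generating the dependencies into a single .depend file and including it at the bottom of the Makefile.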
Right. Make is mostly a kludge around the nonexistent module system in C and C++.
It's so bad (specifically due to the way file preprocessing works) that you need large parts of a compiler to accurately determine what the dependencies of a source file are.
This is why a decent module system should be the top priority for C++17, though it doesn't look likely so far.
[Edit: Note that these are heredoc examples showing how to create the do scripts.]
These are just shell scripts and can be extended as much as necessary.
For instance, one can create a dependency on the compiler flags with these changes:
cat <<EOF | install -m 0755 /dev/stdin cc
#!/bin/sh
g++ -c "\$@"
EOF
# sed -i 's/^\(redo-ifchange.\+\)/\1 cc/' *.do
# sed -i 's}g++ -c}./cc}' *.do
The sed calls could be combined; they are separated here for readability.

[1] https://github.com/gyepisam/redux
CXX=g++ isn't necessary either; make already knows about $(CXX) and how to link C++ programs. Also, I think you wanted .o, not o.
And compared to that Makefile, the redo scripts you list don't seem simpler at all. I've seen reasonably compelling arguments for redo, but that wasn't one.
In general, I found CMake quite usable for my needs, and quite clean. It also required less build system code than redo. CMake fits quite nicely into a (C or C++) project which consists of many binaries and libraries which can depend on each other.
redo might be simpler and more reliable, but shell isn't. And redo encourages even more work to be done in shell. Additionally, the redo version is more verbose and harder to read. While fancier tasks will make make's version look horrible relatively quickly, they won't make redo's version look any better.
A build DSL solves the problem of making your build rules and systems first-class citizens. It's not just learning a new syntax—in fact, since you're embedding into a known language, it isn't even that new—it's about getting more control. You can pass rules around, modify them and do whatever arbitrarily complex tasks you need in a natural, straightforward way using your favorite programming language. You don't have to contort yourself and bend over backwards to fit the logic you want into Make's limited and peculiar language.
Your build system is an integral part of your whole program and you want to treat it just like any other code. This means refactoring, this means modularity, this means libraries, this means no copying and pasting... All this is far easier with a system embedded in your main language than in Make. You can use your existing tooling, debuggers and frameworks to support your build system. If you're using a typed language, you can use the types to both constrain and guide your build files, making everything safer.
Using an embedded DSL integrates far better with the rest of your ecosystem than relying on Make.
Apart from making the logic of your build system easier to describe and maintain, an embedded DSL also makes arbitrary meta-tasks easier. You might want to monitor random parts of your build process, report to different services, connect to different front-ends (an IRC bot, a CI system...) and maybe even intelligently plug into the features of your main language. Wouldn't it be great to have a make system that's deeply aware of how your server is configured, how your type system works, what your compile-time metaprogramming is doing and so on?
You could just glue together a bunch of disparate scripts with a Make file. Or you could use a DSL and call these services through well-defined, maybe even typed interfaces! No need for serializing and deserializing: you can keep everything inside your system.
Sure, if you're just going to use your DSL as a different syntax for Make, you're not gaining much. But it allows you to do far more in a far better way, while fitting in more naturally with the rest of your code. I'm definitely all for it!
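As a sketch of what "rules as first-class values" can buy you (all names here are hypothetical; Ruby is used only for illustration):

```ruby
# A build rule as a first-class value: it can be stored, passed around,
# and rewritten like any other object.
Rule = Struct.new(:target, :deps, :command)

compile = Rule.new('hello.o', ['hello.cpp'],
                   ->(t, d) { "g++ -c -o #{t} #{d.first}" })

# Meta-tasks become ordinary code: derive a debug variant from an
# existing rule instead of copy-pasting it.
debug = Rule.new("debug/#{compile.target}", compile.deps,
                 ->(t, d) { "g++ -g -c -o #{t} #{d.first}" })

puts compile.command.call(compile.target, compile.deps)
# g++ -c -o hello.o hello.cpp
puts debug.command.call(debug.target, debug.deps)
# g++ -g -c -o debug/hello.o hello.cpp
```

The point is not this particular structure but that rules are ordinary values your language's tooling, types and libraries can operate on.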
Perhaps I'm just jaded but what you describe sounds a lot more complicated for most people than just writing a Makefile. Perhaps even the vast majority of people. I think you underestimate how far you can go with just a vanilla make build system.
Can you show an example of what you are describing? It doesn't sound interesting for the tasks I have in mind, so it must be the case that you are dealing with very complex tasks.
I think everyone goes through a phase where they try to find the perfect build tool, and then at least entertain the idea of writing one themselves.
Eventually, you grow out of it. There are a lot of build tools, and each is better at some things than others. It's not that much grunt work to convert things from one to another (even for very large projects). If your build tool is working for you, leave it alone. If it's getting in your way or slowing things down, try another one. Move on.
I think one reason is because Make is built with Shell, which is always one step (and one letter) away from hell.
For example:
clean:
	rm -rf *o hello
Did you really mean to erase all files and directories that end in "o"? Let's say it's just a typo and fix it: "*.o".
Now, are you sure it'll handle files with spaces in the name? What about dashes, brackets and asterisks? Accents? What if a directory ends in .o? Hidden files?
This specific case may support all of the above. But if it doesn't, what will happen? How long until you notice, and how long still to diagnose the problem?
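For what it's worth, a slightly more defensive sketch of that rule (same hypothetical "hello" binary): "--" stops option parsing so names starting with a dash aren't treated as flags, and dropping "-r" means a directory that happens to match is reported rather than silently erased.

```make
clean:
	rm -f -- *.o hello
```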
Just like I prefer static strong typing when developing non-trivial projects, the build system should be more structured. I agree endless reinventing is tiring, but it may have some credit in this case.
I'd expect all developers using make to know about this and never have this problem, thanks to one simple rule: stick with sensible names (no spaces, no brackets, no stars or other special characters in the name - hello underscores!).
It's an easy rule.
> Just like I prefer static strong typing...
You probably don't use any special chars or spaces for identifiers in whatever the language you're programming in. This is just applying a similar rule to the files of your project.
For me, the only justification for using a language-specific build tool (e.g. grunt, rake, paver, ...) is when you actually want to exchange data with a library / program written in that language. On the other hand, you could probably accomplish the same effect using environment variables, with the upside of having a cleaner interface.
For those that are curious which build tools exist for Python, here's an (incomplete) list:
* pyinvoke (https://github.com/pyinvoke) - claims to be the successor of fabric, pretty solid and well-maintained
* fabric (http://www.fabfile.org/) - not actually a build tool but often used as one

* paver (http://paver.github.io/paver/) - no longer actively maintained

* doit (http://pydoit.org/) - one of the few tools that actually support monitoring file states (like the original make)

* distutils (https://docs.python.org/2/distutils/) - not actually a "universal" build tool but intended to distribute Python packages
* buildout (http://www.buildout.org/) - documentation can be challenging to find, and it isn't the most actively developed project in the world, but what it does, it does pretty well (including supporting more than Python dependencies)
the last 10 years in build tools have felt like one step forward, two steps back. i like being able to write tasks in any language other than Makefile. however, it seems like many of the new popular options (cake, grunt, etc.) don't do what, to me, is Make's real purpose: resolve dependencies and only rebuild what's necessary. new task runners have either eliminated or pigeonholed the (typically one-to-one in makeland) correspondence between tasks and files, meaning the build system can't be intelligent about which tasks to run and which to skip.
computers are fast enough that this doesn't often bother me anymore, but i've run across some huge Rakefiles that could benefit from a rewrite in Make.
> however, it seems like many of the new popular options (cake, grunt, etc.) don't do what, to me, is Make's real purpose: resolve dependencies and only rebuild what's necessary.
You might like tup[1]. Its killer feature is that it automatically determines file-based dependencies by tracking reads and writes (using a FUSE filesystem). It has an extreme emphasis on correct, repeatable builds, and is very fast. Other stuff:
- does work in parallel, and will let you know if your build isn't parallel safe. (note it is NOT relying on your specification of dependencies: even if you manually specify dependencies, it will tell you if something's wrong based on what it actually observes your dependencies to be)
- tracks changes to the build script and reruns if the commands change.
- cleans up old output files automatically if build rules are removed.
- lets you maintain multiple build variants (say for different architectures, configurations, etc)
- autogenerates .gitignore files for your build output
- very easy to get started, and "Just Works".
- for advanced usage, it is scriptable in Lua.
I've tried every build system out there. For Unix-y file-based build tasks, tup is, by far, the best. I don't know why it isn't more well known.

[1]: http://gittup.org/tup/index.html
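For the curious, a tiny Tupfile sketch of a compile-and-link pipeline (file names hypothetical); each rule reads ": inputs |> command |> outputs", and tup verifies the declared inputs and outputs against what the commands actually read and write.

```
: foreach *.cpp |> g++ -c %f -o %o |> %B.o
: *.o |> g++ %f -o %o |> hello
```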
I agree completely, and I think the blame rests with Ant and Java. Java's dependency management was painful enough to deal with in 'make' that Ant was built to support building Java projects. But in doing so the authors threw away the explicit file dependencies that made 'make' so powerful in the first place. Instead it got people to think in terms of a graph of 'tasks', each of which could either figure out its own dependency management, or more commonly ignore them completely. Most tools that followed seem to have gone down the 'graph of tasks' avenue, with 'graph of file dependencies' as an additional mechanism if you're lucky.
The huge Rakefiles you've seen could possibly have simply benefited from a rewrite in Rake. Rake has 'file' tasks which implement the file dependencies of 'make' but for some reason most users of Rake seem to ignore them completely.
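For anyone who hasn't used them, a minimal sketch of a Rake file task (file names hypothetical); as with a make rule, the body runs only when the target is missing or older than a prerequisite:

```ruby
require 'rake'
include Rake::DSL

# 'hello.o' is rebuilt only if 'hello.cpp' changed since the last build.
file 'hello.o' => ['hello.cpp'] do |t|
  sh "g++ -c -o #{t.name} #{t.prerequisites.first}"
end

task default: 'hello.o'
```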
It can be worse than that: one build system I looked at built everything every time. Why? Because "Computers are fast enough that trying to figure out exactly what needs to be rebuilt is an anachronism; this can rebuild everything in the time it took that crufty old system to figure out what it actually had to build."
I've given up trying to educate folks, I just make a note to check in with them, 6 months to a year later, to see if they are still building everything.
Another vote for higher-level meta-build-systems like cmake, premake or scons (I'm using cmake because it has very simple cross-compilation support). My personal road to build-system nirvana looked like this, and I'm sure it is fairly common:
- Started using hand-written Makefiles and autoconf. Then someone wants to build on Windows, in Visual Studio nonetheless. Add manually created VStudio project files to the project. Then someone wants to use Xcode, so add manually created Xcode project files. Now you add files, or even need to change a compiler option. Fix the options in the Makefile, open the VisualStudio project, fix the options there, open the project in Xcode, fix the options there. Depending on the project complexity, this can take hours. The next guy needs to build the project in an older VisualStudio version, but the project files are not backward compatible...
- The next step was to create my own "meta-build-system" in TCL (this was around 1999), which took a simple description of the project (what files to compile into what targets, and the dependencies between targets) and created Makefiles, VStudio files and Xcode files. This worked fine until the target project file formats changed (which happens with every new VisualStudio version).
- Someone then pointed me to cmake which does exactly that but much better (creates Makefiles, VStudio-, Xcode-projects, etc... from a generic description of the sources, targets and their dependencies), and I'm a fairly happy cmake user since then.
- Recently I started to wrap different cmake configuration (combinations of target platforms, build tools/IDE to use, and compile config (Release, Debug, etc...)) under a single python frontend script, since there can be dozens of those cmake configs for one project (target platforms: iOS, Android, OSX, Linux, Windows, emscripten, Google Native Client; build tools: make, ninja, Xcode, VStudio, Eclipse; compile configs: Debug, Release). But the frontend python script only calls cmake with the right options, nothing complicated.
Of course now I'm sorta locked-in to cmake, and setting up a cmake-based build-system can be complex and challenging as well, but the result is an easy to maintain cross-platform build system which also supports IDEs.
In general I'm having a lot fewer problems compiling cmake-based projects on my OSX and Windows machines than autoconf+Makefile-based projects.
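For readers who haven't seen it, a minimal CMakeLists.txt sketch (project layout hypothetical); from this one description cmake generates Makefiles, Visual Studio solutions, Xcode projects, ninja files, etc.:

```cmake
cmake_minimum_required(VERSION 3.10)
project(hello CXX)

# A static library target and its public include path.
add_library(util STATIC src/util.cpp)
target_include_directories(util PUBLIC inc)

# An executable that links against the library.
add_executable(hello src/main.cpp)
target_link_libraries(hello PRIVATE util)
```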
I agree with this. As a cross platform (ie, Windows + OSX + Linux) person who enjoys Visual Studio (XCode less so) I need more than make.
My own experience is with gyp and ninja (http://martine.github.io/ninja/), which the Chromium team uses to build for Windows, OSX, Linux, Android (and maybe iOS?).
Of course for personal projects I'll probably never notice the speed difference but for bigger ones Ninja is FAST.
I've recently been playing with ninja, which does a good job of not being 'just another make' http://martine.github.io/ninja/. To quote their website, "Where other build systems are high-level languages Ninja aims to be an assembler.". It's used as a backend for GYP (Google Chromium) and is supported by CMake as well. I've had good success generating the files manually using something like ninja_syntax.py: https://github.com/martine/ninja/blob/master/misc/ninja_synt....
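To illustrate the "assembler" level ninja works at, a hand-written build.ninja sketch (rule and file names hypothetical); generators like GYP or CMake normally emit this for you:

```ninja
rule cxx
  command = g++ -MMD -MF $out.d -c $in -o $out
  depfile = $out.d

rule link
  command = g++ $in -o $out

build hello.o: cxx hello.cpp
build hello: link hello.o
```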
I'm 52 years old. I've had this discussion with dmr, srk, maybe with wnj.
All I know is for years, decades, I carried around the source to some simplistic make. I hate GNU make, I hate some of the unix makes. I loved the simple make.
The beauty of make is it just spelled out what you needed to do. Every darn time make tried to get clever it just made it worse. It seemed like it would be better and then it was not.
Make is the ultimate less is more. Just use it and be happy.
Dunno if the owner of the site will read this, but here's a tip. Don't show a full screen overlay telling me how my visit would be better with cookies enabled.
1) I have cookies enabled.
2) The European law is daft, but since you feel you must comply, do it in a more user-friendly way.
I know. I don't agree with it much either and I found this least intrusive (wasn't aware of issue on mobile). If I can find a better solution, will change.
Hmm, maybe the overlay changed since you posted (11 minutes ago), but all I see is a sticky-slim footer with the cookie mention.
I've always liked how rockpapershotgun.com does it...it also uses a sticky-slim footer, but the text reads: "Rock Paper Shotgun uses cookies. For some reason we are now obliged to notify you of this fact. Not that you care"
Nothing against make, but I've found that it feels really nice when the majority of your toolset uses the same language. This is what I liked about Rails. Rails is ruby. Bundler is ruby. Rake is ruby. It's all ruby, which allows for a certain synergy, streamlined feel, and less cognitive overhead. I don't blame the js folks for attempting something similar.
Agreed. Mixing languages is fine when necessary, but a single language is usually preferable to me. My dev team has been using Grunt in projects thus far and have been pleased with Gulp in small experiments.
Misses the fundamental point that Make is broken for so many things. To begin with you have to have a single target for each file produced. Generating all the targets to get around this is a nightmare that results in unreadable debug messages and horribly unpredictable call paths.
Nix tried to solve much of this, but I agree it can't compete with the bazillion other options.
It does not miss it, it just ignores it. The author states that there are lots of things we can improve, but the point is that we have too many variations on the theme without converging on a solution that has few (or no) dependencies, comes with built-in build knowledge, and can discover what you want rather than make you declare it.
Such a tool should be:
- Zero (or few) dependencies. Likely written in plain C (or C++, D, Rust) and compiled to distribute in binary form.
- Cross-platform
- Support any mix of project languages and build tasks.
- Recognizes standard folder hierarchies for popular projects.
- Easy enough to learn. Not overly verbose (looking at you, XML). Similar to Make if possible.
Examples of the auto-discovery: It can find "src", "inc", and "lib" directories then look inside and see .h files then make some educated guesses to build the dependency tree of header and source files (even with mix of C and C++). Or it could see a Rails app and figure out to invoke the right Rake commands, perhaps checking for the presence of an asset pipeline etc. Or a Node.js project. It could check for GIT or SVN and make sure any sub-modules have been checked out.
> To begin with you have to have a single target for each file produced.
Try this next time (only the pertinent lines are included):
SOURCES = $(wildcard $(SRCDIR)/*.erl)
OBJECTS = $(addprefix $(OBJDIR)/, $(notdir $(SOURCES:.erl=.beam)))
DEPS = $(addprefix $(DEPDIR)/, $(notdir $(SOURCES:.erl=.Pbeam))) $(addprefix $(DEPDIR)/, $(notdir $(TEMPLATES:.dtl=.Pbeam)))

-include $(DEPS)

# define a pattern rule for .erl -> .beam
$(OBJDIR)/%.beam: $(SRCDIR)/%.erl | $(OBJDIR)
	$(ERLC) $(ERLCFLAGS) -o $(OBJDIR) $<

# see this: http://www.gnu.org/software/make/manual/html_node/Pattern-Match.html
$(DEPDIR)/%.Pbeam: $(SRCDIR)/%.erl | $(DEPDIR)
	$(ERLC) -MF $@ -MT $(OBJDIR)/$*.beam $(ERLCFLAGS) $<

# the | pipe defines an order-only prerequisite, meaning that the
# $(OBJDIR) target must exist (rather than be more recent)
# in order to build the current target
$(OBJECTS): | $(OBJDIR)

$(OBJDIR):
	test -d $(OBJDIR) || mkdir $(OBJDIR)

$(DEPDIR):
	test -d $(DEPDIR) || mkdir $(DEPDIR)
I've been using a makefile about 40 lines long and I've never needed to update it as I've added source files. The same makefile (with minor tweaks) works across Erlang, C++, ErlyDTL and other compile-time templates and what have you. It also does automagic dependencies very nicely.
> Generating all the targets to get around this is a nightmare that results in unreadable debug messages and horribly unpredictable call paths.
If you think of Makefiles as a series of call paths, you're going to have a bad time. It's a dependency graph. You define rules for going from one node to the next and let Make figure out how to walk the graph.
Could you post an example of what you mean by the single target/file limitation? As stated I can't tell how implicit rules or a rule to build an entire directory wouldn't be a solution, but maybe I'm not understanding the problem.
I think the main problem with these articles is that the examples given are exceedingly simplistic, and hence in no way represent real-world build systems. It's very easy to make a build system look nice and clean for trivial examples; where it breaks down is when the software it builds gets more complicated and the accumulating hacks and extra code turn the build system into a big mess.
I've been thinking a lot about build systems lately. I enjoy the discussion that this post has provoked. The post itself is weaker than it could have been, in that it does not stick to a single example when comparing build tools, and does not pin down any criteria for distinguishing between build tools.
If you are interested in a comparison of a few interesting build tools, please check out Neil Mitchell's "build system shootout" : https://github.com/ndmitchell/build-shootout . Neil is the author of the `Shake` build system. The shootout compares `Make`, `Ninja`, `Shake`, `tup` and `fabricate`.
One big advantage of vanilla Make is the community. There are some very nice tools that work well with make (such as https://github.com/mbostock/smash).
At least we are moving in the direction of Grunt/Gulp rather than in a maven sort of direction. Many lives were lost to maven, somewhat of a Vietnam of build tools. You might think you are a Java developer with it, but truly you are a maven servant.
This post rather misses that while Make is simple, making Make do all the things we're used to (e.g. Java dependency management) is not as simple.
I'd like to think people have decided that it's easier to replicate the task part of Makefiles in their own environment than to make dependency management and various other language-specific tasks available to make.
[1]: http://gittup.org/tup/index.html
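The rule syntax is also pleasantly terse. A minimal Tupfile sketch (compiler and flags are illustrative): each rule maps inputs to outputs, and tup verifies the declared graph against the file accesses it actually observes.

```
# compile every .c file to a .o, then link them into 'hello'
# (%f = inputs, %o = outputs, %B = basename without extension)
: foreach *.c |> gcc -Wall -c %f -o %o |> %B.o
: *.o |> gcc %f -o %o |> hello
```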
[+] [-] mhw|12 years ago|reply
The huge Rakefiles you've seen might simply have benefited from a rewrite in Rake. Rake has 'file' tasks which implement the file dependencies of 'make', but for some reason most users of Rake seem to ignore them completely.
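A minimal Rakefile sketch of those 'file' tasks (paths and compiler are illustrative): like make, each task only runs when its target is older than its prerequisites.

```ruby
# rebuild app.o only when app.c or app.h is newer than it
file 'app.o' => ['app.c', 'app.h'] do |t|
  sh "cc -c -o #{t.name} app.c"
end

# relink only when app.o changed
file 'app' => ['app.o'] do |t|
  sh "cc -o #{t.name} #{t.prerequisites.join(' ')}"
end

task default: 'app'
```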
[+] [-] ChuckMcM|12 years ago|reply
I've given up trying to educate folks, I just make a note to check in with them, 6 months to a year later, to see if they are still building everything.
[+] [-] retrogradeorbit|12 years ago|reply
[+] [-] flohofwoe|12 years ago|reply
- Started using hand-written Makefiles and autoconf. Then someone wants to build on Windows, in Visual Studio no less. Add manually created VStudio project files to the project. Then someone wants to use Xcode, so add manually created Xcode project files. Now you add files, or even just need to change a compiler option: fix the options in the Makefile, open the VisualStudio project, fix the options there, open the project in Xcode, fix the options there. Depending on the project complexity, this can take hours. The next guy needs to build the project in an older VisualStudio version, but the project files are not backward compatible...
- Next step was to create my own "meta-build-system" in TCL (this was around 1999), which takes a simple description of the project (what files to compile into what targets, and the dependencies between targets) and creates Makefiles, VStudio files and Xcode files. This worked fine until the target project file formats changed (which happens with every new VisualStudio version).
- Someone then pointed me to cmake which does exactly that but much better (creates Makefiles, VStudio-, Xcode-projects, etc... from a generic description of the sources, targets and their dependencies), and I'm a fairly happy cmake user since then.
- Recently I started to wrap different cmake configurations (combinations of target platforms, build tools/IDE to use, and compile config (Release, Debug, etc...)) under a single Python frontend script, since there can be dozens of those cmake configs for one project (target platforms: iOS, Android, OSX, Linux, Windows, emscripten, Google Native Client; build tools: make, ninja, Xcode, VStudio, Eclipse; compile configs: Debug, Release). But the frontend Python script only calls cmake with the right options, nothing complicated.
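A hypothetical sketch of that frontend idea (the config names, generators, and flags are illustrative, not the author's actual script): map a named configuration to the matching cmake invocation.

```python
import subprocess

# one entry per (platform, build tool, compile config) combination;
# the table below is a made-up sample, not an exhaustive list
CONFIGS = {
    "linux-make-release": ["-G", "Unix Makefiles", "-DCMAKE_BUILD_TYPE=Release"],
    "linux-make-debug":   ["-G", "Unix Makefiles", "-DCMAKE_BUILD_TYPE=Debug"],
    "osx-xcode-release":  ["-G", "Xcode", "-DCMAKE_BUILD_TYPE=Release"],
}

def cmake_command(config, source_dir=".."):
    """Build the argument list for one named cmake configuration."""
    return ["cmake"] + CONFIGS[config] + [source_dir]

def configure(config):
    """Run cmake with the options for the chosen configuration."""
    subprocess.check_call(cmake_command(config))
```

The frontend stays a thin dispatcher: all the real build knowledge lives in the CMakeLists files.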
Of course now I'm sorta locked-in to cmake, and setting up a cmake-based build-system can be complex and challenging as well, but the result is an easy to maintain cross-platform build system which also supports IDEs.
In general I'm having a lot fewer problems compiling cmake-based projects on my OSX and Windows machines than autoconf+Makefile-based projects.
[edit: formatting]
[+] [-] greggman|12 years ago|reply
My own experience is with gyp and ninja (http://martine.github.io/ninja/), which the Chromium team uses to build for Windows, OSX, Linux, Android (and maybe iOS?).
Of course for personal projects I'll probably never notice the speed difference but for bigger ones Ninja is FAST.
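Part of why Ninja is fast is that its input format has no built-in rules or search logic, just an explicit graph, usually generated by a tool like gyp. A minimal hand-written build.ninja sketch (compiler and flags are illustrative):

```
# a rule is a command template; a build line is one graph edge
rule cc
  command = gcc -MMD -MF $out.d -c $in -o $out
  depfile = $out.d
  deps = gcc

rule link
  command = gcc $in -o $out

build hello.o: cc hello.c
build hello: link hello.o
```

The `depfile` line feeds the compiler-generated header dependencies back into the graph, so header edits trigger rebuilds.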
[+] [-] asb|12 years ago|reply
I also note that Google is working on a successor to GYP, GN, which targets Ninja: http://code.google.com/p/chromium/wiki/gn.
[+] [-] evmar|12 years ago|reply
[+] [-] luckydude|12 years ago|reply
All I know is for years, decades, I carried around the source to some simplistic make. I hate GNU make, I hate some of the unix makes. I loved the simple make.
The beauty of make is it just spelled out what you needed to do. Every darn time make tried to get clever it just made it worse. It seemed like it would be better and then it was not.
Make is the ultimate less is more. Just use it and be happy.
[+] [-] geuis|12 years ago|reply
1) I have cookies enabled. 2) The European law is daft, but since you feel you must comply, do it in a more user-friendly way.
[+] [-] hhariri|12 years ago|reply
Thanks.
[+] [-] webjprgm|12 years ago|reply
[+] [-] danso|12 years ago|reply
I've always liked how rockpapershotgun.com does it...it also uses a sticky-slim footer, but the text reads: "Rock Paper Shotgun uses cookies. For some reason we are now obliged to notify you of this fact. Not that you care"
[+] [-] msluyter|12 years ago|reply
[+] [-] Cyranix|12 years ago|reply
[+] [-] apples2apples|12 years ago|reply
Nix tried to solve much of this, but I agree it can't compete with the bazillion other options.
[+] [-] webjprgm|12 years ago|reply
Such a tool should be:
- Zero (or few) dependencies. Likely written in plain C (or C++, D, Rust) and compiled to distribute in binary form.
- Cross-platform.
- Support any mix of project languages and build tasks.
- Recognize standard folder hierarchies for popular projects.
- Easy enough to learn. Not overly verbose (looking at you, XML). Similar to Make if possible.
Examples of the auto-discovery: It can find "src", "inc", and "lib" directories then look inside and see .h files then make some educated guesses to build the dependency tree of header and source files (even with mix of C and C++). Or it could see a Rails app and figure out to invoke the right Rake commands, perhaps checking for the presence of an asset pipeline etc. Or a Node.js project. It could check for GIT or SVN and make sure any sub-modules have been checked out.
[+] [-] hooya|12 years ago|reply
Try this next time (only the pertinent lines are included):
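A minimal sketch of what such pertinent lines can look like for a flat C++ tree (flags and layout are assumptions; the exact rules may differ): `wildcard` picks up new sources automatically, and the compiler's `-MMD -MP` output supplies the header dependencies.

```make
SRCS := $(wildcard *.cpp)
OBJS := $(SRCS:.cpp=.o)
DEPS := $(OBJS:.o=.d)

# make the compiler emit a .d dependency fragment per object file
CXXFLAGS += -MMD -MP

app: $(OBJS)
	$(CXX) $(CXXFLAGS) -o $@ $^

# pull in the generated header dependencies (silently if absent)
-include $(DEPS)

clean:
	rm -f app $(OBJS) $(DEPS)
```

The implicit `%.o: %.cpp` rule does the compiling, so adding a source file requires no Makefile edit, and a changed header rebuilds exactly the objects that include it.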
I've been using a makefile about 40 lines long and I've never needed to update the makefile as I've added source files. Same makefile (with minor tweaks) works across Erlang, C++, ErlyDTL and other compile-time templates and what have you. Also does automagic dependencies very nicely.
> Generating all the targets to get around this is a nightmare that results in unreadable debug messages and horribly unpredictable call paths.
If you think of Makefiles as a series of call paths, you're going to have a bad time. It's a dependency graph. You define rules for going from one node to the next and let Make figure out how to walk the graph.
[+] [-] exogen|12 years ago|reply
[+] [-] daemin|12 years ago|reply
[+] [-] shoo|12 years ago|reply
If you are interested in a comparison of a few interesting build tools, please check out Neil Mitchell's "build system shootout" : https://github.com/ndmitchell/build-shootout . Neil is the author of the `Shake` build system. The shootout compares `Make`, `Ninja`, `Shake`, `tup` and `fabricate`.
Another possibly interesting build tool is `buck`, although it is primarily aimed at java / android development. See http://facebook.github.io/buck/ . There's a little discussion about `gerrit`'s move to `buck` here: http://www.infoq.com/news/2013/10/gerrit-buck .
Here are some questions I'd ask of a build system:
- is it mature?
- which platforms does it support?
- which language ecosystems does it support? (language-agnostic? C/C++? ruby? python? java?)
- does it support parallel builds?
- does it support incremental builds?
- are incremental builds accurate?
- is it primarily file-based?
- how does it decide when build targets are up-to-date, if at all? (e.g. timestamps, md5 hash of content, notification from the operating system)
- does it allow build scripts for different components to be defined across multiple files and handled during the same build?
- does it enforce a particular structure upon your build scripts that makes them more maintainable?
- how does it automatically discover dependencies, if at all? (e.g. parsing source files, asking the compiler, builds instrumented via FUSE/strace)
- how easy is it to debug?
- is it possible to extend in a full-featured programming language?
- does it let you augment the build dependency graph mid-way through execution of a build?
- how simply can it be used with other tools such as your chosen continuous integration server, test framework(s), build artifact caches, etc?
Many of these criteria are completely overkill for trivial build tasks, where you don't really need anything fancy.
[+] [-] sheetjs|12 years ago|reply
[+] [-] ztratar|12 years ago|reply
What's special about the Make community as opposed to the Grunt or Gulp communities?
[+] [-] drawkbox|12 years ago|reply
[+] [-] Xorlev|12 years ago|reply
I'd like to think people have decided it's easier to replicate the task-running part of Makefiles in their own environment than to make dependency management and various other language-specific tasks available to Make.
[+] [-] Zelphyr|12 years ago|reply