Misses the fundamental point that Make is broken for so many things. To begin with you have to have a single target for each file produced. Generating all the targets to get around this is a nightmare that results in unreadable debug messages and horribly unpredictable call paths.
nix tried to solve much of this, but I agree it can't compete with the bazillion other options.
It does not miss it, it just ignores it. The author states that there are lots of things we can improve, but the point is that we have too many variations on the theme without converging on a solution that has few (or no) dependencies, comes with built-in build knowledge, and can discover what you want rather than making you declare it.
Such a tool should be:
- Zero (or few) dependencies. Likely written in plain C (or C++, D, Rust) and distributed in binary form.
- Cross-platform.
- Supports any mix of project languages and build tasks.
- Recognizes standard folder hierarchies for popular projects.
- Easy enough to learn. Not overly verbose (looking at you, XML). Similar to Make if possible.
Examples of auto-discovery: it could find "src", "inc", and "lib" directories, look inside, see .h files, and make some educated guesses to build the dependency tree of header and source files (even with a mix of C and C++). Or it could see a Rails app and figure out to invoke the right Rake commands, perhaps checking for the presence of an asset pipeline, etc. Or a Node.js project. It could check for Git or SVN and make sure any submodules have been checked out.
The dependencies thing is a killer. I remember a Windows developer co-worker insisting that everyone had the .NET runtime installed, and after shipping it turned out that most of our customers didn't have it installed, to which he finally said, "well, I always have it installed." (To be fair, I should have pressed him harder, and I did ask the question twice, but because I'd never built against the runtime I was unprepared for any challenge.)
Almost every new project I download starts with a sad, manual, and demoralizing installation of a bunch of third-party stuff that you have to google to find out what's missing. And it's not educational at all, because in a few years all these tools will be obsolete.
(The best project I ever encountered was the Stripe CTF, which almost always used just one command to install a complete working copy of everything you needed and didn't have. I'm still impressed with that.)
Some of these requirements should be built into any build tool. However, most can be added easily enough:
For instance, redux [https://github.com/gyepisam/redux] is written in Go (not compiled for binary distribution, but I could add that), is cross-platform, supports any mix of languages and tasks, and is very easy to learn.
It uses shell scripts to create targets so everything is scriptable.
Stuff like recognizing standard folder hierarchies and auto-discovery can be added with small scripts or tools.
It can be as simple as you want or as complex as you need.
> To begin with you have to have a single target for each file produced.
Try this next time (only the pertinent lines are included):
SOURCES = $(wildcard $(SRCDIR)/*.erl)
OBJECTS = $(addprefix $(OBJDIR)/, $(notdir $(SOURCES:.erl=.beam)))
DEPS = $(addprefix $(DEPDIR)/, $(notdir $(SOURCES:.erl=.Pbeam))) $(addprefix $(DEPDIR)/, $(notdir $(TEMPLATES:.dtl=.Pbeam)))

-include $(DEPS)

# define a pattern rule for .erl -> .beam
# (recipe lines must start with a TAB)
$(OBJDIR)/%.beam: $(SRCDIR)/%.erl | $(OBJDIR)
	$(ERLC) $(ERLCFLAGS) -o $(OBJDIR) $<

# see: http://www.gnu.org/software/make/manual/html_node/Pattern-Match.html
$(DEPDIR)/%.Pbeam: $(SRCDIR)/%.erl | $(DEPDIR)
	$(ERLC) -MF $@ -MT $(OBJDIR)/$*.beam $(ERLCFLAGS) $<

# the | pipe operator defines an order-only prerequisite, meaning
# that the $(OBJDIR) target need only exist (rather than be more
# recent) for the current target to build
$(OBJECTS): | $(OBJDIR)

$(OBJDIR):
	test -d $(OBJDIR) || mkdir $(OBJDIR)

$(DEPDIR):
	test -d $(DEPDIR) || mkdir $(DEPDIR)
I've been using a makefile about 40 lines long, and I've never needed to update it as I've added source files. The same makefile (with minor tweaks) works across Erlang, C++, ErlyDTL and other compile-time templates, and what have you. It also does automagic dependencies very nicely.
> Generating all the targets to get around this is a nightmare that results in unreadable debug messages and horribly unpredictable call paths.
If you think of Makefiles as a series of call paths, you're going to have a bad time. It's a dependency graph. You define rules for going from one node to the next and let Make figure out how to walk the graph.
Could you post an example of what you mean by the single target/file limitation? As stated, I can't tell why implicit rules, or a rule to build an entire directory, wouldn't be a solution, but maybe I'm not understanding the problem.
Sure, consider a compiler that produces an object file (foo.o) and an annotation file (foo.a). Now if a target requires both foo.o and foo.a, you have to create two targets for them (even though it's really one command).
You can use implicit rules, which requires a very verbose makefile; that's what automake and other make-generation tools do. God help you figure out what went wrong.
If you make people go to a directory-based approach, you've now imposed a new structure on their code. One reason for the multitude of packages is that each one matches its target community better.
The third rule simulates a compiler producing two outputs. Now if foo.o changes, both "copied" and "o" will be updated, and if foo.a changes, both "copied" and "a" will be updated. (And if either foo.o or foo.a is deleted, the compiler will be rerun, as will everything depending on foo.a or foo.o.)
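The makefile being described isn't quoted in this thread, but a minimal reconstruction of the idea might look like the following (the source name foo.src, the "compile" command, and the exact recipes are assumptions, not the original):

```make
# aggregate target: ask for "copied" to get both derived files
copied: o a
	touch copied

# downstream targets derived from each compiler output
o: foo.o
	cp foo.o o

a: foo.a
	cp foo.a a

# third rule: simulates a compiler that emits two outputs in one run
foo.o foo.a: foo.src
	compile foo.src    # assumed command; writes both foo.o and foo.a
```

One caveat with this sketch: an explicit rule listing two targets is treated by make as two rules sharing a recipe, so the recipe can run once per out-of-date target. GNU make 4.3 added grouped targets (`foo.o foo.a &: foo.src`) to express "one recipe, both outputs" directly.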
If both the .o and the .a are created from another file, wouldn't it be safe to just rely on either one of them? (Obviously, you will need to be consistent in choice.)
That is, if every time a .o is created, so is the .a, then where is the difficulty? Just rely on one (the .o). I could conceive of a scenario where the .a updated but the .o didn't, but I don't know of any tools that really work that way right now. I thought the norm was to at least touch all output files.
Further, if that is happening, seems you are safest having two rules, anyway.
Say you have a long build process and do a quick semi-clean by hand to speed up the next build (not the best idea, but not inconceivable), deleting the .a files but forgetting to delete the matching .o files. Then your next build will produce some novel (to you) error messages that may take a long time to clean up. Worse, the command building on the .o and the .a might just say, "OK, I'm given a .o without a .a; fine, then I'll do a slightly different thing."
Also, having two rules means duplicating a command.
Invoking the command twice can also screw things up if you run a parallel build, which you should always do! Not only does it speed things up, it's also a good way to verify that your makefile is actually correct. If your makefile doesn't work in a parallel build, it is broken, in the same way as C code that breaks at -O2 and above due to reliance on undefined behavior.
The solution to the multiple-target problem is the built-in .INTERMEDIATE special target, and it isn't entirely obvious how it works.
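A sketch of the .INTERMEDIATE trick, for reference (the file names foo.src/foo.sentinel and the "compiler" command are made up for illustration):

```make
# sentinel target standing for "the compiler has run"
.INTERMEDIATE: foo.sentinel

# both outputs depend on the sentinel; the ';' marks an
# intentionally empty recipe
foo.o foo.a: foo.sentinel ;

# the one rule that actually invokes the compiler
foo.sentinel: foo.src
	compiler foo.src    # assumed command; writes foo.o and foo.a
```

Because foo.sentinel is declared intermediate, make doesn't care that the file never actually exists: it only considers it out of date (and so reruns the compiler, once) when foo.src is newer than the real outputs or one of them is missing.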
OK, that makes sense. I'm tempted to rattle off the knee-jerk "don't go deleting random crap," but I realize that is a hollow response.
I'm curious how .INTERMEDIATE helps in this case. I did find this link[1], which was a rather fun read on how one might go about solving this, along with all of the inherent problems.
The target baz has both a .l and a .o, both of which are produced in one command. The line that begins with "%.o" starts an implicit rule, which loosely states, in English: "to produce a .o file, or a .l file, run the following ...". $(*F) is a GNUism that maps to the filename of the source (directory part, if any, is stripped). This works. I tested all three targets (foo, bar, baz) with a "make clean" between each one.
(And for the really curious: a09 is a 6809 assembler; disasm.a is a 6809 disassembler, written in 6809; binary is a 2K relocatable library.)
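The makefile under discussion isn't quoted here, but the pattern rule described ("to produce a .o file, or a .l file, run the following") would presumably look something like this sketch (the .asm extension and the a09 flags are guesses, not the original):

```make
# a pattern rule with multiple targets runs its recipe ONCE to
# produce both outputs, unlike an explicit rule with two targets;
# $(*F) is the stem of the match with any directory part stripped
%.o %.l: %.asm
	a09 -o $(*F).o -l $(*F).l $<
```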
Or if you don't like taeric's suggestion you can just touch a .ao file after the line that creates the .a and .o files and have your further rule(s) depend on that .ao file. Have .ao depend on your source. If you still want to be able to type stuff like 'make foo.a' instead of 'make foo.ao' and have it work, then you can make a rule where .a depends on .ao and all the rule does is touch the .a file. Create the same rule for the .o too.
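Spelled out as rules (the .src extension and "compiler" command are assumptions for illustration):

```make
# the .ao stamp is touched after the one command that makes both files
%.ao: %.src
	compiler $<    # assumed command; writes foo.o and foo.a
	touch $@

# convenience rules so 'make foo.a' and 'make foo.o' still work;
# each just depends on the stamp and freshens its own timestamp
%.a: %.ao
	touch $@

%.o: %.ao
	touch $@
```

Downstream rules then depend on foo.ao rather than on foo.o or foo.a individually, so the compiler runs exactly once per source change.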