Make: A Critical Retrospective

In the beginning there was Make ...

Once upon a time there was a kingdom whose subjects lived lives of great toil and want. The King saw their plight and summoned the finest minds of the kingdom, issuing the following challenge.

“The one who is able and noble enough to create the tool which can aid our people in their hardship shall win their hearts and minds. You have the King's word, and his word is his bond.”

The sages gathered together. After many long days and nights, eventually one amongst them stepped forward. Lo, a new tool was created and it was to be called ‘Make’. The King's plea had been answered.

Ever since then, many developers have spent unholy amounts of time trying to patch up subtle yet trivial bugs in their software builds. Unfortunately, due to ambiguity in the King's oath, he had accidentally consigned his people to a generation of servitude and further hardship.

All characters and events appearing here are fictitious. Any resemblance to reality is purely coincidental.

In Make, complex dependency relationships between files and various parts of the tool chain can be described in a declarative<a href="#footnotes">1</a>, flexible and general way. Make is quite powerful and expressive, but in retrospect its design has caused a lot of problems — particularly with maintenance.

Although it is a general tool, the principal task Make was created for was building C programs on Unix systems, and its design and limitations reflect this, along with the era in which it was created. The low-hanging criticisms are issues that categorically would not, and should not, exist in a modern tool:

  • the infamous tabs/significant white space issue
  • the preference for short cryptic codes over readable names<a href="#footnotes">2</a>
  • inability to modularise cleanly<a href="#footnotes">3</a>
  • many implementations, but basic features not standardised (e.g. include directives are handled differently across implementations)
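The first two of these can be seen in even a one-rule Makefile. A minimal sketch (GNU Make syntax; `hello.c` is a hypothetical source file):

```make
# The recipe line below must begin with a literal tab character;
# indenting it with spaces instead yields the famously unhelpful
# "*** missing separator.  Stop." error.
hello: hello.c
	cc -o $@ $<
```

Here `$@` and `$<` are the automatic variables for the target and the first prerequisite — terse codes that a reader must simply memorise.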

Another issue is that Make is notoriously difficult to debug. This can at least partly be attributed to its declarative nature. The lack of an explicit control flow means that no debugger was ever developed and techniques such as tracing cannot be applied to the full extent we are accustomed to.
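In practice the closest one gets to a debugger is instrumenting the Makefile by hand. A sketch of the usual GNU Make techniques (the file names are illustrative):

```make
SRCS := main.c util.c          # illustrative source list
$(warning SRCS is [$(SRCS)])   # printed, with file and line, when the Makefile is parsed

app: $(SRCS:.c=.o)
	$(info linking $@ from $^)
	cc -o $@ $^
```

Beyond this there are only command-line switches such as `make -n` (print the commands without running them) and `make -d` (dump the dependency-resolution trace), neither of which approaches what a stepping debugger or tracer offers elsewhere.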

Fundamental Issues

However, Make has a couple of more fundamental problems.

Firstly, the tight integration with the Unix Shell becomes a big problem both as complexity increases and as time elapses. It is certainly convenient to access the Shell environment for smaller, more transient builds, but in larger builds and in the longer run it leads to a number of problems. The Shell always has been a compromised environment for proper engineering, because it is optimised for short invocations and interactive sessions, a design point that trades heavily against being a sound engineering environment.

Depending upon the Shell has made Make difficult to port<a href="#footnotes">4</a> - even to different Unices or, worse, in some cases even to different machines<a href="#footnotes">5</a>. Ideally the system-specific parts would be narrow and isolated, so that they could be swapped out when porting a build to a different system. This is not the case: in practice, subtle implementation differences introduce a lot of system specificity. This situation gave rise to the Autoconf tool, which handles the differences across systems by generating system-specific Makefiles, making builds portable. However, the sheer arcaneness of Autoconf is beyond notorious<a href="#footnotes">6</a>.

In Make there is no mechanism for locating external dependencies, and the common practice of just using the file system is not really adequate. In the context of Unix it makes some sense, since dependency libraries (and, if necessary, their header files) install themselves centrally, but this approach only works for stable APIs and further reduces build portability. In more modern build tools we see active dependency management, where the build tool is responsible for fetching a dependency given just its name and version, which is a much more satisfactory and portable solution.

The other fundamental problem with Make is the fact that it is file based: all reasoning within Make is done at the level of the file. Of course, file-based reasoning actually made a lot of sense in early C/Unix programming, where the following factors made the fine-grained control Make offers useful<a href="#footnotes">7</a>:

  • The C tool chain is file based
  • Projects often consisted of fewer files, organised with less convention
  • The actual process of building a program could be quite involved, with plenty of occasion for trickiness (e.g. compiling/linking different files with different switches, avoiding namespace clashes, dealing with arbitrary length limits on filenames and identifiers, the general arcaneness of the tool chain ... etc.)

These days development with modern languages (and even C to some degree) has moved beyond worrying about the treatment of individual files (it is much more common to talk about source directories), a situation which no longer plays to the strengths of Make.


So Make solved the build problem of its time, but ultimately the problem was one of complication and not complexity<a href="#footnotes">8</a>. In the longer term, the real answer has been firstly to create better, saner toolchains<a href="#footnotes">9</a> and secondly to decide upon conventions for file layout in software projects. Using a convention to structure files in a project is an example of what, in programming, is known as making the data structure do the work. The structure simplifies the usage of the tool chain and has the added benefit of making projects that bit more comprehensible to other developers.

In a world of increasingly structured and large scale development, the once ubiquitous Make has become an anachronism and a legacy technology.


<a name="footnotes"></a>

  1. To its credit, Make was declarative before it was fashionable to be so.
  2. Especially costly now that maintenance is increasingly performed by those with only hazy memories and the uninitiated.
  3. See Recursive Make Considered Harmful, a 15 page academic paper on why the most obvious way to modularise Make is broken.
  4. Porting a Make build to Windows essentially requires using a clone of the Unix environment (such as MinGW), and then some.
  5. Tight shell integration has fuelled the phenomenon of the 'magic' build machine — where a particular machine becomes the only place where a build can be performed, and as a consequence absolutely critical to an organisation (not in a good way).
  6. Not to say that Autoconf is not clever; rather, it is arcane due to the challenge of overcoming the limitations of Make, which (as this article goes on to argue) is itself challenged with overcoming the limitations of the C tool chain.
  7. Incidentally, Make being file based is not necessary in order to support incremental builds. The same can be achieved in any build tool by comparing the modification date of the inputs and outputs before executing a build step. The net effect is the same in terms of file system operations - all dependency files are checked and only files with newer dependencies (transitively) are regenerated.
  8. It is useful to maintain a distinction between complexity and complication. Complexity has connotations of innateness, whereas complication implies a (human) cause.
  9. This trend can be seen in Java, a more recent language than C, where the Java compiler operates on directories and not individual files - any limitations Java may have, compared to C, are not due to this!