Hacker News | chiumichael's comments

It's much better than YouCompleteMe, imo. Its context-awareness is quite amazing.


What are some things that autotools can do that more modern tools can't?


First thing that comes to my mind is the best-in-class cross-UNIX compatibility. But shouldn't the question be the other way around: What can the modern tools do that autotools can't?


The obvious: CMake generates either Makefiles or Visual Studio solutions, allowing for cross-platform Linux/Windows builds.

Support for Ninja files also seems to speed up builds in my experience. (Ninja is essentially an "assembly language" for build systems: it has minimal features and assumes that higher-level tools like CMake generate the Ninja files.) Ninja is fully multithreaded and focused on speed rather than features, and it really shows.
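To give a feel for how low-level Ninja is, here's a sketch of a tiny hand-written build.ninja (file names are made up; in practice a tool like CMake generates this for you):

  # build.ninja (hand-written sketch; normally generated by CMake or similar)
  rule cc
    command = cc -c $in -o $out
  rule link
    command = cc $in -o $out
  build hello.o: cc hello.c
  build hello: link hello.o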

Since ./configure is single-threaded, you end up spending a substantial amount of time re-running it whenever you change your Autoconf / Automake setup.

CMake has a few other nice things, such as a built-in basic testing framework (it just interprets the exit codes of programs: 0 is success, anything else is a test failure). That framework has also been integrated with various other test frameworks (like GoogleTest: https://cmake.org/cmake/help/latest/module/GoogleTest.html).
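A minimal sketch of what that looks like in a CMakeLists.txt (the target and file names here are made up for illustration):

  # CMakeLists.txt (sketch)
  cmake_minimum_required(VERSION 3.10)
  project(demo C)
  enable_testing()                          # turn on CTest support
  add_executable(smoke_test smoke_test.c)   # hypothetical test program
  add_test(NAME smoke COMMAND smoke_test)   # pass/fail decided purely by exit code

After building, running `ctest` executes the registered tests.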


Autotools actually has some built-in test framework support too. A Makefile.am can list one or more programs in its TESTS variable, and they will be run when `make check` is executed. Depending on how you set it up, Automake also knows how to use the TAP protocol to collect the pass/fail results from your tests.
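A minimal Makefile.am sketch of that setup (the test program name is just an example):

  # Makefile.am (sketch)
  check_PROGRAMS = foo_test          # built only for `make check`
  foo_test_SOURCES = foo_test.c
  TESTS = $(check_PROGRAMS)          # each listed program is run; exit code 0 = pass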

You can also run `make distcheck` to generate a source tarball, extract/build the tarball in isolation, and run `make check` on the result. A very handy way to make sure that you've packaged everything correctly, and everything builds/tests OK when executed out-of-tree.


To those who downvoted my parent comment, what was your reasoning? Not trying to troll - I was just trying to explain a useful Autotools feature in a discussion about Autotools and CMake features.


> The obvious: CMake generates either Makefiles or Visual Studio solutions, allowing for cross-platform Linux/Windows builds.

I get that this is especially attractive for some companies earning money with Windows. But how important are contributions from developers on Windows for Linux distributions and the free software community? I think that most companies which use open source libraries on Windows are unlikely to give anything back.

Also, when it comes to things like networking and server software, Windows has many important differences and is not all that relevant any more. Well, every project will need to figure out on its own whether supporting Windows builds is worth the time and hassle.


And good integration with conan. Dependency management is nice. Vcpkg is getting there too.


The problem with conan is that it is yet another language-specific package manager. This is especially fatal since the importance of pure C and C++ projects is shrinking in proportion, and both are more and more used in cross-language projects. And for the latter, I think a Linux distribution or a system like GNU Guix is much better suited.


Distro package managers lag behind and are fragmented. Guix looks good. Neither supports Windows, though, if you have that need. Trying to use Conan in a blended C++/Rust code base hasn't been that bad, but that's because we weren't using any crates outside the standard library.


Good point. Personally, I've avoided autotools for a long time since many projects use CMake and because of its reputation as a painful thing to learn. Lately, however, I've started learning more about autotools, so I was just curious what it's really good at.


I am absolutely the opposite. I've avoided CMake for a long time since many projects use autotools and because of its reputation as a painful thing to learn. ;)

But actually, one of the points made in the Calcote book really stuck with me: Build tools are not just about developer convenience; they're also about the user's convenience. It's more work on my end, but my users (who, admittedly, are on UNIX-likes) know to type:

  ./configure
  make
From their standpoint, it just works as expected.


Yeah, I find it really irritating when a build system doesn’t respect this interface. Really, cargo and all these other tools should have options for generating a configure script and makefile: they could be relatively minimal and just invoke cargo with the appropriate options, but it would make it much easier to build arbitrary projects from source.
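A hand-written wrapper could be as small as something like this (a sketch only; "myprog" is a placeholder name, recipe lines are indented with tabs, and `install -D` assumes GNU coreutils):

  # Makefile - thin wrapper that just delegates to cargo
  PREFIX ?= /usr/local
  all:
  	cargo build --release
  check:
  	cargo test
  install: all
  	install -D -m 755 target/release/myprog $(DESTDIR)$(PREFIX)/bin/myprog
  .PHONY: all check install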


This was a near magical revelation for me. I should finish that book.

It is astounding how many tools I have to install to get most tools to work. :(


It's fascinating, isn't it? Right! Users . . . :)


> From their standpoint, it just works as expected.

This is also quite important when you work on larger projects with many dozens or hundreds of modules and need to build a specific part weeks after you last looked at it. Having to look up how to build something every single time is more than a nuisance.


I never understood why there are typically two commands and not just “make”. Is this a historical accident?


I think it is simply that make is too crufty to extend. It has perma-broken behavior around space handling, which nobody dares touch [1]. But you can't replace it because it is entrenched, so Makefiles must be wrapped, by configure or CMake or whatever. And these in turn have their own cruft, so must be wrapped, by autoconf and...well someone please wrap CMake!

The C and C++ ecosystem is crying out for a new build system, but there is no mechanism to converge on one. It's bad.

1: http://savannah.gnu.org/bugs/?712


Make does what it does surprisingly well: It runs commands on files based on a description of dependencies. There is no shortage of would-be successors of make, but so far none of them has succeeded, which supports the hypothesis that "make" has hit a sweet spot.

There is one alternative to make which I think is worth mentioning because of its simplicity, brilliance, and excellent support for correctness: redo, as in apenwarr's redo:

https://redo.readthedocs.io/en/latest/

And redo works with autotools!
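For a taste of that simplicity, a redo rule is just a small shell script. Here is a sketch of a default.o.do that builds any .o from the matching .c (using apenwarr redo's convention that $2 is the target name without its extension and $3 is the temporary output file):

  # default.o.do (sketch)
  redo-ifchange "$2.c"    # record a dependency on the corresponding .c file
  cc -c -o "$3" "$2.c"    # compile into redo's temporary output file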


This is by construction and is nicely explained by the slides I linked before.

Basically, Autoconf produces a script called "configure", which in turn generates a makefile that works for your platform. The configure script itself is independent of your hardware and platform, requires only a standard POSIX shell, and generates code for whatever version of "make" is present on your system. This means that, unlike with other build systems, you do not have to install (and build!) the build system for your own platform, and there are no compatibility problems, because the shipped configure script always matches the source code distribution.

On top of that, the "configure" script is itself automatically generated, typically from two files, "configure.ac" and "Makefile.am", via a command which is nowadays called "autoreconf"; but this normally happens only on the developer's system.
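A minimal sketch of those two input files, for a hypothetical "hello" package:

  # configure.ac
  AC_INIT([hello], [1.0])
  AM_INIT_AUTOMAKE([foreign])
  AC_PROG_CC
  AC_CONFIG_FILES([Makefile])
  AC_OUTPUT
  # Makefile.am
  bin_PROGRAMS = hello
  hello_SOURCES = hello.c

The developer runs `autoreconf --install` once to produce the configure script that ships in the tarball; users then only need the usual `./configure && make`.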


Oh yes, the reputation of the learning curve precedes it. But the book helped me get started and, as is often the case with complex things, once I'd dealt with it a bit, the experience wasn't as bad as I feared.


- it has absolutely minimal requirements for running on a target system: it requires only a POSIX shell

- the macro language it uses is strongly standardized, so it produces identical semantics on all platforms. And because it uses macros, which are expanded into shell code, it does not require the macro interpreter to be present on the target systems either, only the generated script.

- the configure script is included in the source distribution and as such "frozen", so one has no problem with changes or incompatible upgrades which could break builds.

- the philosophy of autoconf is based on feature testing: for example, whether a given compiler version produces code with a certain property, whether a library named "foo" contains a function named "bar", and so on (see the sketch after this list). This is the right way. It is also good for handling upgrades of large libraries, like Qt.

- testing for features individually is the only way to escape a nuclear-scale combinatorial explosion of feature combinations.

- feature testing also allows for support of evolving software: new functions that are buggy at first and need to be worked around, then gradually get fixed, perhaps turn into a separate library, then perhaps an element of Boost, until later you need a specific version of Boost, and so on. This is well suited to the bazaar-style development that is characteristic of Unix, with its open interchange and borrowing of ideas.

- as already mentioned, it is very well documented. People who just copy-paste existing code are doing it wrong.

- it is understandable, without too much magic built into it. cmake is affected by an excess of magic IMO - it is hard to understand what is going on, and as a consequence, it is hard to fix a failing script. In my experience, getting up to speed with autoconf took a fraction of the time needed to learn even some of cmake.

- This also means that it is easier to maintain over the medium to long term. It is nuts to use a language which is constantly changing precisely for infrastructure - some poor people will have to read and maintain all that code years later! (And those people could even be you!)

- You only need to support the platforms and features you need, while at the same time being able to support a myriad of exotic platforms if you want to. That means that if you want your software to work only on 64-bit Ubuntu Linux, you only need a very small configure.ac script, but if you want to support older libraries on Debian, you can do that easily as well, and if you need to support an exotic architecture with 36-bit words or a 16-bit embedded MCU, you can do that too.

- It is so widely used that almost every Unix developer knows how to build software with it, without much thinking.

- It tries to do one thing and do it well: platform-specific build configuration. It does not try to do dependency management or package management, leaving that to dedicated package managers.

- and because of the previous point, it also works well with new distributions and package managers, like Arch or NixOS or GNU Guix - it does not interfere with them, and does not fight for the position to be the top dog in a packaging system.

- Both autoconf and make are, while supporting C and C++, not language-specific, so you can easily use them to distribute things like LaTeX documentation, pictures, or even fonts.

- you can use and generate a mix of shared and static libraries, and also use both pkg-config as well as standard paths (or, for example, paths provided by GNU Guix), and it can do that for each package individually. In contrast, cmake's find_package commands have difficulties with mixing different retrieval mechanisms.

- Supporting all kinds of hardware and compilers will continue to be important. While there are without question obsolete platforms, it will be able to support different hardware architectures in the future, like CUDA on GPUs, or ARM or RISC architectures.

- It works for any library without requiring specific upstream code. This is in stark contrast to cmake, which often needs supporting find_package commands or target definitions for specific libraries. The latter means ultimately that cmake would need to become a meta-distribution which requires specific support for many packages one wants to use. I think that as the complexity of software continues to grow, this approach will probably not scale.
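As referenced in the feature-testing point above, here is a sketch of such checks in a configure.ac fragment (the library "foo" and function "bar" are just the placeholder names from the list; the other two are common real-world checks):

  AC_CHECK_LIB([foo], [bar])        # does linking against -lfoo provide bar()?
  AC_CHECK_FUNCS([strlcpy])         # defines HAVE_STRLCPY if the function exists
  AC_CHECK_HEADERS([sys/epoll.h])   # defines HAVE_SYS_EPOLL_H if the header is usable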

