There are several wrong things with that post. First off, the PKG_CHECK_MODULES macro checks that pkg-config is present on the system and prints a nice error message if it isn't. Second, PKG_CHECK_MODULES can be used to specify several dependencies at once:
PKG_CHECK_MODULES(DEP, foo udev >= 999 neverhere)
Then if one or more of the dependencies aren't met, all problems are shown:
Package foo was not found in the pkg-config search path.
Perhaps you should add the directory containing `foo.pc'
to the PKG_CONFIG_PATH environment variable
No package 'foo' found
Requested 'udev >= 999' but version of udev is 175
Package neverhere was not found in the pkg-config search path.
Perhaps you should add the directory containing `neverhere.pc'
to the PKG_CONFIG_PATH environment variable
No package 'neverhere' found
It may not seem like much, but for someone not used to compiling stuff from source, it means a lot that all missing dependencies are shown at once. A sloppy shell script checking for the same dependencies would mean the user first sees "foo not found" and goes and downloads that package, then hits "udev >= 175 required", and only after fixing that sees the third error, and so on.
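A sketch of that one-at-a-time pattern, using the package names from the example above (the echo messages are illustrative):

pkg-config --exists foo || { echo "foo not found"; exit 1; }
pkg-config --exists 'udev >= 175' || { echo "udev >= 175 required"; exit 1; }
pkg-config --exists neverhere || { echo "neverhere not found"; exit 1; }

Each check aborts at the first failure, so the user learns about at most one missing dependency per attempt.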
Also, the error message "You can download mono from www.go.mono.com" isn't very useful for most Linux users. The correct remedy on distros that ship with devel packages is to figure out the name of the package that supplies the include headers and pkg-config file and install that. pkg-config's error messages aren't that great either, but at least they are easily recognizable.
In my opinion, autoconf is a very misguided build system. It is designed so that it has no requirements on the compiling user's system other than a POSIX environment, a shell, and a C compiler. It is a completely unnecessary constraint, because nowadays everyone is able to install great languages like Python, Ruby, Scheme, and many others much more suitable for writing build configuration systems than shell script. There is no point anymore in using the M4 language to generate a 1000-line configure script when a 50-line Python/Ruby/$OtherNiceScriptingLanguage would work just as well.
> It is a completely unnecessary constraint, because nowadays everyone is able to install great languages like Python, Ruby, Scheme, and many others much more suitable for writing build configuration systems than shell script. There is no point anymore in using the M4 language to generate a 1000-line configure script when a 50-line Python/Ruby/$OtherNiceScriptingLanguage would work just as well.
First, autoconf isn't a build system. It's just a 'configure' script generator.
It's nice that the configuration script runs anywhere you can find a POSIX environment. It's not actually that hard, either; it mostly just runs commands and checks the result code. The bulk of the knowledge stored in autoconf is the knowledge of how to write those tests, which tests are necessary, and what flags you need on various platforms.
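To give a flavour, here is a simplified sketch of the kind of test a generated configure script runs (real autoconf tests use conftest files much like this, just with more bookkeeping):

cat > conftest.c <<EOF
int main(void) { return 0; }
EOF
if ${CC-cc} -o conftest conftest.c >/dev/null 2>&1; then
  echo "checking whether the C compiler works... yes"
else
  echo "checking whether the C compiler works... no"
fi
rm -f conftest conftest.c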
I think the big problem here is not that the configure script is a portable shell script, but rather, that it's generated by an M4 script. Don't forget that Python itself has a configure script, and writing the configure script in Python would create a bootstrapping problem (it would add Python to the list of programs that need to be cross-compiled on new systems).
So Miguel objects to any use of third-party m4 macros at all?
Because these are the possibilities:
1. You install from your package manager
2. You install from a source tarball, which contains a configure script
3. You install the bleeding edge version from a development repository
4. You do 3, but somehow you managed to break your autotools install (and I don't really want to know how you did), or you're running a long-EOL'd distro that ships a version of pkg-config so ancient it doesn't include the pkg.m4 macro.
The fourth scenario can cause macro trouble, but it takes a sophisticated kind of user to install from development repositories rather than a better-supported option (not to mention breaking autotools), so I think it's on those users to troubleshoot and fetch the missing macro.
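The recovery itself is mechanical; a sketch, assuming the project keeps third-party macros in an m4/ directory (where you get your copy of pkg.m4 from, e.g. pkg-config upstream, is up to you):

cp /path/to/pkg.m4 m4/   # /path/to is a placeholder
autoreconf -fi           # regenerate configure with the macro present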
The "simple" code completely breaks with cross-compilation. This is because the autoconf macros will first look for the CHOST-pkg-config program (e.g. armv6j-hardfloat-linux-gnueabi-pkg-config), which will normally be a wrapper around pkg-config specifying PKG_CONFIG_SYSROOT_DIR to make pkg-config look for stuff in the target prefix instead of the host. For example, Gentoo's "crossdev" tool automatically generates such a wrapper.
So if we all just start "simplifying" the code like that, we'll end up with lots of cross-compile breakage in packages that cross-compiled fine before, and I suppose many people will decide the best fix is to just go back to the autoconf macros, as opposed to hardcoding the procedure for finding the right pkg-config.
The instinct to take a repetitive pattern and replace it with a higher-level abstraction is a good one. It is only because we do this that software is comprehensible at all.
If a person's attempt at doing this leads to terrible problems in practice, problems that have nothing to do with the difficulty of the problem itself but rather with configuration quirks or peculiarities of the implementation, then that is a sign that the entire system is built on a mountain of sand.
His conclusion is that "pkg.m4 is a poison that is holding us back." No. The poison is a pile of m4, shell, and autotools that are held together with bubble gum and chicken wire.
I came to the conclusion long ago that the world is ready for an autoconf replacement (note I did not say "new build system" or "replacement for make"). It's too arcane, and it's long since grown too large for M4.
Projects like CMake, SCons, and Jam try to obviate the need to write makefiles, but they treat the configuration step as an afterthought.
One thing that would in fact improve reliability is getting rid of the configuration step. Not just the workarounds for broken shells or old Solaris/HP-UX versions with anemic libcs, but the whole principle of disabling features or switching to alternative implementations when something isn't found. It just creates new code paths and combinatorial complexity, and those paths are probably broken because the developer's environment will never trigger them. Distro packagers already prefer hard failures, so they know when they are missing a dependency.
To compensate, the replacement just needs to offer better facilities for triggering installs of the missing dependencies. That's basically the path taken by modern language ecosystems (Go, Python, Ruby, Node… GNOME's jhbuild is part of the same trend): they have tools to fetch source dependencies from the network, install into custom prefixes that don't require root, create isolated prefixes so that multiple versions can be installed in parallel, and freeze known-good versions of the dependency graph.
But those little knobs so you can build your own customised version with decals and your own red spoiler? No one will miss them.
Yes, there are lots of little knobs used by those wacky Gentoo folks, and those dumb old shells on Unix of decades past. My ideal replacement would get rid of those checks and assume you have a working POSIX shell and C89.
However, there are also optional dependencies. SoX can use FFTW to make spectrograms, but it's optional. ffmpeg can use x264, but it's not required. Emacs will build with GTK 2 support, GTK 3 support, or neither; and it will use whatever PNG/JPEG/XML/etc. libraries happen to be installed, but doesn't care if they're missing. GCC will use Graphite when available. A few programs will happily play audio through libao or Core Audio or ALSA. There's combinatorial complexity there in theory, but in practice it's usually not hard to isolate optional dependencies.
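In autoconf terms, the common shape is an opt-out switch plus a soft check; a sketch (the package and macro arguments are illustrative):

AC_ARG_WITH([fftw],
  [AS_HELP_STRING([--with-fftw], [build spectrogram support using FFTW])])
AS_IF([test "x$with_fftw" != xno],
  [PKG_CHECK_MODULES([FFTW], [fftw3],
    [AC_DEFINE([HAVE_FFTW], [1], [Define if FFTW is available])],
    [AC_MSG_NOTICE([FFTW not found; building without spectrograms])])])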
And suppose you want to use some feature available on new Linux kernels. It's just easier to write an autoconf test rather than try to come up with some convoluted preprocessor test.
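With autoconf that can be a one-liner; for instance, checking for a newer syscall wrapper (pipe2 here is just an illustration):

AC_CHECK_FUNCS([pipe2])

which defines HAVE_PIPE2 in config.h so the code can fall back at compile time, instead of guessing from kernel version macros.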
There are also important knobs to tune: look at GMP, ATLAS, or FFTW for examples. Old PowerPC Macs could run 64-bit code if you asked but ran 32-bit by default, so you'd compile GMP for 64-bit and get a huge performance boost.
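With GMP that tuning is a configure-time choice; if I remember the manual right, on a 64-bit-capable PowerPC it is something like:

./configure ABI=mode64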
And some options are only useful to developers, like enabling -Werror (which should never be enabled by default in a source release). The configuration step bakes these options into the makefile so you can still just type "make".
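The conventional shape of such a knob (a sketch; the flag name is a common convention, not universal):

AC_ARG_ENABLE([werror],
  [AS_HELP_STRING([--enable-werror], [treat compiler warnings as errors])])
AS_IF([test "x$enable_werror" = xyes], [CFLAGS="$CFLAGS -Werror"])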
You have a good point about hard failing when dependencies aren't present, and the configuration step allows you to fail early and provide an intelligent error message.
I'm not convinced that eliminating a configuration step improves reliability. Just think of how dirt-common autoconf scripts are, in spite of how much of a pain autoconf is to learn.
A short set of config options, rather than autodetection, works well for most things though. After all, most people will be packaging and need to control exact dependencies. I don't mind reading through a config file to tweak these settings.
A lot of those knobs are actually pretty useful, especially when there are competing implementations of a standardized interface that have different strengths. As an example: In scientific computing this often means choosing a particular BLAS or MPI implementation when compiling, sometimes based on the data in a particular run.
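For instance, some autoconf-based projects let you name the BLAS to link at configure time; R's configure accepts something like this (flag spelling varies by project):

./configure --with-blas="-lopenblas"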
Not saying autoconf is awesome (I curse it on a regular basis), but the configure step itself is worth keeping around.
I am of the opinion that the whole autohell toolset has outlived its usefulness. It was important back in the day when there were a million different UNIX systems. But today we have something like 5-6 left, and most of those do the majority of POSIX right.
It is kind of odd that we are still doing this today. It is somewhat of a mistake.