
ParallelBuildIdeas

Things we might like to be able to do with a unified build system. (In what follows, CalibrationController is taken as an example of an arbitrary package.)

(1) Definitely need this:

cd $SCT_DAQ_ROOT
make                      # to build everything -- should be easy

(2) The whole exercise is a waste of time if we cannot make full use of the "keep going" (-k) and "jobs" (-j) options, i.e. we would like to be able to profitably do:

cd $SCT_DAQ_ROOT
make -j 4 -k
which will come for free if we transition to a single global makefile without recursive make.
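A non-recursive setup along these lines might be sketched as follows (all fragment, package, and variable names here are hypothetical, invented for illustration):

```make
# Top-level Makefile (sketch)
PACKAGES := CalibrationController SctGui    # SctGui is a made-up example

# Each package supplies a Makefile.frag that appends its targets to
# ALL_TARGETS.  Because every rule ends up in one dependency graph,
# "make -j 4 -k" can schedule work across all packages at once.
ALL_TARGETS :=
include $(patsubst %,%/Makefile.frag,$(PACKAGES))

.PHONY: all
all: $(ALL_TARGETS)
```

The key point is that a single make invocation sees the whole graph, which is exactly what -j needs to keep all its jobs busy.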

(3) It would be very handy to be able to do this

cd CalibrationController
make                      # to build CalibrationController things ? Tricky?

If (3) is possible, then the following will follow automatically:

cd $SCT_DAQ_ROOT
make -C CalibrationController
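One way (3) could work under the single-makefile scheme is a tiny forwarding stub in each package directory. This is only a sketch: it assumes the top level defines the per-package targets described in item (4) below, and the CalibrationController_clean name is invented here.

```make
# CalibrationController/Makefile -- forwarding stub (sketch)
# Delegates to the top-level build so "cd CalibrationController && make"
# still works; SCT_DAQ_ROOT is taken from the environment.
all:
	$(MAKE) -C $(SCT_DAQ_ROOT) CalibrationController_TARGETS

clean:
	$(MAKE) -C $(SCT_DAQ_ROOT) CalibrationController_clean
```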

(4) Whether or not (3) is possible, the following will be possible (and easier to implement) and would be what we need as a bare minimum:

cd $SCT_DAQ_ROOT
make CalibrationController_TARGETS
which would make all targets listed in a make variable called CalibrationController_TARGETS, which could be defined somewhere inside the Makefile.
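Such a variable-plus-target pair might look like this (a sketch only; the library and binary paths are hypothetical):

```make
# Fragment included by the top-level Makefile (sketch)
CalibrationController_TARGETS := \
    $(LIB_DIR)/libCalibrationController.so \
    $(BIN_DIR)/CalibrationControllerTest

# A phony target of the same name, so "make CalibrationController_TARGETS"
# builds everything the variable lists.  (Target and variable namespaces
# are separate in make, so sharing the name is legal.)
.PHONY: CalibrationController_TARGETS
CalibrationController_TARGETS: $(CalibrationController_TARGETS)
```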

(5) We definitely need equivalents of existing targets like "clean" and "distclean". These would presumably have to be implemented by having each make fragment append its commands (e.g. the files to be deleted) to a slowly growing make variable, which is expanded on a request to "make clean".
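The accumulation scheme just described can be sketched in two pieces (variable name CLEAN_FILES is invented for illustration):

```make
# In each package's make fragment: append, never assign (sketch)
CLEAN_FILES += CalibrationController/*.o CalibrationController/*.d

# Once, in the top-level Makefile: expand the accumulated list on demand
.PHONY: clean
clean:
	rm -f $(CLEAN_FILES)
```

Because += keeps growing the variable as each fragment is included, the single clean rule automatically covers every package.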

Something else which needs to be addressed

The present system of copying files around (in particular headers to the "include" area, but also libraries to the "install" area) has implications for parallel builds. Where there is more than one copy of a header, and where the compiler's "-I" arguments make the header includable from BOTH the include area AND the local directory (say), we can evidently get unstable builds, or "double" builds: in a first pass (before the include area has been populated) the local header is used, whereas in a second pass (after the include area has been populated) the include-area copy may be used instead.

This is a particular problem for autogenerated headers, which are generated locally and then installed: UNLESS THE MAKEFILE IS WRITTEN CAREFULLY ENOUGH, it may not be obvious that the installed copy of the generated header depends on the local copy of the generated header, which depends on the local copy of the idl file, etc. This can be worse still if even the idl files are "installed" as duplicates.

My feeling (Chris) is that the best way would be to rationalise the names of our directories along the lines that CMT uses, Package/Package/ExportedHeader.h, so that we could do away with "installation" of headers altogether and just use direct inclusion from the $SCT_ROD_DAQ root, or via symbolic links of Package/Package into a link-based include area.
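The CMT-style layout suggested above might look like this (file and directory names are illustrative only):

```
CalibrationController/                 # the package
    CalibrationController/             # exported headers, CMT-style
        ExportedHeader.h
    src/
        Impl.cxx    # includes "CalibrationController/ExportedHeader.h"
```

Compiling everything with a single -I pointing at the tree root means exactly one copy of each header exists, so there is no install step and no ambiguity about which copy the compiler picks up.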

Separate build tree?

This is not necessarily related to the issue of parallel builds, but it might be relevant to consider at the same time.

Do we want to be able to build the software in a separate tree?

I think this is related. It may be useful to be able to compile, for instance, SLC3 and SLC4 binaries from the same source tree. This requires putting the generated binaries into a local build directory (e.g. named after CMTCONFIG). Is the VPATH variable in make useful here, or does it go the wrong way?
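For what it's worth, VPATH goes the right way for this use: it directs the search for prerequisites (the sources) only, while targets are still written in the directory where make runs. So running make from a per-CMTCONFIG build directory works. A sketch, with hypothetical paths:

```make
# build/$(CMTCONFIG)/Makefile (sketch; run "make" from this directory)
# VPATH affects *prerequisite* lookup only: sources are found in the
# shared tree, objects are written here in the build directory.
VPATH = ../../CalibrationController/src

CalibrationController.o: CalibrationController.cxx
	$(CXX) $(CXXFLAGS) -c $< -o $@
```

Note that $< expands to the VPATH-resolved path of the source, which is what makes the rule work unchanged from the build directory.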