From: "Paul Flinders" <>
Subject: Re: source dependencies cleanup?
Date: Tue, 3 Dec 1996 22:38:06 -0000
> From: Peter T. Breuer <ptb@oboe.it.uc3m.es>
> a) it isn't 8 seconds worth of fast, which is the mkdep time.
On a 66 MHz 486 at work "make depend" takes just under 3 minutes. Not a major pain, as the kernel takes a while to compile on this machine anyway, but I believe (after a quick perusal of the .depend files) that mkdep is fast mainly because it skips stuff.
Now Linus is probably more intelligent than me and certainly knows the kernel a whole lot better, so he may have made "safe" short cuts - I don't know.
>
> b) it requires you to recompile every touched file every time you make
> any update to your system, even if you don't want those files to be
> compiled. Call the time to recompile the dependencies as it goes X.
> Call the time to recompile the object code Y. So the total is X + Y.
Without doing a lot of work at a sub-file granularity, I would be reluctant *not* to re-compile a file which had been touched as part of an update.
Can you suggest a concrete example of when you would touch a file which you don't want to be re-compiled?

>
> c) because of b) (and a)!), it is a lot slower than a makedep followed
> by a conditional recompilation. Suppose the makedep takes an extra
> 20% of Y, but that after the makedep I only have to actually recompile
> 50% of my files. Then the time to recompile is
> 0.2*Y + 0.5*Y = 0.7*Y
>
> This cannot be worse than X+Y!! You would only have a chance of winning
> out if I had to recompile 80% of my files after an upgrade, which is
> not the case.
>
> In any case, what really happens is that I run mkdep, which takes 0s
> effectively, and then get a slightly worse approximation to the files
> that I need to recompile. Say I have to recompile 60% instead of 50%.
> Then the total time used is
> 0 + 0.6*Y = 0.6*Y
The amount of extra time taken by -MD is *very* small. I don't see why any files would be re-compiled without needing to be, modulo the fact that the current scheme may avoid placing commonly changed files (e.g. autoconf.h) in the dependencies, because most of the kernel includes them and any edit causes the whole kernel to be re-compiled. However, IMO it is dangerous to omit dependency information like this, and the correct fix is to split the config #defines into several files.
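To illustrate what I mean by splitting the config #defines, compare the dependency lines each scheme ends up with (the file names below are made up, just to sketch the idea):

  # Today every object effectively depends on the one big autoconf.h,
  # so flipping any CONFIG_ option re-compiles the whole tree:
  #
  #   sched.o: sched.c include/linux/sched.h include/linux/autoconf.h
  #
  # With the options split into small per-area headers, the dependency
  # lines (hand-written, from mkdep or from -MD) get much narrower:

  sched.o:  sched.c  include/linux/sched.h include/config/smp.h
  floppy.o: floppy.c include/linux/fd.h    include/config/blk_dev_fd.h

Then changing a single option only touches one small header, and only the files which actually use that option get re-compiled.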
Using -MD gives you an accurate picture of the dependencies which is always up-to-date. mkdep appears to give a partial picture (it doesn't, on brief examination, appear to output dependencies for nested includes), and the dependencies generated by mkdep can also become out of date (as soon as you add a new header to a source file).
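In case it isn't obvious how -MD would be used, something along these lines should do it (a rough sketch only - the object list and target name are made up, not the real kernel rules):

  CC     = gcc
  CFLAGS = -O2 -Wall -MD        # -MD writes sched.d etc. as a side effect

  OBJS   = sched.o fork.o exit.o        # made-up object list

  kernel.o: $(OBJS)
  	$(LD) -r -o $@ $(OBJS)

  %.o: %.c
  	$(CC) $(CFLAGS) -c -o $@ $<

  # Pull in the .d files written by the previous compile; on the very
  # first build they don't exist yet, which is why sinclude is used.
  sinclude $(OBJS:.o=.d)

That way the dependency list for each file is re-generated every time the file is actually compiled, so it can never go stale.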
mkdep also ignores #if/#endif, which means that I might end up re-compiling a file unnecessarily because I edit a header which a source file includes in *some* circumstances but not in the current configuration.
In my experience:
a) "make depend" is still a noticeable addition to the compilation time (although some of this is to do with modules, which may still need to be done, and making sure that happens appropriately will need thought)
b) I can forget to do it.
c) When I really want it to re-compile a small subset of files after a config change, I get half of the kernel re-compiled anyway.
I think that using -MD would eliminate the non-module part of a) and eliminate b). It shouldn't affect the "robustness" of the dependency tracking (in fact it will probably improve it).