include has quadratic complexity
From: Jed Brown
Subject: include has quadratic complexity
Date: Sun, 19 May 2013 14:13:53 -0500
User-agent: Notmuch/0.15.2+84~gfa8aadf (http://notmuchmail.org) Emacs/24.3.1 (x86_64-unknown-linux-gnu)
When using the compiler to generate dependencies via -MMD or similar, we
end up with a large number of *.d files to include. Unfortunately, the
include directive's processing time scales quadratically with the number
of files to include (whether or not the files exist).
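For context, a minimal sketch of the usual auto-dependency setup that produces this pattern (paths and names here are illustrative, not taken from the report): each object rule passes -MMD so the compiler emits one small *.d fragment per object, and make re-reads every fragment through a single include directive.

```shell
# Illustrative auto-dependency Makefile; names are assumptions, not
# from the report.  The single -include line is where make's cost
# grows with the number of *.d files.
mkdir -p /tmp/dep-demo && cd /tmp/dep-demo
cat > Makefile <<'EOF'
SRCS := $(wildcard *.c)
DEPS := $(SRCS:.c=.d)

%.o: %.c
	$(CC) -MMD -c $< -o $@

-include $(DEPS)
EOF
```

Nothing here is compiled; the sketch only shows why a project with thousands of sources hands make thousands of include arguments.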
$ cat > Makefile <<'EOF'
all:
-include $(shell seq 1 $N)
EOF
make-3.82:
$ time make -q N=2000
0.623 real 0.610 user 0.010 sys 99.50 cpu
$ time make -q N=4000
2.336 real 2.313 user 0.020 sys 99.87 cpu
$ time make -q N=8000
9.407 real 9.300 user 0.083 sys 99.74 cpu
$ time make -q N=16000
43.844 real 43.753 user 0.090 sys 99.99 cpu
The latest development snapshot, 3.99.90-5-g8018345, does better but is
still ultimately quadratic:
$ time make-git -q N=2000
0.330 real 0.317 user 0.010 sys 98.87 cpu
$ time make-git -q N=4000
0.785 real 0.730 user 0.050 sys 99.31 cpu
$ time make-git -q N=8000
1.843 real 1.797 user 0.043 sys 99.85 cpu
$ time make-git -q N=16000
6.141 real 6.060 user 0.077 sys 99.93 cpu
$ time make-git -q N=32000
24.187 real 23.860 user 0.293 sys 99.86 cpu
When I have real *.d files, concatenating those files into a single
included makefile speeds up a do-nothing build by an order of magnitude.
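The concatenation workaround can be sketched as follows (file names are illustrative, not from the report): merge every *.d fragment into one file, so make parses a single included makefile instead of thousands.

```shell
# Workaround sketch: concatenate all dependency fragments into one
# file.  The output name deliberately does not match the *.d glob,
# so rerunning the command never re-reads its own output.
mkdir -p /tmp/catdep-demo && cd /tmp/catdep-demo
printf 'a.o: a.c a.h\n' > a.d
printf 'b.o: b.c b.h\n' > b.d
cat ./*.d > all.mk        # one concatenated dependency file
wc -l < all.mk            # 2 lines, one per original fragment
```

The makefile then carries a single "-include all.mk" directive, so make handles one include argument regardless of how many fragments exist.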
Similarly, we are a couple of orders of magnitude away from being
limited by file-system performance in opening the files:
$ time perl -e 'for (my $i=0; $i<1000000; $i++) { open(my $f, "<", "$i"); }'
4.601 real 1.853 user 2.727 sys 99.55 cpu
Is it feasible to make 'include' have quasi-linear complexity in the
number of files?