Subject: Re: stat benchmark
From: Alexander Larsson <>
Date: Mon, 28 Apr 2008 21:41:36 +0200
On Mon, 2008-04-28 at 02:13 +0200, Carl Henrik Lunde wrote:
> On Mon, Apr 28, 2008 at 1:29 AM, Soeren Sandmann <sandmann@daimi.au.dk> wrote:
> [...]
> > For a directory of ~2360 files, chunks of a 1000 files is actually
> > surprisingly worse than statting all of the files at once:
> >
> > Time to stat 1000 files: 1.008735 s
> > Time to stat 1000 files: 0.738936 s
> > Time to stat 366 files: 0.217002 s
> >
> > I guess this just shows that seeks really is pretty much all that
> > matters. Glib should maybe use a larger chunk size.
>
> I agree, if I remember correctly I did not find a directory on my local
> disk where the best result was to sort a chunk instead of the complete
> directory.
I don't think that is expected either. The reason for the chunking is to avoid unlimited memory use on large directories.
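
(For reference, a minimal sketch of the scheme being discussed: read all directory entries up front, then stat them in fixed-size chunks, sorting each chunk by inode number so the stat() calls walk the inode table roughly in order and seek less. The CHUNK value, the hypothetical by_ino helper, and the error handling are placeholder choices for illustration, not what glib actually does.)

/* Sketch only: chunked, inode-sorted stat of one directory. */
#include <dirent.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>
#include <sys/types.h>

#define CHUNK 1000   /* placeholder chunk size, taken from the benchmark above */

struct entry { char name[256]; ino_t ino; };

static int by_ino(const void *a, const void *b)
{
    ino_t ia = ((const struct entry *)a)->ino;
    ino_t ib = ((const struct entry *)b)->ino;
    return (ia > ib) - (ia < ib);
}

int main(int argc, char **argv)
{
    const char *path = argc > 1 ? argv[1] : ".";
    DIR *dir = opendir(path);
    if (!dir) { perror("opendir"); return 1; }

    struct entry *ents = NULL;
    size_t n = 0, cap = 0;
    struct dirent *de;

    /* Collect all names and inode numbers first. */
    while ((de = readdir(dir)) != NULL) {
        if (n == cap) {
            cap = cap ? cap * 2 : 256;
            ents = realloc(ents, cap * sizeof *ents);
            if (!ents) { perror("realloc"); return 1; }
        }
        snprintf(ents[n].name, sizeof ents[n].name, "%s", de->d_name);
        ents[n].ino = de->d_ino;
        n++;
    }
    closedir(dir);

    /* Sort each chunk by inode before statting. Setting CHUNK to the
     * directory size corresponds to sorting the whole directory at once,
     * which is the case the benchmark found fastest. */
    for (size_t off = 0; off < n; off += CHUNK) {
        size_t len = n - off < CHUNK ? n - off : CHUNK;
        qsort(ents + off, len, sizeof *ents, by_ino);
        for (size_t i = off; i < off + len; i++) {
            char full[4096];
            struct stat st;
            snprintf(full, sizeof full, "%s/%s", path, ents[i].name);
            if (lstat(full, &st) != 0)
                perror(full);
        }
    }
    free(ents);
    return 0;
}

The chunk boundary is the only thing trading memory for seek locality here: a bigger chunk means more entries held in memory at once, but also a longer sorted run of inodes per pass, which is exactly the tension the thread is about.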