Hi
I'm using "sort" to, surprisingly, sort huge files (up to 2 GB). Of
course I had to use the -S switch to tune memory usage. My concern
is that I don't have enough memory to sort in RAM, so sort is using
temporary files that it merges to get the final result.
Having a look at sort's sources, I noticed that NMERGE is hardcoded
(it's not even an option).
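To make sure we're talking about the same mechanism: the merge phase I mean
is the classic external k-way merge, where NMERGE bounds how many sorted
temporary runs are combined at once, and extra passes are needed when there
are more runs than that. Here is a minimal sketch (in Python, just to
illustrate the idea; `merge_runs` and its `nmerge` parameter are my own
names, not sort's internals):

```python
import heapq
import tempfile

def merge_runs(run_files, out_file, nmerge=16):
    """Merge sorted runs at most `nmerge` at a time (illustrative sketch,
    mimicking how a hardcoded NMERGE limits each merge pass).
    Each element of `run_files` is an open text file whose lines are
    already sorted."""
    run_files = list(run_files)
    while len(run_files) > nmerge:
        # Too many runs for one pass: merge the first `nmerge` of them
        # into a new temporary run, then retry with the shorter list.
        batch, run_files = run_files[:nmerge], run_files[nmerge:]
        tmp = tempfile.TemporaryFile(mode="w+")
        tmp.writelines(heapq.merge(*batch))
        tmp.seek(0)
        run_files.append(tmp)
    # Final pass: few enough runs left to merge directly to the output.
    out_file.writelines(heapq.merge(*run_files))
```

With a smaller `nmerge`, the same data takes more intermediate passes (more
temporary I/O); with a larger one, each pass compares more streams at once,
which is presumably the trade-off behind tuning the value.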
My questions are:
- I wrote a patch to make sort accept a new option (currently called
-N) to set NMERGE. Is anyone interested in this patch?
- I'm running some benchmarks, and I have already noticed that 16 (the
hardcoded value) is not always the best choice. For example, my first
series of benchmarks showed that 18 is better in this particular
case. Has anyone worked on a model for predicting the best value of
NMERGE for a given set of data to sort?
Any hint would be appreciated.
Regards,
Paul
_______________________________________________
Bug-coreutils mailing list
address@hidden
http://mail.gnu.org/mailman/listinfo/bug-coreutils