bug-coreutils

Re: cut fails with "cut: memory exhausted" when taking a large slice


From: Jim Meyering
Subject: Re: cut fails with "cut: memory exhausted" when taking a large slice
Date: Wed, 21 Apr 2004 23:02:12 +0200

Mordy Ovits <address@hidden> wrote:

> On Wednesday 21 April 2004 03:35 pm, Jim Meyering wrote:
>> If you want to continue using cut, you'll have better
>> luck with the latest:
>>
>>   ftp://ftp.gnu.org/gnu/coreutils/coreutils-5.2.1.tar.gz
>>   ftp://ftp.gnu.org/gnu/coreutils/coreutils-5.2.1.tar.bz2
>
> Excellent.  Will do.
>
>> Or, just use head and tail with their --bytes=N options.
>>
>> e.g., head --bytes=N < FILE | tail --bytes=412569600
>>
>> where N is chosen so that the head command outputs everything
>> in the file up to and including the desired range of bytes.
>
> Sure, but that reads the whole file and pushes it through the pipe.  That's
> far more IO than is strictly necessary.  This is especially true with a 9GB
> file.

dd is probably the best choice, then.

Using two separate dd processes is best, unless you can find a
reasonably large input block size that evenly divides both the initial
offset and the number of bytes you want to output; in that case a
single dd invocation can do the whole job.
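
For illustration, here is a minimal sketch of that single-invocation
case.  OFFSET and BS are hypothetical placeholder values (only the
412569600-byte length comes from this thread), and BS must divide both
numbers evenly:

  # extract 412569600 bytes starting at byte offset 8388608;
  # 4096 divides both values, so one dd can seek and copy by itself
  OFFSET=8388608
  LENGTH=412569600
  BS=4096
  dd ibs=$BS skip=$((OFFSET / BS)) count=$((LENGTH / BS)) < BIG > out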

In the two-process form, the first dd will use lseek to skip past the
initial 1*N_SKIP bytes without copying anything, and the second then
reads the desired range with an efficient block size:

( dd ibs=1 skip=N_SKIP count=0 && dd ibs=4096 count=100725 ) < BIG > out

since 4096 * 100725 == 412569600.
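
To spell that out with placeholder variables (OFFSET, LENGTH, and BS
are illustrative names, not from the original message): in the
two-process form BS only has to divide LENGTH, since the first dd
skips in 1-byte units, so the offset can be any byte count.

  OFFSET=123456789    # starting byte; need not be a multiple of BS
  LENGTH=412569600    # number of bytes to extract
  BS=4096             # must divide LENGTH evenly
  # the subshell shares one file descriptor on BIG, so the second dd
  # continues reading exactly where the first one stopped seeking
  ( dd ibs=1 skip=$OFFSET count=0 && dd ibs=$BS count=$((LENGTH / BS)) ) \
    < BIG > out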



