
Re: [Help-bash] bash suitable for parsing big files?

From: Dennis Williamson
Subject: Re: [Help-bash] bash suitable for parsing big files?
Date: Thu, 12 Sep 2013 20:41:56 -0500

On Sep 12, 2013 7:05 PM, "adrelanos" <address@hidden> wrote:
> Hi,
> I've been using:
> mapfile -t lines < "/var/lib/dpkg/status"
> for line in "${lines[@]}"; do
> ... (parsing it with things like awk, ${var:0:6}, ${var,,} and
> pkg_arch[$package]="$arch".) ...
> For those who don't know /var/lib/dpkg/status, its size is roughly 2 MB
> and it contains roughly 50,000 lines.
> Parsing it with bash takes a long time.
> Is there any way to speed it up or is bash not the right tool for
> parsing such big files?
> All the best,
> adrelanos

Reading the whole file into an array is the wrong approach for a file that size. Use a while read loop instead. Also, calling external utilities many times inside a loop can be very slow, since each call forks a new process. Keep in mind that tools like awk and grep iterate over the lines of a file for free.
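A minimal sketch of the while-read approach, using a tiny sample in the dpkg status format (the Package/Architecture field names are real; the package data here is made up for illustration):

```shell
#!/usr/bin/env bash
# Build pkg_arch[package]=arch without loading the whole file into an array.
status=$(mktemp)
cat > "$status" <<'EOF'
Package: foo
Architecture: amd64

Package: bar
Architecture: i386
EOF

declare -A pkg_arch
package=
while IFS= read -r line; do
    case $line in
        'Package: '*)      package=${line#Package: } ;;
        'Architecture: '*) pkg_arch[$package]=${line#Architecture: } ;;
    esac
done < "$status"

echo "${pkg_arch[foo]}"   # prints: amd64
rm -f "$status"
```

The case statement and ${line#...} expansions are all bash builtins, so no external process is started per line.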

Ultimately, it comes down to "What are you really trying to do?"
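For comparison, if the goal is just to extract package/architecture pairs, a single awk invocation can do the whole pass at once (a sketch under the same assumption about the file's field names; the sample data is hypothetical):

```shell
#!/usr/bin/env bash
# One awk process reads every line; no per-line fork/exec cost.
status=$(mktemp)
cat > "$status" <<'EOF'
Package: foo
Architecture: amd64

Package: bar
Architecture: i386
EOF

result=$(awk '/^Package: /{p=$2} /^Architecture: /{print p "=" $2}' "$status")
rm -f "$status"
printf '%s\n' "$result"
```

This is usually the fastest option when the per-line work fits in awk, because the loop runs inside one process instead of spawning a utility per line.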
