Re: Memory allocation problem with latest gawk
From: Aharon Robbins
Subject: Re: Memory allocation problem with latest gawk
Date: Thu, 3 Apr 2003 12:59:55 +0300
> Date: Wed, 2 Apr 2003 21:07:45 +0400
> From: Stanislav Ievlev <address@hidden>
> To: Aharon Robbins <address@hidden>
> Cc: address@hidden, Stepan Kasal <address@hidden>
> Subject: Re: Memory allocation problem with latest gawk
>
> On Wed, Apr 02, 2003 at 03:54:20PM +0300, Aharon Robbins wrote:
> > I can't reproduce this under RH Linux 8.0. What system and
> > compiler are you using?
>
> I'm using a system (ALT Linux Sisyphus) _with memory limits_ for processes.
> Compiler: gcc-3.2.1.
>
> I'm sure you need a bigger file to reproduce it.
>
> Try something like this (or create a 100M file):
>
> gawk '/.*/ {;}' /dev/zero
>
> The new version eats more memory than the old one (memory leak?). It's very bad.
Again, I can't reproduce this. I don't have any memory limits set.
$ dd if=/dev/zero bs=1024 count=500000 of=/tmp/lll1
500000+0 records in
500000+0 records out
$ ls -l /tmp/lll1
-rw-r--r-- 1 arnold wheel 512000000 Apr 3 12:42 /tmp/lll1
$ gawk-3.1.2 '/.*/ {;}' /tmp/lll1
$
If your system is set up to have memory allocation fail at some point,
then it's not surprising that gawk dies: /dev/zero never produces a newline,
so gawk has to keep extending the size of the record buffer until
allocation fails.
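The failure mode can be sketched like this; the 50 MB `ulimit` figure is an arbitrary illustration, not a value from the original report, and the exact fatal message varies by gawk version:

```shell
# /dev/zero yields an endless stream of NUL bytes and never a newline,
# so gawk keeps reallocating an ever-larger record buffer.

# Confirm there is no newline in a 1 MB sample of /dev/zero:
head -c 1048576 /dev/zero | wc -l        # prints 0

# Under a virtual-memory cap, the buffer growth eventually makes
# allocation fail and gawk aborts with a fatal error:
( ulimit -v 51200; gawk '/.*/ {;}' /dev/zero )
```

Without such a limit, the same command simply grows until the machine runs out of memory or swap, which is why the behavior was not reproducible on an unrestricted system.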
Sorry.
Arnold