Re: [Gnump3d-users] Out of memory when indexing files

From: Steve Kemp
Subject: Re: [Gnump3d-users] Out of memory when indexing files
Date: Tue, 6 Jul 2004 23:43:23 +0100
User-agent: Mutt/1.3.28i

On Tue, Jul 06, 2004 at 03:38:29PM -0700, darren david wrote:

> Thanks for the quick response. I gave it a shot, and it dumped out at
> 5601 files with the same "Out of memory!" error. I tried reducing the
> limit to 1000 files, which seemed to fix the memory error; however, it
> overwrote the existing tag cache every time it looped through. :(
> I see what you're getting at, though. Any other obvious trick I might try?

  Well, if you remove the existing cache file, then you could change the
 following line:

    open ( OUT, ">$cache" ) or $error = 1;

  at the top of the function 'indexFiles' to read:

    open ( OUT, ">>$cache" ) or $error = 1;

  ('>' becomes '>>').

  That should cause future entries to be appended to the cache rather
 than overwriting the earlier ones.
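  As a standalone illustration of the difference (not gnump3d's actual
 indexing code; the file path and entry strings here are made up), '>'
 truncates the file on each open while '>>' preserves what is already
 there:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical cache path, just for this demo.
    my $cache = "/tmp/demo-tag-cache.$$";

    # First batch: '>' truncates, so anything already in the file is lost.
    open( OUT, ">$cache" ) or die "open: $!";
    print OUT "first-batch-entry\n";
    close( OUT );

    # Second batch: '>>' appends, so the first batch survives.
    open( OUT, ">>$cache" ) or die "open: $!";
    print OUT "second-batch-entry\n";
    close( OUT );

    open( IN, "<$cache" ) or die "open: $!";
    my @lines = <IN>;
    close( IN );

    print scalar(@lines), " entries kept\n";
    unlink( $cache );

  Run with '>' in both opens and only one entry would remain; with the
 second open using '>>', both batches end up in the cache.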

  I can see that I'm going to have to think through a real solution
 though...  Grr!  ;)

# The Debian Security Audit Project.
