
Re: [Gnump3d-users] Out of memory when indexing files


From: darren david
Subject: Re: [Gnump3d-users] Out of memory when indexing files
Date: Tue, 06 Jul 2004 15:38:29 -0700
User-agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8a1) Gecko/20040520

Hi Steve-

thanks for the quick response. I gave it a shot, and it dumped out at 5601 files with the same "Out of memory!" error. I tried reducing the limit to 1,000 files, which fixed the memory error; however, it overwrote the existing tag cache every time it looped through. :(

I see what you're getting at, though. Any other obvious trick I might try?
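One possible explanation for the clobbered cache (an assumption on my part; I haven't looked at what indexFiles actually does with the cache file): if each batch opens the cache in truncate mode (`>`), every flush wipes the previous batch, whereas append mode (`>>`) would let the batches accumulate. A minimal demo of the difference, using a throwaway temp file as a stand-in for the real tag cache:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);

my $dir   = tempdir( CLEANUP => 1 );
my $cache = "$dir/tags.cache";   # stand-in for the real tag cache

# Write one batch to the cache using the given open mode.
sub write_batch {
    my ( $mode, $batch ) = @_;
    open my $fh, $mode, $cache or die "open: $!";
    print $fh "$batch\n";
    close $fh;
}

sub count_lines {
    open my $fh, '<', $cache or die "open: $!";
    my @lines = <$fh>;
    return scalar @lines;
}

# Truncate mode ('>'): the second flush clobbers the first batch.
write_batch( '>', "batch $_" ) for 1 .. 2;
my $after_truncate = count_lines();
print "$after_truncate line(s) after truncate-mode flushes\n";   # 1

unlink $cache;

# Append mode ('>>'): both batches survive.
write_batch( '>>', "batch $_" ) for 1 .. 2;
my $after_append = count_lines();
print "$after_append line(s) after append-mode flushes\n";       # 2
```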

thanks,
d.


Steve Kemp wrote:
On Tue, Jul 06, 2004 at 03:11:45PM -0700, darren david wrote:


this is my first post, so i just wanted to start with a little bit of praise. i've been using gnump3d for several months now and it has been *fantastic*. thanks for pulling this together. i'd love to write a flash client for it someday... ;)


  Thanks - always nice to hear people are using the code :)


so, on to my question. I'm running gnump3d 2.8 on OpenBSD 3.5-current. I've got 512MB of RAM, and approximately 28,000 MP3 files. Gnump3d can index ~15,900 of those files before quitting with an "Out of memory!" error. 'gnump3d-index --stats' gives me:

 Total number of songs: 15914
 Total size of archive: 99Gb (106308641063 bytes)
 Total playlength     : 60 days, 6 hours, 52 mins 51 seconds

Have I encountered a limitation of the program, or of my hardware? Either way, is there a workaround?


  It's not a limit of the program as such; I think you're hitting
 the out-of-memory error because of the inefficient coding of the
indexer. Essentially, the code:

        1.  Recursively builds up a list of all the files beneath
            your root.
        2.  Walks that list, processing the files one by one to
            extract their tags into an index.

  There's a quick hack you could try: once more than, say,
 10,000 tracks have been found, immediately stop collecting,
 write out the tags for that batch, and then continue scanning.

  If you're able to experiment, I could see something like this working:

sub findAudio
{
    my ( $file ) = $File::Find::name;

    print $file . "\n" if ( $DEBUG );

    # Skip anything that isn't an audio file, or is empty.
    return if ( ! isAudio( $file ) );
    return if ( -z $file );

    # Once the pending list passes 10,000 entries, index the batch
    # and start a fresh list. The current file is always pushed
    # afterwards, so no track is dropped at the batch boundary.
    if ( $#FOUND > 10000 )
    {
        &indexFiles();
        @FOUND = ();
    }

    push @FOUND, $file;
}

  If you could try replacing the existing subroutine with that and
give it a go, I'd appreciate it. Hopefully it's clear what's happening.
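For completeness, here is a sketch of how that subroutine might be wired into File::Find and exercised against a throwaway tree. The batch size, the bodies of isAudio and indexFiles, and the dummy files are my stand-ins for illustration, not gnump3d's actual code:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempdir);

our @FOUND;
our @INDEXED;       # demo-only: collects everything flushed
our $BATCH = 3;     # tiny batch for the demo; the mail suggests 10,000

# Stand-ins for gnump3d's real helpers (bodies are assumptions):
sub isAudio    { return $_[0] =~ /\.(mp3|ogg)$/i }
sub indexFiles { push @INDEXED, @FOUND }

sub findAudio
{
    my ( $file ) = $File::Find::name;

    return if ( ! isAudio( $file ) );
    return if ( -z $file );

    # Flush a full batch, then keep collecting; the current file
    # is always pushed, so nothing is lost at the boundary.
    if ( @FOUND >= $BATCH )
    {
        indexFiles();
        @FOUND = ();
    }
    push @FOUND, $file;
}

# Build a throwaway tree of 7 non-empty dummy tracks.
my $root = tempdir( CLEANUP => 1 );
for my $i ( 1 .. 7 )
{
    open my $fh, '>', "$root/track$i.mp3" or die "open: $!";
    print $fh "x";
    close $fh;
}

find( \&findAudio, $root );
indexFiles() if @FOUND;    # don't forget the final partial batch

print "indexed " . scalar(@INDEXED) . " files\n";   # indexed 7 files
```

The final flush after find() returns matters: without it, the last partial batch would never reach the index.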

Steve
---
Edinburgh System Administrator : Linux, UNIX, Windows
Looking for an interesting job : http://www.steve.org.uk/

