
Re: [Help-gnunet] gnunet-check segfaults (broken gdbm files?)


From: Benjamin Kay
Subject: Re: [Help-gnunet] gnunet-check segfaults (broken gdbm files?)
Date: Wed, 10 Dec 2003 20:38:37 +0000
User-agent: KMail/1.5.4

On Wednesday 10 December 2003 03:26 pm, Markku Tavasti wrote:
> Benjamin Kay <address@hidden> writes:
> > I have found that mysql is faster (almost twice as fast on my slow
> > machine)
>
> Meaning my 4 week insert might be run in 2 weeks! I suppose even
> faster, since in insert all the time is spent on waiting seeking
> hd.

Whoa! A 4-week insert? Either your computer is very slow, or the data you are 
trying to insert is very large (upwards of several gigabytes). If neither 
of these is the case, then your slow insert is the result of either an error 
in gnunet or an error in gdbm, and is not a typical insert time. Using gdbm, 
196MB of RAM, a 7200 RPM HD, and a 400MHz processor, inserting a 700MB file 
takes me between 12 and 18 hours, not four weeks!

In the event that you are inserting a very large file... well, I don't want to 
speak on behalf of the developers, but I don't think GNUnet is intended for 
very large files and I'm not sure if it's capable of handling them. Inserting 
lots of ordinary files (such as songs, software, iso images, etc.) is 
legitimate, and inserting several gigabytes split up into several smaller 
files should work (and could maybe take you a week or so). Are you having 
gnunet-insert recursively insert/index a large directory?

> And if mysql buffers in RAM properly (I have at least 96M), it
> might speed up quite much. Maybe we'll see.

I'm running a machine with 196MB of RAM, and I still don't seem to have enough 
to get mysql to work as well as it should. Consider buying more RAM, 
especially if you plan to have other mysql databases open at the same time as 
gnunet's. 96MB should be enough for basic mysql functionality without 
impairing other typical processes.
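If RAM is that tight, capping mysqld's buffers keeps it from fighting gnunetd and everything else for memory. A minimal my.cnf sketch, with values that are illustrative guesses for a ~96MB box rather than tested recommendations:

```ini
# Hypothetical my.cnf fragment for a low-memory machine sharing RAM
# with gnunetd; tune the numbers to taste.
[mysqld]
key_buffer_size  = 16M
table_cache      = 64
sort_buffer_size = 2M
```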

> I don't count that much on conversion, since db might be badly
> broken. I expect I have to insert everything again.

Agreed. Just make sure to delete everything under data/afs before the switch 
so that gnunet doesn't insist on looking for the old gdbm database (I think 
that's correct).
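For what it's worth, the cleanup step would look something like the sketch below. The ~/.gnunet location and the data/afs layout are my assumptions from this thread, so check your gnunetd configuration for the real path, and stop gnunetd first. The sketch rehearses the rm on a throwaway copy so you can see what it does without touching anything real:

```shell
# Sketch only -- the ~/.gnunet path and data/afs layout are assumptions
# from this thread; check gnunetd's config for the real directory and
# stop gnunetd before deleting anything.
GNUNET_HOME="${GNUNET_HOME:-$HOME/.gnunet}"

# Rehearse on a throwaway copy so nothing real is touched:
demo=$(mktemp -d)
mkdir -p "$demo/data/afs"
touch "$demo/data/afs/content.gdb"

# The actual cleanup step (point it at "$GNUNET_HOME" when you mean it):
rm -rf "$demo/data/afs"

if [ -d "$demo/data/afs" ]; then status="afs still present"; else status="afs removed"; fi
echo "$status"
rm -rf "$demo"
```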

> And maybe this
> time I use gnunet-insert -n since runnign gnunet-check after removing
> some files will take for ages.

"gnunet-insert -n" will insert, not index, files. This requires more HD space 
and takes longer than simple indexing. It also makes it difficult to know 
which files you have already inserted. The only reasons I can think of to 
insert rather than index are deniability, or if the original file will 
be deleted.

gnunet-check does not "defragment," clean, or improve a mysql database; this 
is done automatically by mysqld without your having to worry about it. There 
is no need to run gnunet-check unless you suspect the database is damaged. 
Since gnunet-check and gnunetd cannot be running at the same time, I would 
advise against running gnunet-check unless you absolutely have to.

Good luck!




