
Re: [Help-gnunet] how fast should gnunet be?


From: Igor Wronsky
Subject: Re: [Help-gnunet] how fast should gnunet be?
Date: Sun, 1 Sep 2002 02:08:01 +0300 (EEST)

On Fri, 30 Aug 2002, Christian Muellner wrote:

> I asked some days ago about content migration.
> I have 10 GB and now I want 5 GB for my indexing/inserting (so I am sure the
> content is available (if I am online) and never gets old, like Freenet I guess).
> And the other 5 GB I want for data migration. Then the content would be MUCH
> more available and GNUnet would become more attractive.
> This should always be changeable, to stay flexible!
> At the moment it's hard to get even a single small file out of it.

Have you reconfigured gnunet to support such a large amount of space? I think
that has to be done manually.
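
For reference, one quick way to see where that limit lives is to grep the
daemon's configuration for a quota setting. This is only a sketch: the file
locations and the option name (something like DISKQUOTA, given in megabytes)
are assumptions from memory and may differ in your version.

  # Look for the storage quota in gnunetd's configuration. The option name
  # (e.g. DISKQUOTA, in MB) and the file paths are assumptions -- check
  # your own install.
  grep -in "quota" /etc/gnunet.conf "$HOME/.gnunet/gnunet.conf" 2>/dev/null

  # After raising the value, restart gnunetd so the new limit takes effect.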

My personal experience is that if a file is currently in gnunet,
it comes through. It might happen that the original inserter
has gone offline before anybody has successfully downloaded the
file, so it is only partly available. Also, some people
might have inserted content while the datastore code
had a bug that allowed blocks to disappear,
or, even worse, they might still be running the old code.

If you are in doubt about gnunet working, or suspect some bug,
try the two scripts in contrib/, "junkinsert.sh" and "junklookup.sh",
which are enough to test the basic functionality with harmless,
fresh (previously undownloaded by you) content. For example, I've
always been able to download the daily piece of junk inserted by
Christian, who runs a 24h node. Try to fetch some, or insert
some of your own and ask if anyone can get it.
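
If you want to roll your own variant of that test, a minimal sketch might
look like the one below. It is purely illustrative: the keyword convention
and the -k option are assumptions, not what junkinsert.sh actually does, so
check the scripts and the --help output of the tools on your installation.

  #!/bin/sh
  # Illustrative sketch only -- not the real contrib/junkinsert.sh.
  # Create a fresh piece of random "junk" so the test cannot be satisfied
  # from anything you have downloaded before.
  KEY="junk-$(date +%Y-%m-%d)-$USER"
  FILE="/tmp/$KEY"
  dd if=/dev/urandom of="$FILE" bs=1024 count=512

  # Insert it under a predictable keyword (the -k option is an assumption;
  # see gnunet-insert --help for your version).
  gnunet-insert -k "$KEY" "$FILE"

  # Anyone who knows the keyword can now search for and download the file,
  # which exercises insertion, routing and retrieval end to end.
  echo "Ask another node to try: gnunet-search $KEY"

Running the lookup from a different node (or after clearing your own local
content) is what makes the result meaningful.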


I.






