Re: [circle] Many notes on my Circle experience


From: Asheesh Laroia
Subject: Re: [circle] Many notes on my Circle experience
Date: Mon, 15 Sep 2003 20:27:05 -0400 (EDT)

On Tue, 16 Sep 2003, Jiri Baum wrote:

> Asheesh:
> > What would be the easiest way for clients to determine the size in
> > files of the network, if we can assume that never will more than 400
> > people be connected?
>
> Easiest formula:
>
>      number of files I'm responsible for
> ---------------------------------------------
>   fraction of hashtable I'm responsible for
>
> Not the most accurate number, but should be a reasonable guess, it's
> easy to calculate, and works regardless of the number of peers.

Would this count unique files?  There are two ways to count something.
Let's ask, "How many words are there in Alice in Wonderland?"

The answer is either "12,000" (counting every word) or "500" (counting
only the distinct words), or so.  I don't want the number of unique files,
I just want the raw count.

Actually, what I really want is the total amount of data, in GiB or
terabytes or whatever.
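If the same scaling works for bytes, Jiri's estimate would give me both
numbers at once.  Here's a rough sketch in Python (the names are mine,
purely for illustration - not actual Circle API):

    def estimate_totals(local_entries, local_bytes, my_fraction):
        """Estimate whole-network totals from this node's slice.

        local_entries -- file entries this node is responsible for
        local_bytes   -- combined size of those entries, in bytes
        my_fraction   -- share of the hashtable this node owns (0 < f <= 1)
        """
        est_files = local_entries / my_fraction
        est_gib = (local_bytes / my_fraction) / 2.0 ** 30
        return est_files, est_gib

    # e.g. a node holding 150 entries totalling 6 GiB while owning 0.5%
    # of the hashtable would guess about 30,000 files and ~1,200 GiB.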

> > And can file hashing be serialized or saved SOMEHOW between starts of
> > the program?  Perhaps based on a combination of file size,
> > modified-time, and name.  We are unlikely to have people TRY to defeat
> > this in our network, and we could have random file rehashing perhaps.
>
> Anyone who tries to defeat things could just wait until the hashing ends
> and defeat it then, anyway - nothing new here.

Yes.  This is a strong feature request, because I think a massive part of
why the win32 client is sometimes absurdly slow is its unintelligently
resource-intensive hashing.  (But that brings me to my next email...)

If no one else will write this, and the rest of you can agree on a
straightforward design for it, I might even hack it out myself.
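For the record, the straightforward design I have in mind is roughly the
following (only a sketch; the names, cache location, and choice of SHA-1
are placeholders, not Circle's actual code): keep a pickled dict of
{path: ((size, mtime), hash)}, and only rehash a file when its size or
mtime has changed - plus the occasional random re-check.

    import hashlib
    import os
    import pickle
    import random

    CACHE_PATH = os.path.expanduser('~/.circle-hash-cache')  # hypothetical

    def load_cache(path=CACHE_PATH):
        # A missing or corrupt cache just means we start from scratch.
        try:
            with open(path, 'rb') as f:
                return pickle.load(f)
        except (OSError, pickle.PickleError):
            return {}

    def save_cache(cache, path=CACHE_PATH):
        with open(path, 'wb') as f:
            pickle.dump(cache, f)

    def file_hash(path, cache, recheck=0.01):
        """Return the file's digest, reusing the cached value unless the
        size or mtime changed, or we randomly decide to double-check."""
        st = os.stat(path)
        stamp = (st.st_size, int(st.st_mtime))
        cached = cache.get(path)
        if cached and cached[0] == stamp and random.random() >= recheck:
            return cached[1]
        digest = hashlib.sha1()
        with open(path, 'rb') as f:
            for block in iter(lambda: f.read(1 << 16), b''):
                digest.update(block)
        cache[path] = (stamp, digest.hexdigest())
        return cache[path][1]

The cache is keyed on the path, and the stamp covers size and mtime, so
all three of the fields mentioned above decide whether a file gets
rehashed.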

-- Asheesh.

-- 
<zpx> it's amazing how "not-broken" debian is compared to slack and rh



