From: Martin Uecker
Subject: Re: [GNUnet-developers] update mechanism
Date: Mon, 19 Aug 2002 14:05:25 +0200
User-agent: Mutt/1.4i

On Sun, Aug 18, 2002 at 05:07:38PM -0500, Christian Grothoff wrote:
> On Sunday 18 August 2002 11:42 am, Martin Uecker wrote:

[...]

> > We can start rating down submitters who provide false
> > metadata which can be automatically checked
> > (SHA-1, MD5, and others). This is more protection against
> > fraud than spam prevention, but still a good idea.
> 
> You can *never* rank down. That's the same problem as with negative trust
> (read: an excess-based economy). Submitters are anonymous, but even if they
> signed the RNode, they can use a pseudonym exactly once, in which case
> negative ratings are useless (they would only apply after the damage has
> been done). You can only do positive rankings.

I *do* understand the current system. But I predict that it won't
survive spam. The RIAA seems to be paying firms to use spam as a
DoS attack against file-sharing networks. Everybody can censor
everything just by inserting a lot of junk under the same keyword.

(BTW: negative ratings for one-time pseudonyms aren't
useless if they are distributed to other people.)
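
To make the "automatically checked" part of the quoted paragraph
above concrete, here is a minimal sketch (in Python for brevity;
GNUnet itself is C, and the function name and metadata format are
made up for illustration): the node recomputes the claimed digests
over the content it actually retrieved and flags any mismatch.

import hashlib

def verify_metadata(content: bytes, claimed: dict) -> bool:
    """Return True only if every claimed digest matches the content.

    `claimed` maps a hashlib algorithm name (e.g. "sha1", "md5")
    to the hex digest the submitter advertised in the metadata.
    """
    for algo, claimed_hex in claimed.items():
        h = hashlib.new(algo)
        h.update(content)
        if h.hexdigest() != claimed_hex:
            return False  # provably false metadata
    return True

# Example: a submitter whose claimed SHA-1 does not match is caught.
data = b"some shared file"
assert verify_metadata(data, {"sha1": hashlib.sha1(data).hexdigest()})
assert not verify_metadata(data, {"sha1": "0" * 40})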

[...]

> >
> > Why pseudonyms? Individuals are best identified by the hashes
> > of their pub keys. Those can't be hijacked because the owner
> > can prove with a signature that he is the owner of his pub
> > key.
> 
> Well, the hash of the public key *is* a pseudonym, as long as there is not a 

Okay.

> public database that links public keys to individuals. And since you can make 
> up new public keys anytime, you can make new pseudonyms at any time - as many 
> as you feel like.

Yes. People might prefer to have many pseudonyms so that
different pseudonymous activities can't be linked together.

But I don't think that a newly created pseudonym should
automatically be trusted enough to insert R blocks which
then appear in the search results of everybody else.

A node should not insert (or return as a search result)
R blocks which come from less-trusted pseudonyms if there
are R blocks from more-trusted pseudonyms which match
the same keywords. (It might be a good idea to randomly
make exceptions to this rule; a sketch follows.)
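
Here is a minimal sketch of that selection rule (Python for
brevity; the RBlock shape and the trust table are assumptions, not
the real GNUnet data structures): only blocks from the most trusted
matching pseudonyms are returned, but a small random fraction of
lower-trust blocks slips through so that new pseudonyms are not
locked out forever.

import random
from dataclasses import dataclass

@dataclass
class RBlock:
    pseudonym: str    # hash of the submitter's public key
    content_key: str  # what the block points to

def select_rblocks(candidates, trust, exception_rate=0.05):
    """Pick which matching R blocks to return for one keyword.

    `trust` maps a pseudonym to a non-negative trust score;
    unknown pseudonyms count as 0.0.
    """
    if not candidates:
        return []
    best = max(trust.get(b.pseudonym, 0.0) for b in candidates)
    result = []
    for b in candidates:
        if trust.get(b.pseudonym, 0.0) >= best:
            result.append(b)  # highest-trust blocks always win
        elif random.random() < exception_rate:
            result.append(b)  # the occasional random exception
    return result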

At the same time, the user should rate the results locally
using his personal database. (That is what Igor proposes,
too.) But this won't help get the bad R blocks out
of the network.

Another idea: the servers expire old blocks after some
time (even when they are requested often). To keep the good
content on the network, the search clients reinsert
search results which are locally rated high.
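
A minimal sketch of both halves together (again Python, with all
names invented; the flat dict stands in for whatever the real block
store looks like): the server expires blocks by age alone, and the
client's reinsertion of well-rated results is what keeps refreshing
the good content.

import time

EXPIRY_SECONDS = 30 * 24 * 3600  # server-side lifetime of a block

def expire_old_blocks(store):
    """`store` maps block id -> insertion timestamp."""
    now = time.time()
    for block_id, inserted_at in list(store.items()):
        if now - inserted_at > EXPIRY_SECONDS:
            del store[block_id]  # expires even if requested often

def reinsert_good_results(results, local_rating, store, threshold=1):
    """Client side: push locally well-rated results back in."""
    for block_id in results:
        if local_rating.get(block_id, 0) >= threshold:
            store[block_id] = time.time()  # reset the expiry clock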


Martin

[I won't be able to answer my mail in the next few days.]





