Re: [Mldonkey-users] Is MLDonkey taking over or stats broken?


From: Logi
Subject: Re: [Mldonkey-users] Is MLDonkey taking over or stats broken?
Date: Tue, 31 Dec 2002 18:03:39 +0100
User-agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.2) Gecko/20021203

>> I don't think that can be changed unless MLDonkey gets an upload
>> queue or client identification is tied to MD4 AND IP.
>
> From source:
>
> donkeyClient.ml: let banned_ips = Hashtbl.create 113
>
> So it looks like IPs are banned, not Client-Hashes.

Yes, if a client is banned, its IP gets banned.

But IPs alone are not used for client identification, as there can be
several clients running on one IP on different ports.
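
For illustration, a minimal sketch (not MLDonkey code; all names invented)
of why an IP alone cannot tell two such clients apart, while an (IP, port)
pair can:

   (* Hypothetical sketch: key peers by (ip, port) so two clients
      behind the same IP stay distinct. *)
   let peers : (string * int, string) Hashtbl.t = Hashtbl.create 113

   let register_peer ~ip ~port ~name =
     Hashtbl.replace peers (ip, port) name

   let () =
     (* two clients sharing one IP on different ports *)
     register_peer ~ip:"10.0.0.1" ~port:4662 ~name:"client A";
     register_peer ~ip:"10.0.0.1" ~port:4663 ~name:"client B";
     Printf.printf "distinct peers on 10.0.0.1: %d\n" (Hashtbl.length peers)

Keyed by IP only, the second entry would simply overwrite the first.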

After looking at the source [which I should have done sooner :-( ]
it looks like the identification is done via sockets,
i.e. a combination of IP and port number:

   let request_for c file sock =
     if !!ban_queue_jumpers then
       try
         let record = Hashtbl.find old_requests (client_num c, file_num file) in
         if record.last_request +. 540. > last_time () then begin
           ...
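
If I read it right, a self-contained version of that check would look
roughly like the sketch below. Only the 540-second window and the
(client_num, file_num) key come from the quoted source; the record type
and helper names are my own:

   type request_record = { mutable last_request : float }

   let old_requests : (int * int, request_record) Hashtbl.t = Hashtbl.create 113

   (* true if this client asked for this file again within 540 s *)
   let is_queue_jump ~client_num ~file_num =
     let now = Unix.gettimeofday () in
     match Hashtbl.find_opt old_requests (client_num, file_num) with
     | Some r when r.last_request +. 540. > now ->
         r.last_request <- now;
         true
     | Some r ->
         r.last_request <- now;
         false
     | None ->
         Hashtbl.add old_requests (client_num, file_num) { last_request = now };
         false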

So it looks like the MD4 is only used for brand identification, and instead
of changing the MD4 one has to change sockets to queue-jump. As one cannot
easily change one's IP, the simple solution is to change the port.

As you can connect every 10 minutes with the same IP/port, you just need
additional port numbers for the time in between. Let's say one wants to
connect every 2 seconds:
10 min * 60 s/min / 2 s = 300, so one needs 300 additional ports.

- use port 20001
- poll MLDonkey client
- wait 2 s
- use port 20002
- poll MLDonkey client
- wait 2 s
and so on until you get an upload slot.
After 10 minutes you start with port 20001 again (sketched below).
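
Just to make that schedule concrete, a hypothetical sketch of the rotation
(port base and helper names invented, matching the arithmetic above):

   let base_port   = 20001
   let port_count  = 300          (* 10 min * 60 s / 2 s *)
   let poll_period = 2.0          (* seconds between polls *)

   (* source port used for the n-th connection attempt *)
   let port_for_attempt n = base_port + (n mod port_count)

   let () =
     Printf.printf "polling every %.0f s over %d ports\n" poll_period port_count;
     for n = 0 to 5 do
       Printf.printf "attempt %d -> source port %d\n" n (port_for_attempt n)
     done

Each port only comes around again after port_count * poll_period = 600 s,
so no port is reused inside the window.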

This would appear as 300 additional MLDonkey clients per abuser, but as
you've seen several thousand MLDonkeys, I'd say the explanation that
client_stats has a bug is more plausible. ;-)



