Re: [Bug-wget] Support for long-haul high-bandwidth links


From: Ángel González
Subject: Re: [Bug-wget] Support for long-haul high-bandwidth links
Date: Sat, 26 Nov 2011 17:40:22 +0100
User-agent: Thunderbird

On 10/11/11 03:24, Andrew Daviel wrote:
>
> When downloading a large file over a high-latency (e.g. long physical
> distance) high-bandwidth link, the download time is dominated by the
> round-trip time of TCP acknowledgements rather than by the bandwidth.
>
> In the past tools such as bbftp have mitigated this effect by using
> multiple streams, but required both a special server and client.
>
> Using the Range header in HTTP/1.1, it is possible to start multiple
> simultaneous requests for different portions of a file using a
> standard Apache server, and achieve a significant speedup.
> I have a proof-of-principle Perl script using threads which was able
> to download a medium-sized file from Europe to Vancouver in half the
> normal time.
>
> I wondered if this was of interest as an enhancement for wget.
>
> regards
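
A minimal sketch of the ranged-download approach described above, in
Python rather than the original Perl script; the URL, stream count, and
function names here are placeholders, and it assumes the server honors
Range requests and reports a Content-Length:

    import threading
    import urllib.request

    URL = "http://example.org/big.file"  # placeholder URL
    STREAMS = 4  # number of parallel range requests; a tunable guess

    def fetch_range(url, start, end, out, index):
        # Ask for one byte range; a 206 Partial Content reply carries
        # just that slice of the file.
        req = urllib.request.Request(
            url, headers={"Range": f"bytes={start}-{end}"})
        with urllib.request.urlopen(req) as resp:
            out[index] = resp.read()

    def parallel_get(url, streams=STREAMS):
        # Learn the total size first so the file can be split into ranges.
        head = urllib.request.Request(url, method="HEAD")
        size = int(urllib.request.urlopen(head).headers["Content-Length"])
        chunk = size // streams
        parts = [None] * streams
        threads = []
        for i in range(streams):
            start = i * chunk
            end = size - 1 if i == streams - 1 else start + chunk - 1
            t = threading.Thread(target=fetch_range,
                                 args=(url, start, end, parts, i))
            t.start()
            threads.append(t)
        for t in threads:
            t.join()
        return b"".join(parts)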

I think setting a big SO_RCVBUF should also fix your issue, by allowing
big TCP window sizes, and it's cleaner.
OTOH, it needs support from the TCP stack, and unlike multiple
connections it won't get around per-connection rate limits that may be
what is throttling you in the single-connection case.
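
As a minimal sketch of that suggestion (using Python's socket module;
the host, port, and 4 MB figure are placeholders), the buffer has to be
requested before connect() so that the TCP window scale option can be
negotiated during the handshake:

    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Request a large receive buffer *before* connecting, so the TCP
    # window scale option can be negotiated in the handshake.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 4 * 1024 * 1024)
    sock.connect(("example.org", 80))
    # Report what the stack actually granted; on Linux the value is
    # capped by net.core.rmem_max and reported doubled for kernel
    # bookkeeping.
    print(sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))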



