
From: Andrew Daviel
Subject: Re: [Bug-wget] Support for long-haul high-bandwidth links
Date: Thu, 1 Dec 2011 15:33:25 -0800 (PST)

On Wed, 30 Nov 2011, Daniel Stenberg wrote:

> On Wed, 30 Nov 2011, Fernando Cassia wrote:

>> When downloading a large file over a high-latency (e.g. long physical distance) high-bandwidth link, the download time is dominated by the round-trip time for TCP handshakes.

> First off, this early conclusion is incorrect. RTT has basically no impact on an ongoing TCP transfer these days, since large windows were introduced a decade or so ago.

I may be wrong, but I thought that to get significant benefit large windows had to be enabled on every router between the source and destination, which I did not think was the case on the public Internet.
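
To put rough numbers on it (illustrative figures, not measurements from this thread): a single TCP stream can move at most about window/RTT, so filling a long-haul pipe needs a window of at least bandwidth times RTT. A minimal Python sketch:

  # Assumed figures: 100 Mbit/s link, 150 ms trans-Atlantic RTT.
  link_bps = 100e6
  rtt_s = 0.150

  # Window needed to fill the pipe: the bandwidth-delay product.
  bdp_bytes = (link_bps / 8) * rtt_s
  print("window to fill the pipe: %.0f KB" % (bdp_bytes / 1024))   # ~1831 KB

  # Throughput capped by the classic 64 KB window (no window scaling).
  win = 64 * 1024
  print("at 64 KB window: %.1f Mbit/s" % (win * 8 / rtt_s / 1e6))  # ~3.5 Mbit/s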

>> Which is why large files should be stored on FTP servers, not HTTP.

I recall that BABARftp and GridFTP support multi-stream downloads, but regular FTP does not. I'd agree with Daniel: FTP offers no advantage over HTTP for serving files, and it lacked SSL support for so long that some organizations deprecated it as insecure.

Re. Axel - thanks for the link, FC; I hadn't heard of it. It seems to start an initial full-length transfer, then kill it with TCP resets if the server supports ranges (a sketch of that technique is below). I find wget slightly faster fetching a photo from my work to my home (in the same city), but axel faster getting a large file from the other side of the world, as expected. No DAV/upload ability, though.
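
For the curious, a minimal Python sketch of that range-splitting technique (the URL and chunk count are hypothetical, and a real tool would stream chunks to disk rather than buffer them in memory):

  import concurrent.futures
  import urllib.request

  URL = "http://example.org/large.iso"   # hypothetical URL
  CHUNKS = 4                             # hypothetical connection count

  def fetch(span):
      # Fetch one byte range of the file on its own connection.
      start, end = span
      req = urllib.request.Request(URL,
                                   headers={"Range": "bytes=%d-%d" % (start, end)})
      with urllib.request.urlopen(req) as resp:
          return start, resp.read()

  # Probe the server for the file size and whether it accepts ranges.
  head = urllib.request.Request(URL, method="HEAD")
  with urllib.request.urlopen(head) as resp:
      size = int(resp.headers["Content-Length"])
      ranged = resp.headers.get("Accept-Ranges") == "bytes"

  if ranged:
      step = size // CHUNKS
      spans = [(i * step, size - 1 if i == CHUNKS - 1 else (i + 1) * step - 1)
               for i in range(CHUNKS)]
      with concurrent.futures.ThreadPoolExecutor(CHUNKS) as pool, \
              open("large.iso", "wb") as out:
          for start, data in pool.map(fetch, spans):
              out.seek(start)   # write each chunk at its own offset
              out.write(data)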

(I'm not that interested in bypassing download throttling; this was more a thought experiment prompted by a discussion of using WebDAV between Europe and North America)

--
Andrew Daviel, TRIUMF, Canada


