From: Micah Cowan
Subject: Re: [Bug-wget] wget with the -i option.
Date: Wed, 28 Apr 2010 10:51:28 -0700
User-agent: Thunderbird (X11/20100317)

Ray Sterner wrote:
>   Hello Micah,
>   When I use wget to grab all the files from the ftp site they download
>   very quickly (relatively).  That means it's possible to do.

Yes. It's only a design flaw that prevents this from working on a
list of URLs. The recursive-descent mode, by its very nature, almost has
to reuse the same connection, but for some reason the list-of-URLs mode
opens a fresh connection for every file.

>   I can see why making a new connection for each file in a list is a
>   reasonable default, they might be scattered all over the web.

If they were, then it would make sense. But it can't be that hard to
save the connection and check to see if we still have an open connection
to the host, just as we do on HTTP. Though I suppose the main difference
there is that HTTP doesn't have to keep track of what the current
working directory is, the way FTP would need to.

>   I guess the get-all-the-files mode must use a single connection for
>   everything on the target site.


>   Maybe a useful new option would be one that tries to use the same
>   connection for as many files as it can in the given list.

IMO, this doesn't need to be a new option. I don't believe anyone
_wants_ the current behavior for URL-lists, and if for some reason they
do, they could just run wget itself in a loop, giving it a single arg at
a time.
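For reference, the one-arg-at-a-time loop mentioned above can be sketched like this (the URLs are made up for illustration, and `echo` is used as a dry-run placeholder; drop it to actually fetch):

```shell
# Sketch of the per-URL workaround: run wget once per line of the list,
# which makes the one-connection-per-file behavior explicit.
# `echo` is a dry-run placeholder; remove it to perform real downloads.
printf '%s\n' \
    'ftp://ftp.example.org/pub/file1' \
    'ftp://ftp.example.org/pub/file2' > urls.txt

while read -r url; do
    echo wget "$url"
done < urls.txt
```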

It just needs someone to make the change. :)

Micah J. Cowan
