Re: [shell-script] Seq literal
From: Tiago Barcellos Peczenyj
Subject: Re: [shell-script] Seq literal
Date: Mon, 29 Oct 2007 18:44:46 -0300
On 10/29/07, Ivan lopes <address@hidden> wrote:
>
> for wget, we would have:
>
> $ wget www.link_0{0..5}.com.br
The "browser" curl has quite an interesting built-in syntax of its own.
From man curl:
You can specify multiple URLs or parts of URLs by writing part sets
within braces as in:
http://site.{one,two,three}.com
or you can get sequences of alphanumeric series by using [] as in:
ftp://ftp.numericals.com/file[1-100].txt
ftp://ftp.numericals.com/file[001-100].txt (with leading zeros)
ftp://ftp.letters.com/file[a-z].txt
No nesting of the sequences is supported at the moment, but you can use
several ones next to each other:
http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html
Since curl 7.15.1 you can also specify step counter for the ranges, so
that you can get every Nth number or letter:
http://www.numericals.com/file[1-100:10].txt
http://www.letters.com/file[a-z:2].txt
...
-o/--output <file>
Write output to <file> instead of stdout. If you are using {} or
[] to fetch multiple documents, you can use '#' followed by a
number in the <file> specifier. That variable will be replaced
with the current string for the URL being fetched. Like in:
curl http://{one,two}.site.com -o "file_#1.txt"
or use several variables like:
curl http://{site,host}.host[1-5].com -o "#1_#2"
You may use this option as many times as you have number of
URLs.
See also the --create-dirs option to create the local
directories dynamically.
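A quick sketch of the difference (the hostnames below are made up for illustration): bash's {0..5} is expanded by the shell before the command even runs, whereas curl's {} and [] globs are parsed by curl itself, so the pattern has to be quoted to keep the shell from touching it first:

```shell
#!/usr/bin/env bash
# Shell-side expansion: bash rewrites the pattern before wget sees it.
echo www.link_0{0..5}.com.br
# → www.link_00.com.br www.link_01.com.br ... www.link_05.com.br

# curl-side globbing: the quotes hand the pattern to curl intact.
# Per the man page above, #1 and #2 stand for the current value of
# the first and second glob (hypothetical URL, not actually fetched):
# curl "http://site.{one,two}.com/file[1-3].txt" -o "#1_#2.txt"
# would save one_1.txt, one_2.txt, ... two_3.txt locally.
```

Note that brace expansion is a bash feature; a plain POSIX sh would pass the {0..5} pattern through literally.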
--
curl is just as useful as wget, but it is better suited to web
scripting, whereas wget is a download manager (and a pretty beefy
one at that).
--
Tiago B Peczenyj
Linux User #405772
http://peczenyj.blogspot.com/