lynx-dev Recursive dump
From: Vieri Di_paola
Subject: lynx-dev Recursive dump
Date: Fri, 14 Jan 2000 17:24:55 +0100 (CET)
Hi,
Can Lynx recursively dump all the pages of a specified site? For instance,
can I recursively download all the web pages of www.lynx.browser.org while
excluding any other site, such as www.whereever.com? I would rather not use
wget for this task (is wget even capable of excluding sites?).
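(For reference, a hedged sketch of how wget might handle this: by my understanding, GNU wget stays on the starting host by default during a recursive fetch, and its accept/reject options can exclude hosts explicitly. The site names below are just the placeholders from this message, not tested targets.)

```shell
# Sketch, assuming GNU wget: recurse within the starting site only.
# By default wget does not follow links to other hosts; the exclusion
# flag below makes the restriction explicit.
wget --recursive --level=inf --no-parent \
     --exclude-domains=www.whereever.com \
     http://www.lynx.browser.org/
```
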
I know that the parameter -localhost disables URLs that point to remote
hosts. How should I complete the following command line to make the dump
recursive?
lynx -dump -source -localhost http://www.lynx.browser.org
Or should I use -realm, which restricts access to URLs in the starting
realm? What is a realm (excuse my ignorance)?
Should I use -traversal and -crawl?
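(A hedged sketch of the -traversal/-crawl combination, as I understand the Lynx documentation; the exact output filenames may vary by Lynx version.)

```shell
# Sketch, assuming a Lynx build with traversal support:
# -traversal follows http links derived from the start URL, and
# combined with -crawl it writes each visited page to a numbered
# lnk*.dat file, with the visited/rejected URL lists recorded in
# traverse*.dat and reject.dat in the working directory.
lynx -crawl -traversal http://www.lynx.browser.org/
```

Note that, as far as I can tell, this produces flat numbered files rather than mirroring the remote directory structure.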
If what I am asking is possible, can I get the same directory structure as
on the remote host?
Regards,
Vieri Di Paola