Re: [Bug-wget] --page-requisites and robot exclusion issue


From: Giuseppe Scrivano
Subject: Re: [Bug-wget] --page-requisites and robot exclusion issue
Date: Mon, 05 Dec 2011 14:41:26 +0100
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/24.0.92 (gnu/linux)

Paul Wratt <address@hidden> writes:

> If wget does not obey robots.txt, server admins will ban it.
>
> The workarounds:
> 1) Get the single HTML file first, edit out the robots meta tag, and
>    re-get with --no-clobber (the tag is usually only in landing pages);
>    a sketch follows below.
> 2) Use an empty robots.txt, or one that allows everything (search the
>    net for examples).
>
> Possible solutions:
> A) a command-line option
> B) ./configure --disable-robots-check
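
A minimal sketch of workaround 1, assuming a hypothetical landing page
at http://example.com/index.html (the URL and filename are
placeholders):

    # Fetch the landing page by itself.
    wget http://example.com/index.html
    # Edit index.html by hand to remove the <meta name="robots" ...> tag,
    # then fetch the page requisites; --no-clobber keeps wget from
    # overwriting the edited copy and makes it parse the local file.
    wget --no-clobber --page-requisites http://example.com/index.html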

You can pass -e robots=off to wget at runtime to disable the robots
exclusion check; there is no need to recompile.
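
For example (http://example.com/ is a placeholder URL):

    wget -e robots=off --page-requisites http://example.com/

The -e option runs the given .wgetrc command, so the same setting can
be made permanent by adding "robots = off" to your ~/.wgetrc.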

Giuseppe


