
Re: [Bug-wget] Guidance needed


From: Tim Ruehsen
Subject: Re: [Bug-wget] Guidance needed
Date: Wed, 11 Nov 2015 14:46:43 +0100
User-agent: KMail/4.14.10 (Linux/4.2.0-1-amd64; KDE/4.14.13; x86_64; ; )

Hi Ronald,

Try adding -H/--span-hosts in case your web page includes content from other 
hosts (e.g. images).

Also, -e robots=off might be helpful in case some content is 'forbidden' to 
wget by robots.txt.
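
For example, combining those options with what you are already using might 
look something like this (the URL is just a placeholder):

  wget -p -k -H -e robots=off https://example.com/some-page.html

Here -p fetches the page requisites (images, stylesheets, etc.), -k rewrites 
the links for local viewing, -H allows requisites to be fetched from other 
hosts, and -e robots=off tells wget to ignore robots.txt restrictions.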

If it still doesn't display correctly, there might be some JavaScript 
involved or some other dynamic web page generation mechanism.
Cookies might also influence the appearance of the web site.
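
If the page only renders correctly for a logged-in session, you could export 
your browser's cookies into a Netscape-format cookie file (the file name 
cookies.txt below is just an example) and pass it to wget with 
--load-cookies:

  wget -p -k -H -e robots=off --load-cookies cookies.txt https://example.com/some-page.html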

Tim

On Tuesday 10 November 2015 13:50:25 Ronald F. Guilmette wrote:
> I've tried a few times now to make local (archive) copies of
> various single web pages on various web sites that I'm afraid
> might disappear in the future... and that I really want to make
> a permanent record of... but the local copies of the pages don't
> seem to always display properly.
> 
> I'm using just "wget -p -k <<URL>>" to make the local copies.
> What should I try adding to that in order to make the local
> copies more likely to display properly?
> 
> (I tried using both -r and a limit specified with -l on at least
> one of these cases, but that resulted in wget gobbling down a
> seemingly endless amount of other stuff that wasn't even directly
> related to the one page that I wanted to capture.)


