
Re: [Bug-wget] Fwd: Trying to download HTML from Google's Cache. Pls help


From: Ben Smith
Subject: Re: [Bug-wget] Fwd: Trying to download HTML from Google's Cache. Pls help]
Date: Wed, 12 Nov 2008 21:30:54 -0800 (PST)

grep is a command-line program that finds the lines in a text file that contain a given pattern.
more info/usage: http://compute.cnr.berkeley.edu/cgi-bin/man-cgi?grep
sed is a command-line stream editor that lets you replace text.
more info/usage: http://compute.cnr.berkeley.edu/cgi-bin/man-cgi?sed

Any Linux distro should include these; if you're running Windows you can get them at:
http://gnuwin32.sourceforge.net/packages/grep.htm
http://gnuwin32.sourceforge.net/packages/sed.htm
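A quick illustration of both, using a throwaway file (the file name, contents, and search word here are invented for the demo, not taken from the thread):

```shell
# Make a small demo file (contents invented for this example):
printf 'alpha\nbeta cached\ngamma\n' > demo.txt

# grep prints only the lines containing the target string:
grep 'cached' demo.txt
# -> beta cached

# sed rewrites text; here every "cached" becomes "CACHED":
sed 's/cached/CACHED/' demo.txt
# -> alpha
#    beta CACHED
#    gamma

rm demo.txt
```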



From: Yan Grossman <address@hidden>
To: address@hidden
Sent: Wednesday, November 12, 2008 2:03:58 PM
Subject: Fwd: [Fwd: Re: [Bug-wget] Fwd: Trying to download HTML from Google's Cache. Pls help]



---------- Forwarded message ----------
From: Yan Grossman <address@hidden>
Date: Wed, Nov 12, 2008 at 10:49 AM
Subject: Re: [Fwd: Re: [Bug-wget] Fwd: Trying to download HTML from Google's Cache. Pls help]
To: Micah Cowan <address@hidden>


Thanks so much. But what does it mean: "Then grep each of the results files to find the line with links to the all cached pages. You can pipe that output into sed"?
I am not familiar with "grep" and "sed".

Could you please elaborate?

Thanks

On Wed, Nov 12, 2008 at 10:32 AM, Micah Cowan <address@hidden> wrote:
-------- Original Message --------
Subject: Re: [Bug-wget] Fwd: Trying to download HTML from Google's
Cache. Pls help
Date: Wed, 12 Nov 2008 10:00:34 -0800 (PST)
From: Ben Smith <address@hidden>
To: Micah Cowan <address@hidden>
References: <address@hidden>
<address@hidden>


Adding -UFirefox allows the download.  So you should first use wget
-UFirefox to fetch all the listed results pages from Google:
http://www.google.com/search?q=site%3Awww.snowbrasil.com%2Ffotos&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a
http://www.google.com/search?hl=en&client=firefox-a&rls=org.mozilla:en-US:official&hs=zle&q=site:www.snowbrasil.com/fotos&start=10&sa=N
http://www.google.com/search?hl=en&client=firefox-a&rls=org.mozilla:en-US:official&hs=o6J&q=site:www.snowbrasil.com/fotos&start=20&sa=N

etc., up to start=570 (since there are 577 results).
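That first step could be scripted as below. The loop only generates the URL list; the query string is shortened to the site: term, so substitute the full URLs above if you want the exact same parameters, and the wget line is left commented out because it hits the network:

```shell
# Build the list of results-page URLs, start=0,10,...,570 (58 pages):
for start in $(seq 0 10 570); do
  echo "http://www.google.com/search?q=site:www.snowbrasil.com/fotos&start=$start"
done > results-urls.txt

# Fetch them all with a Firefox User-Agent (network step):
# wget -UFirefox -i results-urls.txt
```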

Then grep each of the results files to find the lines with links to the
cached pages.  You can pipe that output into sed, which you can use
to remove everything but the links to the cached pages (replace the info
before, after, and between the cache links with a space).  Then simply
pipe that to wget -UFirefox, and you should get all your files.
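One way that grep/sed step might look in practice. The "q=cache:" pattern is a guess at what the cached-page links contain, so inspect one of the saved results pages and adjust both patterns to the real markup; the sed expression extracts one cache link per matching line:

```shell
# Extract cached-page links from the saved results pages
# ("q=cache:" is an assumed marker; verify against the actual HTML):
grep 'q=cache:' results-*.html \
  | sed 's/.*\(http:[^"]*q=cache:[^"]*\)".*/\1/' > cache-urls.txt

# Then hand the extracted links back to wget (network step):
# wget -UFirefox -i cache-urls.txt
```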



----- Original Message ----
> From: Micah Cowan <address@hidden>
> To: Ben Smith <address@hidden>
> Cc: address@hidden
> Sent: Tuesday, November 11, 2008 3:27:05 PM
> Subject: Re: [Bug-wget] Fwd: Trying to download HTML from Google's Cache. Pls help
>
> Ben Smith wrote:
>
>> Subject: Re: [Bug-wget] Re: Bug-wget Digest, Vol 1, Issue 10
>
>>> When replying, please edit your Subject line so it is more specific
>>>  than "Re: Contents of Bug-wget digest..."
>
> It's helpful if you adhere to this guideline; otherwise it's hard to
> follow threads. (I've fixed the subject in my reply.)
>
>> It would be theoretically possible by using grep and sed to strip out
>> the links to the cached files and piping that to wget.  However,
>> Google appears to block access to results pages and cached pages via
>> wget.  I tried to download several using wget and got a 403 Forbidden
>> response.
>
> http://wget.addictivecode.org/FrequentlyAskedQuestions#not-downloading
> should be helpful for such problems (using -U is the most applicable
> suggestion, but you may also run into the others). Please also consider
> adding --limit-rate or --wait.
>

--
Micah J. Cowan
Programmer, musician, typesetting enthusiast, gamer.
GNU Maintainer: wget, screen, teseq
http://micah.cowan.name/





