From: Tim Ruehsen
Subject: Re: [Bug-wget] wget mirror site failing due to file / directory name clashes
Date: Tue, 16 Oct 2012 10:13:32 +0200
User-agent: KMail/1.13.7 (Linux/3.2.0-4-amd64; KDE/4.8.4; x86_64; ; )

On Friday, 12 October 2012, Paul Beckett (ITCS) wrote:
> I am attempting to use wget to create a mirrored copy of a CMS (Liferay)
> website. I want to be able to failover to this static copy in case the
> application server goes offline. I therefore need the URL's to remain
> absolutely identical. The problem I have is that I cannot figure out how I
> can configure wget in a way that will cope with:
> http://www.example.com/about
> http://www.example.com/about/something

You can't make a failover copy with wget-like tools, except perhaps for very 
simple web sites, and a CMS isn't that simple.
On a web server there are many essential resources that are simply not 
available via remote access (e.g. scripts, servlets, server configuration, the 
database, ...).
What I want to say is: even if you solve this (minor) problem of mapping URL 
paths to the local filesystem (a problem that comes up from time to time and 
can generally be solved by transforming each URL into a key/value pair; AFAIK, 
wget doesn't have such a feature yet), you will stumble over the next problem 
that prevents your copy from being a failover copy.
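To illustrate the key/value idea: instead of letting "/about" become both a file and a directory on disk, you can encode each full URL path into one flat filename, so no path can ever be a prefix of another. This is just a minimal sketch of that transformation (not an existing wget feature; the function name is made up for the example):

```python
# Sketch: map each URL path to a single flat filesystem key so that
# "/about" and "/about/something" can coexist as ordinary files.
from urllib.parse import quote

def url_to_key(path: str) -> str:
    # Percent-encode everything, including "/", so the whole path
    # collapses into one filename with no directory structure.
    return quote(path, safe="")

print(url_to_key("/about"))            # %2Fabout
print(url_to_key("/about/something"))  # %2Fabout%2Fsomething
```

The failover server would then need a small rewrite layer that applies the same encoding to incoming request paths before looking up the file, since the on-disk names no longer match the original URLs.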

It sounds as though you have administrative access to your company's web server.
So why not use one of the many "professional" backup/failover/redundancy 
mechanisms designed for exactly this use case?
E.g. a filesystem and database cluster - these days there should be 
out-of-the-box solutions.

But maybe I don't get your intention...

Tim


