duplicity-talk

Re: [Duplicity-talk] strategy for performing initial full backup to S3


From: Kenneth Loafman
Subject: Re: [Duplicity-talk] strategy for performing initial full backup to S3
Date: Wed, 7 Oct 2015 12:35:53 -0500

I would suggest not doing one large backup, but rather a set of segmented backups starting at the top of the directory tree, split however makes sense to you. This will speed up the backup process, since the segments can run in parallel and make better use of the available cores and network, and it will make recovery much easier.

What version of duplicity and what options are you using?

...Ken


On Wed, Oct 7, 2015 at 10:59 AM, Scott Classen <address@hidden> wrote:
I am attempting to perform the initial full backup of a fairly large volume (3.5 TB) to Amazon S3, and it is taking a long time: 8 days so far, and it looks like it might be about half done, although that is hard to determine because the data is presumably being compressed. I'm wondering why it takes so long. Our internet connection is capable of much faster rates, and the server running the backup has many more cores than are being used. I fear I may not have the magical combination of duply/duplicity options to get the best performance, or maybe 3.5 TB is beyond the capabilities of duplicity? At this rate I won't need to (or won't have time to) do any incremental backups; I'll just do a full backup once every two weeks. Any advice would be greatly appreciated.

Scott
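The rate described above can be sanity-checked with quick integer arithmetic. This assumes, as the post estimates, that roughly half of the 3.5 TB (about 1.75 TB) went up in the 8 days reported:

```shell
#!/bin/sh
# Back-of-the-envelope throughput check. Assumption: about half of the
# 3.5 TB volume (~1.75 TB) was uploaded in the 8 days reported.
bytes=1750000000000           # ~1.75 TB in bytes
secs=$((8 * 24 * 3600))       # 8 days in seconds
rate=$((bytes / secs / 1000)) # integer kB/s
echo "${rate} kB/s"           # roughly 2.5 MB/s
```

A sustained ~2.5 MB/s is well below what the poster says the link can carry, which is consistent with a single serial backup process failing to saturate the network, exactly the bottleneck that running segments in parallel addresses.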


_______________________________________________
Duplicity-talk mailing list
address@hidden
https://lists.nongnu.org/mailman/listinfo/duplicity-talk

