Re: [Duplicity-talk] Duplicity is slow on many small files
From: edgar . soldin
Subject: Re: [Duplicity-talk] Duplicity is slow on many small files
Date: Tue, 7 Feb 2017 11:56:21 +0100
User-agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:45.0) Gecko/20100101 Thunderbird/45.7.0
On 07.02.2017 11:46, Fazekas László via Duplicity-talk wrote:
> Hi!
>
> I'm using duplicity to backup my webhosting server. My www directory is 17G,
> and it contains many many small files. A full backup to amazon S3 is >1 day.
> Is this a normal running time for a full backup?
What's your duplicity version?
Try a test backup to a local file:// target and see how long that takes in
comparison.
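A minimal sketch of that comparison; the source path and target URL below are placeholders, not taken from the thread:

```shell
#!/bin/sh
# Sketch: run the same full backup against a local file:// target and
# compare the wall time with the S3 run. If the local run is also very
# slow, the many small files (not the network) are the bottleneck.
SRC=/var/www                       # placeholder: your web root
LOCAL=file:///tmp/dup-local-test   # throwaway local target
if command -v duplicity >/dev/null 2>&1; then
    # use the same options as your S3 command, only the target URL differs
    time duplicity full --no-encryption "$SRC" "$LOCAL"
fi
```

If this local run takes nearly as long as the S3 run, the time is going into scanning the files, not into uploading.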
>Creating tar from the www folder is 12min. Is it a good idea to make a tar
>before backup?
Only if you are willing to untar manually on restore, of course.
>Will it speed up the backup process?
If the small files are the issue, then probably yes.
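A rough sketch of the tar-first idea; all paths here are throwaway examples created just for the demonstration:

```shell
#!/bin/sh
# Sketch of "tar first, back up one big file": a single archive is far
# cheaper for duplicity to scan than thousands of small files/inodes.
SRC=$(mktemp -d)
OUT=$(mktemp -d)
# simulate a tree of many small files
i=1
while [ $i -le 200 ]; do
    echo "page $i" > "$SRC/file$i.html"
    i=$((i + 1))
done
# pack the tree into one archive; duplicity would then back up $OUT,
# e.g. duplicity "$OUT" s3://bucket/prefix (restore = fetch + untar)
tar -cf "$OUT/www.tar" -C "$SRC" .
ls -l "$OUT/www.tar"
```

The trade-off is exactly the one noted above: restores require untarring by hand, and the tar step itself adds time (12 minutes in the original poster's case).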
>Will duplicity store only modifications of the tar?
Yes, but as duplicity uses librsync it will compare the tar chunk by chunk;
if, say, a file near the beginning got bigger, all content after it is
offset and will be regarded as changed.
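A toy demonstration of that offset effect, using fixed-size chunks as a deliberate simplification (librsync's rolling checksums are cleverer than a naive position-by-position comparison, but this shows why content after an insertion no longer lines up):

```shell
#!/bin/sh
# Build an "old" file and a "new" file that is identical except for a
# small insertion at the front, then compare 256-byte chunks at the
# same offsets -- every aligned chunk differs, because the insertion
# shifted all following content.
WORK=$(mktemp -d)
seq 1 300 > "$WORK/old.tar"                         # stand-in for the old tar
{ echo "new file grew here"; cat "$WORK/old.tar"; } > "$WORK/new.tar"
split -b 256 "$WORK/old.tar" "$WORK/o."
split -b 256 "$WORK/new.tar" "$WORK/n."
CHANGED=0
for o in "$WORK"/o.*; do
    n="$WORK/n.${o##*/o.}"
    cmp -s "$o" "$n" || CHANGED=$((CHANGED + 1))
done
echo "chunks differing at the same offset: $CHANGED"
```

Here all five chunks of the old file differ from their counterparts at the same offset, even though only a 19-byte line was inserted.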
>Or have you any other idea to speeding up the backup?
Primarily, use the latest duplicity from the website. There was an issue some
months ago that slowed down backups.
..ede/duply.net