
Re: [Duplicity-talk] gpg: too many files open


From: Nate Eldredge
Subject: Re: [Duplicity-talk] gpg: too many files open
Date: Thu, 14 Feb 2019 15:44:57 -0700 (MST)
User-agent: Alpine 2.21 (DEB 202 2017-01-01)

On Thu, 14 Feb 2019, Wolfgang Rohdewald via Duplicity-talk wrote:

> On Do, 2019-02-14 at 22:29 +0100, Wolfgang Rohdewald via Duplicity-talk wrote:
> > I am getting this on restoring. Looking at /proc I find
> > 66 gpg processes, and each of them has about 260 open pipes.
> >
> > So that opened about 17,000 pipes.
> >
> > Any ideas?
>
> Since I urgently needed the data, I drastically increased ulimit -n.
>
> I believe the problem is that I have a LOT of small incremental
> backups (one every work hour): 392 incrementals since the last full backup.
>
> And I believe duplicity calls GnuPG.run() for each and every incremental
> backup and releases all of them only when the process ends.
> I added some debug output and got:
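The per-process descriptor counts Wolfgang reports can be checked with a small Linux-specific sketch like the following (pgrep and /proc are assumed to be available; run it as the same user as the restore, or as root):

```shell
# Count open file descriptors for each running gpg process via /proc.
for pid in $(pgrep -x gpg); do
    printf '%s: %s open fds\n' "$pid" "$(ls "/proc/$pid/fd" | wc -l)"
done
```

With 66 gpg processes at roughly 260 pipes each, this adds up to about 17,000 descriptors, which blows well past the common default soft limit of 1024.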

Yeah, I think this is expected behavior. When restoring, duplicity wants to read through (and decrypt) all the incrementals simultaneously. The idea is that since the diffs to the files are stored in sorted order, by doing this you get to apply all the diffs to get the final version of a single file before moving on to the next.

The alternative would be to do a full restore of one incremental, and then patch all the files to get the next one, and so on. One disadvantage would be that you would need enough disk space for the largest of all the incrementals, instead of just for the final one.

The solution is as you found: if you have a very long chain of incrementals, you'll have to arrange for a higher file limit when you restore.
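Concretely, that means raising the soft open-file limit in the shell that runs the restore; 65536 below is an illustrative value, and the duplicity invocation is a placeholder:

```shell
# Current soft limit on open files for this shell (often 1024 by default):
ulimit -Sn

# Raise it for this shell session; cannot exceed the hard limit (ulimit -Hn).
ulimit -n 65536 2>/dev/null || echo "requested limit exceeds hard limit"

# Then run the restore in the same shell, e.g.:
# duplicity restore <backend-url> <target-dir>
```

Raising it via `ulimit` only affects the current shell and its children; a persistent change would go through limits.conf or the service manager instead.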

--
Nate Eldredge
address@hidden



