
RE: Optimizing large copies


From: Martin, Jason H
Subject: RE: Optimizing large copies
Date: Fri, 23 Sep 2005 08:43:24 -0700

Just thinking about it some more, the process could go something like this:

1. Identify the directory being searched
2. Send a request to cfservd to gather the stats / md5sums (as
appropriate) for the source directory and contents
3. Gather the md5sums / stats locally for the directory and contents
4. Request the results from the server
5. Compare and take necessary action

That way the server and the client perform the stat and checksum operations
in parallel, and the number of network round trips is reduced. I don't see
how this would compromise security, but I think it would speed up large
copies considerably; a rough sketch follows below.
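
Here is what the client side of such an exchange might look like, sketched
in Python rather than cfengine's actual C code. The connection object and
its send_request() / receive_results() calls are hypothetical stand-ins for
the real cfservd protocol:

import hashlib
import os

def local_md5sums(directory):
    # Compute md5 sums for every regular file directly under 'directory'
    # (non-recursively, matching the per-directory batching idea).
    sums = {}
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path):
            with open(path, 'rb') as f:
                sums[name] = hashlib.md5(f.read()).hexdigest()
    return sums

def sync_directory(conn, directory):
    # 1-2. Ask the server to gather stats / md5 sums for the whole source
    #      directory in a single request (hypothetical call).
    conn.send_request("MD5_DIR", directory)
    # 3. While the server works, compute the same sums locally.
    mine = local_md5sums(directory)
    # 4. Collect the server's results as a {filename: md5} mapping
    #    (hypothetical call).
    theirs = conn.receive_results()
    # 5. Compare and return the names that actually need copying.
    return [name for name, digest in theirs.items()
            if mine.get(name) != digest]

The only point of the sketch is the shape of the exchange: one request per
directory instead of one round trip per file, with the client's checksumming
overlapping the server's.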

Thanks,
-Jason Martin

> -----Original Message-----
> From: help-cfengine-bounces+jason.h.martin=cingular.com@gnu.org
> [mailto:help-cfengine-bounces+jason.h.martin=cingular.com@gnu.org]
> On Behalf Of Martin, Jason H
> Sent: Friday, September 23, 2005 8:10 AM
> To: help-cfengine@gnu.org
> Subject: RE: Optimizing large copies
> 
> 
> While I agree that security takes longer, I'd suggest that much of the
> time is due to the highly synchronous nature of the network sync
> operation. Looking at strace output, the procedure appears to be
> something like:
> 
> 1. stat a file
> 2. checksum a file
> 3. send request to master server to SYNC file
> 4. server checksums file
> 5. server returns results
> 6. client compares.
> 
> I think performance would be increased significantly if #3 were somehow
> batched up into blocks larger than one. Perhaps at every directory level,
> all the files in the directory (non-recursively) could be batched up and
> sent in a single request, without sacrificing security.
> 
> Thank you,
> -Jason Martin
> 
> > -----Original Message-----
> > From: Mark Burgess [mailto:Mark.Burgess@iu.hio.no]
> > Sent: Friday, September 23, 2005 7:05 AM
> > To: Baker, Darryl
> > Cc: Martin, Jason H; help-cfengine@gnu.org
> > Subject: RE: Optimizing large copies
> > 
> > 
> > 
> > I concur with this. Cfengine insists on the correctness and
> > security of each operation -- and that takes time. A program 
> > like rsync can achieve significantly better performance in 
> > large copies if you have a trusted base.
> > 
> > M
> > 
> > On Fri, 2005-09-23 at 08:59 -0400, Baker, Darryl wrote:
> > > I understand your concerns, but with the number and size of the files
> > > you are talking about, you really should investigate a program that
> > > is optimized for file copying. While cfengine can maintain binary
> > > files, this is not its primary focus; its main focus is maintaining
> > > the system configuration. Rsync was designed to keep files identical
> > > on several machines. Using rsync in conjunction with ssh for security,
> > > you end up with a much better solution for maintaining binary files;
> > > it is optimized for just this purpose. The use of rsync with ssh is
> > > well documented and in use at a large number of large sites.
> > > 
> > > 
> > 
> > 
> 
> 
> 



