
[Fab-user] Saving state? (shared memory space?)


From: Joel Krauska
Subject: [Fab-user] Saving state? (shared memory space?)
Date: Fri, 5 Apr 2013 01:02:51 -0700

Hello fellow fabric users.

During parallel runs, I would like to be able to collect information in some sort of shared memory space to review after I'm done.

This seems to be tricky when running in parallel, presumably because each host's task runs in its own worker process rather than sharing memory with the parent.

Example use cases:
tracking success vs. failure counts across a run
collecting output into a tidy, coalesced log after the job runs

I tried stuffing the output into env, but that's apparently not a shared global space...

Any guidance on how this might be done?
(Calling out to a DB/SQLite, or even a lockfile-wrapped pickle, seems like a passable solution; a rough sketch of that idea is below. But I'm hoping there's an easier way to collect data during a parallel run and summarize it later...)
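
For concreteness, the pickle fallback I'm picturing looks roughly like this: each host writes its own file during the parallel run (so no locking is needed), and a separate pass coalesces them afterwards. Untested sketch; the "results" directory and the task names are just placeholders:

import glob
import os
import pickle

from fabric.api import env, hide, run, settings, task

RESULTS_DIR = 'results'
if not os.path.isdir(RESULTS_DIR):
    os.makedirs(RESULTS_DIR)   # created once, at fabfile import time

@task
def collect():
    with settings(hide('everything'), warn_only=True):
        result = run("uname")
    # One file per host avoids any need for locks during the parallel run.
    path = os.path.join(RESULTS_DIR, env.host_string.replace(':', '_'))
    with open(path, 'wb') as f:
        pickle.dump({'ok': result.succeeded, 'output': str(result)}, f)

@task
def summarize():
    # Run after the parallel pass, e.g.:
    #   fab -P -H host1,host2 collect
    #   fab summarize
    results = {}
    for path in glob.glob(os.path.join(RESULTS_DIR, '*')):
        with open(path, 'rb') as f:
            results[os.path.basename(path)] = pickle.load(f)
    good = sum(1 for r in results.values() if r['ok'])
    print '%d ok, %d failed' % (good, len(results) - good)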

Help?

Is there a cleanup method on exit where I should be looking at env?
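
Related to the cleanup-on-exit question: if I'm reading the docs right, execute() returns a dict mapping host strings to each task's return value, and I think newer 1.x releases gather return values from parallel tasks too (not something I've verified). If so, a small wrapper task might be all the "on exit" hook I need. Hypothetical sketch, task names made up:

from fabric.api import env, execute, hide, parallel, run, runs_once, settings, task

@parallel
def check_uname():
    # Return the data instead of trying to stash it in env.
    with settings(hide('everything'), warn_only=True):
        result = run("uname")
    return {'ok': result.succeeded, 'output': str(result)}

@task
@runs_once
def report():
    # fab -H host1,host2 report
    results = execute(check_uname)          # {host_string: return value}
    good = sum(1 for r in results.values() if r['ok'])
    print '%d ok, %d failed' % (good, len(results) - good)
    for host in sorted(results):
        print '-' * 80
        print host, ('OK' if results[host]['ok'] else 'FAILED')
        print results[host]['output']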

Thanks,

Joel Krauska




code snippet:

# (run in parallel, e.g. "fab -P -H host1,host2 mytask")
from fabric.api import *
from fabric.colors import red, green

# NOTE: this module-level state is set up once, when the fabfile is imported.
# With parallel execution each host's task runs in its own worker process,
# so these objects only ever get modified in per-process copies.
env.mydata = {}
env.failcount = 0
env.goodcount = 0

@task
def mytask():
    # warn_only lets a failing command come back as result.failed
    # instead of aborting the whole run.
    with settings(hide('warnings', 'running', 'output'), warn_only=True):
        result = run("uname")
        print '-' * 80
        print env.host_string,
        if result.failed:
            print red('FAILED')
            env.failcount += 1
        else:
            print green('OK')
            env.goodcount += 1
        print result
        env.mydata[env.host_string] = str(result)

        # This prints the expected data, but only inside this worker process.
        print env.mydata

# These aren't holding anything useful...
# (They also run at fabfile *import* time, before any task has executed,
# and the workers' changes to env never make it back to the parent anyway.)
print env.mydata
print env.failcount
print env.goodcount
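
And the sort of "easier way" I was fishing for instead of env: a multiprocessing.Manager dict created at module level. Untested sketch; it assumes Fabric's parallel mode forks its worker processes from the process that imports the fabfile (which I believe is the case on Linux/OS X), so the proxy created here is reachable from every worker:

import multiprocessing

from fabric.api import env, hide, parallel, run, runs_once, settings, task

# Created once in the parent process, at fabfile import time.
manager = multiprocessing.Manager()
mydata = manager.dict()            # host_string -> (succeeded, output)

@task
@parallel
def mytask():
    with settings(hide('warnings', 'running', 'output'), warn_only=True):
        result = run("uname")
    # Writes go through the manager's server process, so they should be
    # visible to the parent after the parallel run finishes.
    mydata[env.host_string] = (result.succeeded, str(result))

@task
@runs_once
def report():
    # Run in the same invocation so the manager process is still alive:
    #   fab -H host1,host2 mytask report
    good = sum(1 for ok, _ in mydata.values() if ok)
    print '%d ok, %d failed' % (good, len(mydata) - good)
    for host in sorted(mydata.keys()):
        ok, output = mydata[host]
        print '-' * 80
        print host, ('OK' if ok else 'FAILED')
        print output

(Whether Fabric's own process handling plays nicely with the extra manager process is exactly the part I haven't tested.)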


