I had a use case where I needed to retrieve the output of a command on multiple servers as part of a Python library.

At first, I did it with Paramiko and it worked well, but as the number of servers grew, it took too long to run.
I knew about Fabric and read the docs hoping to find a solution - which I did: the @parallel decorator.
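To get a feel for what @parallel buys you, here's the same fan-out idea sketched in plain Python with a thread pool. This is just an illustration, not Fabric's actual implementation; `run_on_host` is a hypothetical stand-in for the real SSH call:

```python
from concurrent.futures import ThreadPoolExecutor

def run_on_host(host):
    # Stand-in for the real SSH call made against each host
    return 'output from %s' % host

hosts = ['host1', 'host2', 'host3']

# Fan the call out over a worker pool, up to 10 hosts at a time
with ThreadPoolExecutor(max_workers=10) as pool:
    results = dict(zip(hosts, pool.map(run_on_host, hosts)))

# results maps each host to its output, much like Fabric's execute()
```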

I didn’t want to deal with a fabfile and the fab binary; I wanted to include this in my library. Fabric can be used as a library, but that side of it isn’t extremely well documented.

Here’s how I ended up doing it:

from fabric.api import run, execute, parallel, settings, hide

class ParallelCommands(object):
    def __init__(self, hosts, command):
        self.hosts = hosts
        self.command = command

    @parallel(pool_size=10)  # Run on as many as 10 hosts at once
    def parallel_exec(self):
        return run(self.command)

    def capture(self):
        # Hide Fabric's own chatter so only the captured output remains
        with settings(hide('running', 'commands', 'stdout', 'stderr')):
            output = execute(self.parallel_exec, hosts=self.hosts)
        return output

hosts = ['user@host1', 'user@host2']
command = 'uname -a'

instance = ParallelCommands(hosts=hosts, command=command)
output = instance.capture()

The output of each server is inside a dictionary, keyed by host:
{ 'user@host1': 'output', 'user@host2': 'output' }

print(output['user@host1'])
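Since execute() hands back a plain dict, going over every host's output is straightforward. The results below are made up for illustration; they only mimic the shape of what execute() returns:

```python
# Hypothetical results in the same shape execute() returns: host -> stdout
output = {
    'user@host1': 'Linux host1 3.13.0-generic x86_64',
    'user@host2': 'Linux host2 3.13.0-generic x86_64',
}

# Walk the results host by host
for host in sorted(output):
    print('%s: %s' % (host, output[host]))
```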

