Interprocess Synchronization of Remote Scripts

Problem

You have a script that you want to run on some remote server, but the script shouldn't be run concurrently. If the script is running and it's triggered again, the second execution should wait until the first is finished.

Solution

We're going to use the fasteners Python library to create a file lock that ensures only one instance of our script runs at a time on our target server. This will be implemented as a CloudBolt plugin, so the remote script will be embedded in our Python script.

Let's start by installing the fasteners library on the CloudBolt server with:

$ pip install fasteners

The following example uses the fasteners library to enforce interprocess synchronization with a file lock. It assumes the job object has a server set, so the remote script runs on every server associated with the CloudBolt job; it could easily be modified to look up a specific server instance and run on a dedicated remote script host.

import fasteners

from settings import VARDIR

# The remote script is embedded in the plugin as a string.
REMOTE_SCRIPT = '''
echo "Hello World"
'''


# The decorator serializes every invocation of run() across processes by
# holding an exclusive lock on the file below for the duration of the call.
# A second invocation blocks until the first one releases the lock.
@fasteners.interprocess_locked('{}/run/cb_job_lock'.format(VARDIR))
def run(job, logger=None, **kwargs):
    for svr in job.server_set.all():
        svr.execute_script(
            script_contents=REMOTE_SCRIPT
        )

    # CloudBolt plugins return a (status, output, errors) tuple.
    return '', '', ''
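Under the hood, fasteners implements this as an advisory file lock. If you want to see what that pattern looks like without the library, here is a minimal stdlib-only sketch for POSIX systems using fcntl.flock; the lock path below is a stand-in for illustration, not CloudBolt's actual lock file:

```python
import fcntl
import os
import tempfile
from contextlib import contextmanager


@contextmanager
def file_lock(path):
    """Advisory exclusive file lock: blocks until no other process holds it."""
    fd = os.open(path, os.O_CREAT | os.O_RDWR)
    try:
        fcntl.flock(fd, fcntl.LOCK_EX)  # blocks until the lock is free
        yield
    finally:
        fcntl.flock(fd, fcntl.LOCK_UN)
        os.close(fd)


# Only one process at a time can enter the critical section.
lock_path = os.path.join(tempfile.gettempdir(), 'cb_job_lock')  # hypothetical path
with file_lock(lock_path):
    print('running exclusively')
```

Because the lock is advisory, every cooperating process must use the same lock file path; any process that skips the lock is not blocked.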


1 Comment

    Nils Vogels

    This would assume there is only one CB server at a time executing jobs, since the lock is set on a filesystem object that is not shared between the CB servers, right?
