
Run multiple subprocesses asynchronously/in parallel with streamed output and non-blocking reading. Also includes various tools to replace shell scripts.

Project description

shelljob

This provides a clean way to execute subprocesses, either singly or several in parallel, capture their output, and monitor progress:

  • Single subprocess call with an optional timeout
  • High level FileMonitor to execute several processes in parallel and store output in a file
  • Low level Group execution to execute jobs in parallel and capture output

Additional tools for working with the filesystem are also included:

  • find which offers much of the functionality of the shell find utility
  • shelljob.fs.NamedTempFile provides a with block wrapper for temporary named files

API Documentation

Install

pip install shelljob

Parallel Subprocesses

Using the job system is the quickest way to run processes and log their output (by default to files named '/tmp/job_ID.log'):

from shelljob import job

jm = job.FileMonitor()
jm.run([
	[ 'ls', '-alR', '/usr/local' ],
	'my_prog',
	'build output input',
])

An array is passed directly to subprocess.Popen; a string is first parsed with shlex.split.
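Since each job's output lands in a log file, a quick way to inspect the results afterward is to read those files back. This is a minimal stdlib sketch, assuming the default '/tmp/job_ID.log' naming described above:

```python
import glob

# Read back the log files written by FileMonitor.
# The 'job_*' pattern assumes the default '/tmp/job_ID.log' naming.
for path in sorted(glob.glob('/tmp/job_*.log')):
    with open(path) as f:
        print(path)
        print(f.read())
```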

The lower level Group class provides a simple container for more manual job management.

import sys

from shelljob import proc

g = proc.Group()
p1 = g.run( [ 'ls', '-al', '/usr/local' ] )
p2 = g.run( [ 'program', 'arg1', 'arg2' ] )

while g.is_pending():
	lines = g.readlines()
	for process, line in lines:
		sys.stdout.write( "{}:{}".format( process.pid, line ) )

Encoding

By default the output is binary. You can pass encoding='utf-8' to the run command to get a decoded text stream instead. Be aware that if decoding fails (the program emits an invalid byte sequence), reading will be interrupted; you should also use the on_error function to check for this.

Line-endings will always be preserved.

Simple Subprocess calls

A simplified call function allows timeouts on subprocesses and easy access to their output.

from shelljob import proc

# capture the output
output = proc.call( 'ls /tmp' )
# this raises a proc.Timeout exception
proc.call( 'sleep 10', timeout = 0.1 )

Find

The 'find' function is a multi-faceted approach to generating listings of files.

from shelljob import fs

files = fs.find( '/usr/local', name_regex = r'.*\.so' )
print( "\n".join(files) )

Refer to the API docs for all parameters. Just let me know if there is some additional option you need.

Files for shelljob, version 0.6.2: shelljob-0.6.2-py3-none-any.whl (9.9 kB, Wheel, Python py3)
