
Run multiple subprocesses asynchronously/in parallel with streamed output and non-blocking reading. Also includes various tools to replace shell scripts.

Project description

shelljob

This provides a clean way to execute subprocesses, either one or multiple in parallel, capture their output and monitor progress:

  • Single subprocess call with an optional timeout
  • High level FileMonitor to execute several processes in parallel and store output in a file
  • Low level Group execution to execute jobs in parallel and capture output

Additional tools for working with the filesystem are also included:

  • find which offers much of the functionality of the shell find utility
  • shelljob.fs.NamedTempFile provides a with block wrapper for temporary named files

API Documentation

Install

pip install shelljob

Parallel Subprocesses

Using the Job system is the quickest approach to simply run processes and log their output (by default to files named '/tmp/job_ID.log').

from shelljob import job

jm = job.FileMonitor()
jm.run([
	[ 'ls', '-alR', '/usr/local' ],
	'my_prog',
	'build output input',
])

An array is passed directly to subprocess.Popen; a string is first parsed with shlex.split.
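For instance, the string form 'build output input' above would be tokenized like this (a stdlib illustration of the parsing step, not shelljob-specific code):

```python
import shlex

# shlex.split tokenizes a command string the way a POSIX shell would,
# producing the argument list handed to subprocess.Popen
args = shlex.split('build output input')
print(args)  # → ['build', 'output', 'input']
```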

The lower level Group class provides a simple container for more manual job management.

import sys

from shelljob import proc

g = proc.Group()
p1 = g.run( [ 'ls', '-al', '/usr/local' ] )
p2 = g.run( [ 'program', 'arg1', 'arg2' ] )

while g.is_pending():
	lines = g.readlines()
	for p, line in lines:
		sys.stdout.write( "{}:{}".format( p.pid, line ) )

Encoding

By default the output is binary (bytes). You can pass encoding='utf-8' to the run command to get a decoded text stream instead. Be aware that if decoding fails (the program emits an invalid byte sequence), the run will be interrupted. You should also use the on_error function to check for this.

Line-endings will always be preserved.
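As a stdlib-only sketch of why an invalid sequence interrupts the run: decoding invalid UTF-8 raises UnicodeDecodeError, which is the failure mode the on_error hook lets you catch.

```python
# Stdlib illustration (not shelljob code): an invalid UTF-8 byte
# sequence cannot be decoded, so decoding raises UnicodeDecodeError
bad = b'ok so far \xff\xfe'
try:
    bad.decode('utf-8')
    decode_failed = False
except UnicodeDecodeError:
    decode_failed = True
print(decode_failed)  # → True
```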

Simple Subprocess calls

A simplified call function allows timeouts on subprocesses and easy access to their output.

from shelljob import proc

# capture the output
output = proc.call( 'ls /tmp' )
# this raises a proc.Timeout exception
proc.call( 'sleep 10', timeout = 0.1 )
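The timeout behaviour is comparable to the stdlib timeout on subprocess.run, sketched below as a rough analogue (this is not shelljob's actual implementation, and it uses sys.executable instead of 'sleep' for portability):

```python
import subprocess
import sys

# Rough stdlib analogue: subprocess.run raises TimeoutExpired when the
# child outlives the timeout, much like proc.call raises proc.Timeout
try:
    subprocess.run(
        [sys.executable, '-c', 'import time; time.sleep(10)'],
        timeout=0.1,
    )
    timed_out = False
except subprocess.TimeoutExpired:
    timed_out = True
print(timed_out)  # → True
```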

Find

The 'find' function offers a multi-faceted approach to generating listings of files.

from shelljob import fs

files = fs.find( '/usr/local', name_regex = '.*\\.so' )
print( "\n".join(files) )
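If you only need the name_regex behaviour, a rough stdlib sketch looks like this (find_by_regex is a hypothetical helper written for illustration, not part of shelljob; see the API docs for fs.find's real parameters):

```python
import re
from pathlib import Path

def find_by_regex(root, name_regex):
    # Hypothetical helper: walk root recursively and keep entries whose
    # basename fully matches the regex, similar in spirit to fs.find
    rx = re.compile(name_regex)
    return sorted(str(p) for p in Path(root).rglob('*') if rx.fullmatch(p.name))
```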

Refer to the API docs for all parameters. Just let me know if there is some additional option you need.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

shelljob-0.6.2-py3-none-any.whl (9.9 kB, Python 3)

File details

Details for the file shelljob-0.6.2-py3-none-any.whl.

File metadata

  • Download URL: shelljob-0.6.2-py3-none-any.whl
  • Upload date:
  • Size: 9.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.2.0 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.6.9

File hashes

Hashes for shelljob-0.6.2-py3-none-any.whl:

  • SHA256: 5475168897c989f137e811ec91ddb1e6548b2bef13b62e4cb979b38a40499ab5
  • MD5: 7c593574c3be02eb03259836d81ed78a
  • BLAKE2b-256: a30ab488d768b74f11011849d0530ce1aab240502e810510361589181e2ff788

