
An async subprocess manager

Project description

A package for asynchronous subprocess pipelines:

import reel

tracks = ['track01', 'track02', 'http://w.w.w/track03']
plst = reel.Reel(
    [reel.cmd.ffmpeg.read(_) for _ in tracks],
    announce_to=print
)
dest = reel.cmd.sox.speakers()
norm = reel.cmd.ffmpeg.norm()
vol = reel.cmd.ffmpeg.volume(1.0)

async with plst | norm | vol | dest as transport:
    await transport.play()
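The `|` chaining above can be pictured as ordinary OS pipes wired between subprocesses. The sketch below is a rough illustration of that idea using plain asyncio on a POSIX system; `pipeline` is a hypothetical helper, not reel's actual implementation:

```python
import asyncio
import os

async def pipeline(argv_first, argv_second):
    """Run two commands, streaming the first's stdout into the second's stdin."""
    read_fd, write_fd = os.pipe()
    first = await asyncio.create_subprocess_exec(*argv_first, stdout=write_fd)
    os.close(write_fd)  # the child keeps its own copy of the write end
    second = await asyncio.create_subprocess_exec(
        *argv_second, stdin=read_fd, stdout=asyncio.subprocess.PIPE)
    os.close(read_fd)
    out, _ = await second.communicate()
    await first.wait()
    return out

# Equivalent of `echo hello | tr a-z A-Z` in a shell.
result = asyncio.run(pipeline(['echo', 'hello'], ['tr', 'a-z', 'A-Z']))
print(result.decode().strip())
```

reel's transports add buffering, error handling, and the `announce_to` hook on top of this basic plumbing.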

Motivation

This project provides simplified Python subprocess control with pipes and asynchronous support. It is being developed to support a music-streaming package that uses ffmpeg and other shell commands to move music from various sources to various destinations.

Mythology

The end goal is something like this:

import reel

reel.Spool

Logging

reel will log useful messages to a file called reel.log if you configure a log level (e.g. in bash):

$ export REEL_LOGGING_LEVEL='INFO'

That’s all you need to set.

The available log levels, ranked from most verbose (DEBUG) to least, are:

  • DEBUG - mostly useless information

  • INFO - mostly useful information

  • WARNING - might be a problem: suitable default for production

  • ERROR - something bad happened

  • CRITICAL - rare, show-stopping malfunction

  • NOTSET - the default: no logging
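An environment-driven level like this maps naturally onto the standard library's logging module. The sketch below is an assumption about how such wiring could look; `configure_logging` is a hypothetical name, not reel's API:

```python
import logging
import os

def configure_logging(logfile='reel.log'):
    """Enable file logging only when REEL_LOGGING_LEVEL is set.

    A sketch of the idea, not reel's actual internals: the env var name
    is looked up on the logging module, and NOTSET means no logging.
    """
    name = os.environ.get('REEL_LOGGING_LEVEL', 'NOTSET').upper()
    level = getattr(logging, name, logging.NOTSET)
    logger = logging.getLogger('reel')
    if level != logging.NOTSET:
        handler = logging.FileHandler(logfile)
        handler.setFormatter(
            logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
        logger.addHandler(handler)
        logger.setLevel(level)
    return logger

os.environ['REEL_LOGGING_LEVEL'] = 'INFO'
log = configure_logging('/tmp/reel.log')
log.info('pipeline started')
```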

By default, reel places the log file in $XDG_DATA_HOME. If $XDG_DATA_HOME is not set, reel chooses a suitable default directory. To view the choice, ask reel to print the current configuration (e.g. bash):

$ reel --config | grep LOGGING

For direct control, explicitly set the logging directory with:

$ export REEL_LOGGING_DIR='~/.local/share/reel'
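The resolution order implied above can be sketched as follows. The exact fallback reel uses is not documented here, so treat this ordering (explicit env var, then $XDG_DATA_HOME, then the XDG default of ~/.local/share) as an assumption, and `logging_dir` as a hypothetical helper:

```python
import os
from pathlib import Path

def logging_dir():
    """Pick the log directory: REEL_LOGGING_DIR wins, then $XDG_DATA_HOME,
    then the conventional XDG default of ~/.local/share."""
    explicit = os.environ.get('REEL_LOGGING_DIR')
    if explicit:
        return Path(explicit).expanduser()
    xdg = os.environ.get('XDG_DATA_HOME')
    if xdg:
        return Path(xdg) / 'reel'
    return Path.home() / '.local' / 'share' / 'reel'

os.environ['REEL_LOGGING_DIR'] = '~/.local/share/reel'
print(logging_dir())
```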

In addition to sending useful information to reel.log, reel also logs output produced by subprocesses. A subprocess can generate log files in two ways:

  1. The process might write its own log file (e.g. a web server).

    In this case, reel might be able to control where the log file is written if the command is configured in reel.cmd. For example, reel.cmd.icecast will automatically write its server log file to $REEL_LOGGING_DIR / icecast.log.

  2. You might decide to log stderr and/or stdout from a subprocess.

    You can decide what to do with any subprocess output, including logging it all to a file…
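Capturing a subprocess's stderr and appending it to a log file can be sketched with plain asyncio; `run_and_log` here is a hypothetical helper illustrating the second case, not part of reel's API:

```python
import asyncio

async def run_and_log(argv, logfile):
    """Run a command and append whatever it writes to stderr to a log file."""
    proc = await asyncio.create_subprocess_exec(
        *argv,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    out, err = await proc.communicate()
    with open(logfile, 'ab') as log:
        log.write(err)
    return out

# `ls` on a missing path complains on stderr; the complaint lands in the log.
asyncio.run(run_and_log(['ls', '/no/such/path'], '/tmp/subprocess.log'))
```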

In general, reel attempts to keep all log files in one directory and will sparingly create subdirectories if needed.

Download files

Download the file for your platform.

Source Distribution

reel-0.0.5.tar.gz (15.5 kB)

Uploaded: Source

Built Distribution


reel-0.0.5-py2.py3-none-any.whl (20.4 kB)

Uploaded: Python 2, Python 3

File details

Details for the file reel-0.0.5.tar.gz.

File metadata

  • Download URL: reel-0.0.5.tar.gz
  • Upload date:
  • Size: 15.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.6.2 requests-toolbelt/0.8.0 tqdm/4.29.1 CPython/3.7.2

File hashes

Hashes for reel-0.0.5.tar.gz:

  • SHA256: 3e872fa1bfaa3f963480ba228aed86dd5e7003864c1d8b7561a7d228cfa22d4a
  • MD5: 22354c036274d8b309fd711059236315
  • BLAKE2b-256: f14180bf987f16538d285afaa59f64aaca15755b7b039ee45eff0311415d7d40


File details

Details for the file reel-0.0.5-py2.py3-none-any.whl.

File metadata

  • Download URL: reel-0.0.5-py2.py3-none-any.whl
  • Upload date:
  • Size: 20.4 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.6.2 requests-toolbelt/0.8.0 tqdm/4.29.1 CPython/3.7.2

File hashes

Hashes for reel-0.0.5-py2.py3-none-any.whl:

  • SHA256: df5a64f6ee594f3a7db93506ff43e46d215bb5eb278b311816f79b26bffd2432
  • MD5: 070130fcffa9dbecc24da171156b0344
  • BLAKE2b-256: 024492544832ace9e6d1be94e9b91d4894729d4f1286dbc239de9bd4177ba194

