
mpmq


Mpmq is an abstraction of the Python multiprocessing library providing execution pooling and message queuing capabilities. Mpmq can scale execution of a specified function across multiple background processes. It creates a log handler that sends all log messages from the running processes to a thread-safe queue; the main process reads the messages off the queue for processing. The number of processes, along with the arguments to provide to each process, is specified as a list of dictionaries: the number of elements in the list dictates the total number of processes to execute. The result of each function is read from the result queue and written to the respective dictionary element upon completion.

The main features are:

  • execute function across multiple processes
  • queue function execution
  • create log handler that sends function log messages to thread-safe message queue
  • process messages from log message queue
  • maintain result of all executed functions
  • terminate execution using keyboard interrupt
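The queue-based logging described above can be sketched with the standard library alone: each worker attaches a `logging.handlers.QueueHandler` bound to a shared queue, and the main process drains the queue. This is a hand-rolled illustration of the pattern, not mpmq's actual implementation; the `worker` and `collect_logs` names are invented for the sketch.

```python
import logging
import logging.handlers
import multiprocessing

def worker(queue, offset):
    # each worker logs through a QueueHandler bound to the shared queue
    logger = logging.getLogger(f"worker-{offset}")
    logger.setLevel(logging.INFO)
    logger.addHandler(logging.handlers.QueueHandler(queue))
    logger.info("hello from offset %s", offset)

def collect_logs(count=3):
    # fork context (POSIX-only) keeps this demo simple and deterministic
    ctx = multiprocessing.get_context("fork")
    queue = ctx.Queue()
    procs = [ctx.Process(target=worker, args=(queue, i)) for i in range(count)]
    for p in procs:
        p.start()
    # each worker emits exactly one record, so read one per process
    messages = [queue.get().getMessage() for _ in range(count)]
    for p in procs:
        p.join()
    return sorted(messages)

if __name__ == "__main__":
    for message in collect_logs():
        print(message)
```

mpmq layers process pooling, result bookkeeping, and control messages on top of this basic mechanism.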

Installation

pip install mpmq

Examples

A simple example using mpmq:

from mpmq import MPmq
import sys, logging
logger = logging.getLogger(__name__)
logging.basicConfig(
    stream=sys.stdout,
    level=logging.INFO,
    format="%(asctime)s %(processName)s [%(funcName)s] %(levelname)s %(message)s")

def do_work(*args):
    logger.info(f"hello from process {args[0]['pid']}")
    return 10 + int(args[0]['pid'])

process_data = [{'pid': item} for item in range(3)]
MPmq(function=do_work, process_data=process_data).execute()
print(f"Total items processed {sum([item['result'] for item in process_data])}")

Executing the code above results in output similar to the following (process ids and message interleaving will vary between runs):

MainProcess [start_next_process] INFO started background process at offset:0 with id:1967 name:Process-1
MainProcess [start_next_process] INFO started background process at offset:1 with id:1968 name:Process-2
Process-1 [do_work] INFO hello from process 0
MainProcess [start_next_process] INFO started background process at offset:2 with id:1969 name:Process-3
MainProcess [start_processes] INFO started 3 background processes
Process-1 [_queue_handler] DEBUG adding 'do_work' offset:0 result to result queue
Process-1 [_queue_handler] DEBUG execution of do_work offset:0 ended
Process-1 [_queue_handler] DEBUG DONE
MainProcess [remove_active_process] INFO process at offset:0 id:1967 name:Process-1 has completed
Process-2 [do_work] INFO hello from process 1
Process-3 [do_work] INFO hello from process 2
Process-2 [_queue_handler] DEBUG adding 'do_work' offset:1 result to result queue
Process-3 [_queue_handler] DEBUG adding 'do_work' offset:2 result to result queue
Process-2 [_queue_handler] DEBUG execution of do_work offset:1 ended
Process-2 [_queue_handler] DEBUG DONE
Process-3 [_queue_handler] DEBUG execution of do_work offset:2 ended
Process-3 [_queue_handler] DEBUG DONE
MainProcess [process_control_message] INFO the to process queue is empty
MainProcess [remove_active_process] INFO process at offset:1 id:1968 name:Process-2 has completed
MainProcess [process_control_message] INFO the to process queue is empty
MainProcess [remove_active_process] INFO process at offset:2 id:1969 name:Process-3 has completed
MainProcess [process_control_message] INFO the to process queue is empty
MainProcess [run] INFO there are no more active processes - quitting
MainProcess [join_processes] INFO joined process at offset:0 with id:1967 name:Process-1
MainProcess [join_processes] INFO joined process at offset:1 with id:1968 name:Process-2
MainProcess [join_processes] INFO joined process at offset:2 with id:1969 name:Process-3
Total items processed 33
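The result bookkeeping shown above (each function's return value written back into its dictionary element under the `result` key) can be mimicked with plain multiprocessing. The sketch below is a rough stand-in for what `MPmq.execute` does, not mpmq's actual code; the `execute` helper is invented for illustration.

```python
import multiprocessing

def do_work(data):
    return 10 + int(data["pid"])

def execute(function, process_data):
    # rough stand-in for MPmq.execute(): one process per dict element,
    # each return value written back under the 'result' key
    ctx = multiprocessing.get_context("fork")  # POSIX-only, for a deterministic demo
    results = ctx.Queue()

    def runner(offset, data):
        # tag the result with its offset so it lands in the right element
        results.put((offset, function(data)))

    procs = [ctx.Process(target=runner, args=(i, d)) for i, d in enumerate(process_data)]
    for p in procs:
        p.start()
    for _ in procs:
        offset, value = results.get()
        process_data[offset]["result"] = value
    for p in procs:
        p.join()

if __name__ == "__main__":
    process_data = [{"pid": item} for item in range(3)]
    execute(do_work, process_data)
    print(f"Total items processed {sum(item['result'] for item in process_data)}")
```

With three workers returning 10, 11, and 12, the printed total is 33, matching the example output above.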

Projects using mpmq

  • mpcurses: an abstraction of the Python curses and multiprocessing libraries providing function execution and runtime visualization capabilities

  • mp4ansi: a simple ANSI-based terminal emulator that provides multi-processing capabilities

Development

Clone the repository and ensure the latest version of Docker is installed on your development server.

Build the Docker image:

docker image build \
-t mpmq:latest .

Run the Docker container:

docker container run \
--rm \
-it \
-v $PWD:/code \
mpmq:latest \
/bin/sh

Execute the build:

pyb -X
