Mpmq is an abstraction of the Python multiprocessing library providing execution pooling and message queuing capabilities.

Project description

mpmq

The mpmq module provides a convenient way to scale execution of a function across multiple input values by distributing the input across a specified number of background processes. It also provides the means for the caller to intercept and process messages from the background processes while they execute the function. It does this by configuring a custom log handler that sends the function's log messages to a thread-safe queue; several APIs are provided for the caller to process the messages from the message queue.

The number of processes along with the input data for each process is specified as a list of dictionaries. The number of elements in the list dictates the total number of processes to execute. The result of each function invocation is returned as a list to the caller after all background workers complete.

The main features are:

  • execute function across multiple processes
  • queue function execution
  • create log handler that sends function log messages to thread-safe message queue
  • process messages from log message queue
  • maintain result of all executed functions
  • terminate execution using keyboard interrupt

Installation

pip install mpmq

MPmq class

mpmq.MPmq(function, process_data=None, shared_data=None, processes_to_start=None)

function - the function to execute

process_data - list of dictionaries where each dictionary describes the input data that will be sent to each background process executing the function; the length of the list dictates the total number of processes that will be executed

shared_data - a dictionary containing arbitrary data that will be sent to all processes

processes_to_start - the number of processes to start initially; this represents the maximum number of concurrently running processes. If the total number of processes is greater than this number, the remaining executions are queued and started as running processes complete, so that this level of concurrency is maintained
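
A rough sketch (not from the project documentation) of how these arguments might be combined; the worker function, the shared_data value, and the concurrency of 3 are illustrative assumptions, and exactly how shared_data is delivered to the function is not shown here:

from mpmq import MPmq

def do_work(*args):
    # args[0] is the per-process dictionary taken from process_data
    return args[0]['pid'] * 2

# ten dictionaries -> ten total processes
process_data = [{'pid': item} for item in range(10)]

client = MPmq(
    function=do_work,
    process_data=process_data,
    shared_data={'label': 'example'},  # arbitrary data sent to all processes
    processes_to_start=3)  # at most 3 processes run concurrently; the rest are queued

results = client.execute()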

execute(raise_if_error=False)

Start execution of the function across the background processes and return a list containing the result of each invocation. If raise_if_error is set to True, an exception is raised if any function encountered an error during execution.
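
A sketch of surfacing worker failures as an exception; the deliberately failing do_work below is purely illustrative:

from mpmq import MPmq

def do_work(*args):
    # simulate a failure in the background process
    raise ValueError('simulated failure')

try:
    MPmq(function=do_work, process_data=[{}]).execute(raise_if_error=True)
except Exception as error:
    print(f'a background process failed: {error}')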

process_message(offset, message)

Process a message sent from one of the background processes executing the function. The offset represents the index of the executing Process; this number is the same as the corresponding index within the process_data list that was sent to the constructor. The message represents the message that was logged by the function.
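
A common pattern is to subclass MPmq and override process_message to intercept worker messages; the sketch below assumes that override pattern, and the MyMPmq class name is illustrative:

from mpmq import MPmq

class MyMPmq(MPmq):

    def process_message(self, offset, message):
        # offset indexes into process_data; message is the text logged by the function
        print(f'worker {offset}: {message}')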

Examples

A simple example using mpmq:

from mpmq import MPmq
import sys, logging
logger = logging.getLogger(__name__)
logging.basicConfig(stream=sys.stdout, level=logging.INFO, format="%(processName)s [%(funcName)s] %(levelname)s %(message)s")

def do_work(*args):
    # each background process receives its dictionary from process_data
    logger.info(f"hello from process: {args[0]['pid']}")
    return 10 + int(args[0]['pid'])

# three dictionaries -> three background processes
process_data = [{'pid': item} for item in range(3)]
results = MPmq(function=do_work, process_data=process_data).execute()
print(f"Total items processed {sum([result for result in results])}")

Executing the code above produces output similar to the following:

MainProcess [start_next_process] INFO started background process at offset:0 with id:4430 name:Process-1
Process-1 [do_work] INFO hello from process: 0
MainProcess [start_next_process] INFO started background process at offset:1 with id:4431 name:Process-2
Process-1 [_queue_handler] DEBUG adding 'do_work' offset:0 result to result queue
Process-2 [do_work] INFO hello from process: 1
MainProcess [start_next_process] INFO started background process at offset:2 with id:4433 name:Process-3
MainProcess [start_processes] INFO started 3 background processes
Process-3 [do_work] INFO hello from process: 2
Process-2 [_queue_handler] DEBUG adding 'do_work' offset:1 result to result queue
Process-1 [_queue_handler] DEBUG execution of do_work offset:0 ended
Process-3 [_queue_handler] DEBUG adding 'do_work' offset:2 result to result queue
Process-1 [_queue_handler] DEBUG DONE
MainProcess [complete_process] INFO process at offset:0 id:4430 name:Process-1 has completed
Process-2 [_queue_handler] DEBUG execution of do_work offset:1 ended
Process-2 [_queue_handler] DEBUG DONE
MainProcess [complete_process] INFO joining process at offset:0 with id:4430 name:Process-1
Process-3 [_queue_handler] DEBUG execution of do_work offset:2 ended
Process-3 [_queue_handler] DEBUG DONE
MainProcess [process_control_message] INFO the to process queue is empty
MainProcess [complete_process] INFO process at offset:1 id:4431 name:Process-2 has completed
MainProcess [complete_process] INFO joining process at offset:1 with id:4431 name:Process-2
MainProcess [process_control_message] INFO the to process queue is empty
MainProcess [complete_process] INFO process at offset:2 id:4433 name:Process-3 has completed
MainProcess [complete_process] INFO joining process at offset:2 with id:4433 name:Process-3
MainProcess [process_control_message] INFO the to process queue is empty
MainProcess [run] INFO there are no more active processses - quitting
>>> print(f"Total items processed {sum([result for result in results])}")
Total items processed 33

Projects using mpmq

  • mpcurses An abstraction of the Python curses and multiprocessing libraries providing function execution and runtime visualization capabilities

  • mppbars Scale execution of a function across a number of background processes while displaying the execution status of each process via a progress bar

  • mp4ansi A simple ANSI-based terminal emulator that provides multi-processing capabilities

Development

Clone the repository and ensure the latest version of Docker is installed on your development server.

Build the Docker image:

docker image build \
-t mpmq:latest .

Run the Docker container:

docker container run \
--rm \
-it \
-v $PWD:/code \
mpmq:latest \
bash

Execute the build:

pyb -X

Download files

Download the file for your platform.

Source Distribution

mpmq-0.4.0.tar.gz (9.1 kB)


Built Distribution

mpmq-0.4.0-py3-none-any.whl (8.8 kB)


File details

Details for the file mpmq-0.4.0.tar.gz.

File metadata

  • Download URL: mpmq-0.4.0.tar.gz
  • Upload date:
  • Size: 9.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for mpmq-0.4.0.tar.gz
Algorithm Hash digest
SHA256 68aeae6bfbc12ec050eb9c925053ca238a9806a7c20c511167e2317eef11aec7
MD5 421c6823ff08bef6a83f7c808fb69060
BLAKE2b-256 1bb976619a6f041963eeeda824ec79622256f78f412f43450164a563921a4b17

File details

Details for the file mpmq-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: mpmq-0.4.0-py3-none-any.whl
  • Upload date:
  • Size: 8.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for mpmq-0.4.0-py3-none-any.whl
Algorithm Hash digest
SHA256 1f3becc1611ea0b913b92d0febe4e96fe1dfe14d2faca51ca3e3a30a4c173615
MD5 a1208d162e92b5392b630750acb8feb6
BLAKE2b-256 05bf68801cc396d852f65d41db737cae975751eb23d75ff8b78d2ee3d2fee257