
Mpmq is an abstraction of the Python multiprocessing library providing execution pooling and message queuing capabilities.


mpmq


The mpmq module provides a convenient way to scale execution of a function across multiple input values by distributing the input across a specified number of background processes. It also provides a way for the caller to intercept and process messages from the background processes while they execute the function. It does this by configuring a custom log handler that sends the function's log messages to a thread-safe queue; several APIs are provided for the caller to process the messages from the message queue.

The number of processes, along with the input data for each process, is specified as a list of dictionaries; the number of elements in the list dictates the total number of processes to execute. The result of each function is returned as a list to the caller after all background workers complete.

The main features are:

  • execute function across multiple processes
  • queue function execution
  • create log handler that sends function log messages to thread-safe message queue
  • process messages from log message queue
  • maintain result of all executed functions
  • terminate execution using keyboard interrupt

Installation

pip install mpmq

MPmq class

mpmq.MPmq(function, process_data=None, shared_data=None, processes_to_start=None)

function - the function to execute

process_data - list of dictionaries where each dictionary describes the input data that will be sent to each background process executing the function; the length of the list dictates the total number of processes that will be executed

shared_data - a dictionary containing arbitrary data that will be sent to all processes

processes_to_start - the number of processes to start initially; this represents the maximum number of concurrent processes that will be running. If the total number of processes is greater than this number, the remaining executions are queued and run as earlier processes complete, so that this level of concurrency is maintained
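
For illustration, the sketch below (hypothetical function and data values, assuming the constructor and execute signatures documented here) dispatches ten executions while keeping at most three background processes running at any time; the remaining executions are queued:

import sys
import logging

from mpmq import MPmq

logger = logging.getLogger(__name__)
logging.basicConfig(stream=sys.stdout, level=logging.INFO)

def double(*args):
    # args[0] is the process_data dictionary assigned to this process
    logger.info(f"doubling item {args[0]['item']}")
    return args[0]['item'] * 2

# ten dictionaries -> ten executions, but only three concurrent background processes
process_data = [{'item': item} for item in range(10)]
results = MPmq(
    function=double,
    process_data=process_data,
    processes_to_start=3).execute()
print(results)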

execute(raise_if_error=False)

Start execution of the background processes. The result of each executed function is returned as a list to the caller once all background workers have completed. If raise_if_error is set to True, an exception will be raised if any function encountered an error during execution.
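
A minimal sketch of error handling with raise_if_error, assuming the signatures documented above; the failing function and its trigger condition are illustrative only:

from mpmq import MPmq

def flaky(*args):
    # simulate a failure in one of the background processes (illustrative only)
    if args[0]['pid'] == 1:
        raise ValueError('simulated failure')
    return args[0]['pid']

try:
    results = MPmq(
        function=flaky,
        process_data=[{'pid': pid} for pid in range(3)]).execute(raise_if_error=True)
except Exception as error:
    # with raise_if_error=True an exception is raised if any function errored
    print(f'a background process failed: {error}')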

process_message(offset, message)

Process a message sent from one of the background processes executing the function. The offset represents the index of the executing Process; this number is the same as the corresponding index within the process_data list that was sent to the constructor. The message represents the message that was logged by the function.
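
For example, a hedged sketch of intercepting worker messages by overriding process_message in a subclass; the subclass name and its message-counting logic are illustrative assumptions, not part of the documented API:

import sys
import logging

from mpmq import MPmq

logger = logging.getLogger(__name__)
logging.basicConfig(stream=sys.stdout, level=logging.INFO)

class CountingMPmq(MPmq):
    # hypothetical subclass that counts the messages logged by each worker
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.message_counts = {}

    def process_message(self, offset, message):
        # offset matches the index of the worker's entry in process_data
        super().process_message(offset, message)
        self.message_counts[offset] = self.message_counts.get(offset, 0) + 1

def do_work(*args):
    logger.info(f"hello from process: {args[0]['pid']}")
    return args[0]['pid']

client = CountingMPmq(function=do_work, process_data=[{'pid': pid} for pid in range(3)])
results = client.execute()
print(client.message_counts)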

Examples

A simple example using mpmq:

import sys
import logging

from mpmq import MPmq

logger = logging.getLogger(__name__)
logging.basicConfig(
    stream=sys.stdout,
    level=logging.INFO,
    format="%(processName)s [%(funcName)s] %(levelname)s %(message)s")

def do_work(*args):
    # each background process receives its process_data dictionary as the first argument
    logger.info(f"hello from process: {args[0]['pid']}")
    return 10 + int(args[0]['pid'])

# three dictionaries -> three background processes
process_data = [{'pid': item} for item in range(3)]
results = MPmq(function=do_work, process_data=process_data).execute()
print(f"Total items processed {sum(results)}")

Executing the code above results in output similar to the following (process ids and message interleaving will vary):

MainProcess [start_next_process] INFO started background process at offset:0 with id:4430 name:Process-1
Process-1 [do_work] INFO hello from process: 0
MainProcess [start_next_process] INFO started background process at offset:1 with id:4431 name:Process-2
Process-1 [_queue_handler] DEBUG adding 'do_work' offset:0 result to result queue
Process-2 [do_work] INFO hello from process: 1
MainProcess [start_next_process] INFO started background process at offset:2 with id:4433 name:Process-3
MainProcess [start_processes] INFO started 3 background processes
Process-3 [do_work] INFO hello from process: 2
Process-2 [_queue_handler] DEBUG adding 'do_work' offset:1 result to result queue
Process-1 [_queue_handler] DEBUG execution of do_work offset:0 ended
Process-3 [_queue_handler] DEBUG adding 'do_work' offset:2 result to result queue
Process-1 [_queue_handler] DEBUG DONE
MainProcess [complete_process] INFO process at offset:0 id:4430 name:Process-1 has completed
Process-2 [_queue_handler] DEBUG execution of do_work offset:1 ended
Process-2 [_queue_handler] DEBUG DONE
MainProcess [complete_process] INFO joining process at offset:0 with id:4430 name:Process-1
Process-3 [_queue_handler] DEBUG execution of do_work offset:2 ended
Process-3 [_queue_handler] DEBUG DONE
MainProcess [process_control_message] INFO the to process queue is empty
MainProcess [complete_process] INFO process at offset:1 id:4431 name:Process-2 has completed
MainProcess [complete_process] INFO joining process at offset:1 with id:4431 name:Process-2
MainProcess [process_control_message] INFO the to process queue is empty
MainProcess [complete_process] INFO process at offset:2 id:4433 name:Process-3 has completed
MainProcess [complete_process] INFO joining process at offset:2 with id:4433 name:Process-3
MainProcess [process_control_message] INFO the to process queue is empty
MainProcess [run] INFO there are no more active processes - quitting
Total items processed 33

Projects using mpmq

  • mpcurses - An abstraction of the Python curses and multiprocessing libraries providing function execution and runtime visualization capabilities.

  • mp4ansi - A simple ANSI-based terminal emulator that provides multi-processing capabilities.

Development

Clone the repository and ensure the latest version of Docker is installed on your development server.

Build the Docker image:

docker image build \
-t mpmq:latest .

Run the Docker container:

docker container run \
--rm \
-it \
-v $PWD:/code \
mpmq:latest \
/bin/bash

Execute the build:

pyb -X
