mpservice

Serving with asyncio, multiprocessing, and batching
Utilities for Python concurrency, including:

- Serving with multiprocessing to make full use of multiple cores, and batching to take advantage of vectorized computation if some components of the service have that capability. One use case is machine learning model serving, although the code is generic and not restricted to this particular use case.
- Stream processing, i.e. processing a long, possibly infinite, stream of input data with multiple operators in the pipeline. A main use case is when one or more of the operators are I/O bound (think: calling an external service) and hence can benefit from concurrency.
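The batching idea described above can be sketched with the standard library alone. This is not mpservice's API; the names `batched` and `predict_batch` are hypothetical stand-ins illustrating how individual requests can be grouped so that a vectorized component handles a whole batch at once:

```python
from itertools import islice

def batched(stream, batch_size):
    # Group an iterable into consecutive lists of up to `batch_size` items.
    it = iter(stream)
    while batch := list(islice(it, batch_size)):
        yield batch

def predict_batch(batch):
    # Stand-in for a vectorized model call that scores a whole batch at once.
    return [x * 10 for x in batch]

# Feed single requests through the batcher, then flatten the batched results.
results = [y for batch in batched(range(7), 3) for y in predict_batch(batch)]
```

mpservice handles the batching (and the cross-process plumbing) for you; the sketch only shows why batching pays off when the underlying computation is vectorized.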
The serving and streaming utilities can be combined: an mpservice.mpserver.Server instance, while doing the heavy lifting in other processes, acts as an I/O-bound operator in the main process. Indeed, mpservice.mpserver.Server provides the method stream for using the server to process data streams.
A Server object can be used either in "embedded" mode or to back a service. For the latter use case, utilities are provided in mpservice.http, mpservice.socket, and mpservice.pipe.
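The "I/O-bound operator in the main process" pattern can be illustrated with the standard library. This is a conceptual sketch, not mpservice's API; `call_service` and `stream_apply` are hypothetical names standing in for an operator whose real work happens elsewhere (another process or an external service):

```python
from concurrent.futures import ThreadPoolExecutor

def call_service(x):
    # Stand-in for an I/O-bound operator, e.g. a request to an external
    # service, or to a server doing its real work in other processes.
    return x + 1

def stream_apply(items, func, workers=4):
    # Apply `func` to a stream of items with a pool of concurrent workers,
    # yielding results in input order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        yield from pool.map(func, items)

results = list(stream_apply(range(5), call_service))
```

Because the main process only waits on results, many such calls can be in flight concurrently, which is what makes a Server instance compose naturally into a stream pipeline.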
The package also contains some other related utilities.
To install, do pip install mpservice.
Status
Production ready. Under active development.
Hashes for mpservice-0.11.5-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | b2a7e5c410b1b4e477192a3483ff23f3a3645d56df16be95d8796e6ec195c5ca
MD5 | 514ecf9163f1f96b0e44ec07effe0f06
BLAKE2b-256 | 8c7b1d4757d821f99a520168c796b4b292083c78de05a9f83b97580ddaf35bee