MOSEC

Model Serving made Efficient in the Cloud.

Introduction

Mosec is a high-performance and flexible model serving framework for building ML model-enabled backends and microservices. It bridges the gap between the machine learning model you just trained and an efficient online service API.

  • Highly performant: the web layer and task coordination are built with Rust 🦀, which offers blazing speed in addition to efficient CPU utilization powered by async I/O
  • Ease of use: the user interface is purely in Python 🐍, so users can serve their models in an ML framework-agnostic manner using the same code they use for offline testing
  • Dynamic batching: aggregates requests from different users for batched inference and distributes the results back
  • Pipelined stages: spawns multiple processes for pipelined stages to handle CPU/GPU/IO mixed workloads
  • Cloud friendly: designed to run in the cloud, with model warmup, graceful shutdown, and Prometheus monitoring metrics; easily managed by Kubernetes or any container orchestration system
  • Do one thing well: focuses on the online serving part, so users can concentrate on model optimization and business logic

Installation

Mosec requires Python 3.8 or above (prebuilt wheels cover CPython 3.8–3.12 and PyPy 3.8–3.10). Install the latest PyPI package for Linux x86_64 or macOS x86_64 with:

pip install -U mosec

To build from the source code, install Rust and run the following command:

make package

You will get a mosec wheel file in the dist folder.

Usage

We demonstrate how Mosec can help you easily host a pre-trained stable diffusion model as a service. You need to install diffusers and transformers as prerequisites:

pip install --upgrade diffusers[torch] transformers

Write the server

Below is the server code, with explanations.

First, we import the libraries and set up a basic logger so we can better observe what happens.

from io import BytesIO
from typing import List

import torch  # type: ignore
from diffusers import StableDiffusionPipeline  # type: ignore

from mosec import Server, Worker, get_logger
from mosec.mixin import MsgpackMixin

logger = get_logger()

Then, we build an API that lets clients send a text prompt and receive an image generated by the stable-diffusion-v1-5 model, in just 3 steps.

  1. Define your service as a class that inherits mosec.Worker. Here we also inherit MsgpackMixin to employ the msgpack serialization format(a).

  2. Inside the __init__ method, initialize your model and put it onto the corresponding device. Optionally you can assign self.example with some data to warm up(b) the model. Note that the data should be compatible with your handler's input format, which we detail next.

  3. Override the forward method to write your service handler(c), with the signature forward(self, data: Any | List[Any]) -> Any | List[Any]. Whether it receives/returns a single item or a list depends on whether dynamic batching(d) is configured.

class StableDiffusion(MsgpackMixin, Worker):
    def __init__(self):
        self.pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
        )
        device = "cuda" if torch.cuda.is_available() else "cpu"
        self.pipe = self.pipe.to(device)
        self.example = ["useless example prompt"] * 4  # warmup (batch_size=4)

    def forward(self, data: List[str]) -> List[memoryview]:
        logger.debug("generate images for %s", data)
        res = self.pipe(data)
        logger.debug("NSFW: %s", res[1])
        images = []
        for img in res[0]:
            dummy_file = BytesIO()
            img.save(dummy_file, format="JPEG")
            images.append(dummy_file.getbuffer())
        return images

[!NOTE]

(a) In this example we return an image in binary format, which JSON does not support (unless encoded with base64, which makes the payload larger). Hence, msgpack suits our needs better. If we do not inherit MsgpackMixin, JSON is used by default. In other words, the protocol of the service request/response can be msgpack, JSON, or any other format (check our mixins).

(b) Warm-up usually helps to allocate GPU memory in advance. If a warm-up example is specified, the service will only become ready after the example has been forwarded through the handler. However, if no example is given, the first request's latency is expected to be longer. The example should be set as a single item or a list, depending on what forward expects to receive. Moreover, if you want to warm up with multiple different examples, you may set multi_examples (demo here).

(c) This example shows a single-stage service, where the StableDiffusion worker directly takes in the client's prompt and responds with the image. Thus its forward method can be considered a complete service handler. However, we can also design a multi-stage service with workers doing different jobs (e.g., downloading images, model inference, post-processing) in a pipeline. In this case, the whole pipeline is considered the service handler, with the first worker taking in the request and the last worker sending out the response. Data flows between workers via inter-process communication.

(d) Since dynamic batching is enabled in this example, the forward method will receive a list of strings, e.g., ['a cute cat playing with a red ball', 'a man sitting in front of a computer', ...], aggregated from different clients for batched inference, improving the system throughput.
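To make note (b) concrete, here is a minimal sketch of a worker warmed up with multiple examples. It assumes each multi_examples entry is one warm-up payload whose shape matches what forward expects (a batch of prompts here); check the linked demo for the exact usage.

from typing import List

from mosec import Worker

class WarmedWorker(Worker):
    def __init__(self):
        # instead of `self.example`, warm up with several different batches;
        # each entry is assumed to be the payload of one warm-up call
        self.multi_examples = [
            ["a cute cat"] * 4,
            ["a man sitting in front of a computer"],
        ]

    def forward(self, data: List[str]) -> List[int]:
        return [len(prompt) for prompt in data]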

Finally, we append the worker to the server to construct a single-stage workflow (multiple stages can be pipelined to further boost the throughput; see this example and the sketch after the code below), and specify the number of processes we want it to run in parallel (num=1) and the maximum batch size (max_batch_size=4, the maximum number of requests dynamic batching will accumulate before the timeout). The timeout itself is defined by max_wait_time=10 in milliseconds: the longest time Mosec waits before sending the accumulated batch to the worker.

if __name__ == "__main__":
    server = Server()
    # 1) `num` specifies the number of processes that will be spawned to run in parallel.
    # 2) By configuring the `max_batch_size` with the value > 1, the input data in your
    # `forward` function will be a list (batch); otherwise, it's a single item.
    server.append_worker(StableDiffusion, num=1, max_batch_size=4, max_wait_time=10)
    server.run()
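
As mentioned in note (c), stages can also be pipelined. Below is a hedged sketch of a two-stage pipeline with hypothetical Preprocess and Inference workers (not part of the example above): the first stage deserializes the client request (JSON by default) and the last stage serializes the response, while data between stages travels over inter-process communication.

from typing import List

from mosec import Server, Worker

class Preprocess(Worker):
    def forward(self, data: dict) -> str:
        # CPU-bound work: extract the prompt from the JSON request
        return data.get("prompt", "")

class Inference(Worker):
    def forward(self, data: List[str]) -> List[str]:
        # stand-in for batched GPU inference
        return [text.upper() for text in data]

if __name__ == "__main__":
    server = Server()
    server.append_worker(Preprocess, num=4)  # scale the CPU stage independently
    server.append_worker(Inference, num=1, max_batch_size=8, max_wait_time=10)
    server.run()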

Run the server

Here is how to run and query the server.

The above snippets are merged into our example file, which you can run directly from the project root. First, have a look at the command line arguments (explanations here):

python examples/stable_diffusion/server.py --help

Then let's start the server with debug logs:

python examples/stable_diffusion/server.py --log-level debug --timeout 30000

Open http://127.0.0.1:8000/openapi/swagger/ in your browser to get the OpenAPI doc.

And in another terminal, test it:

python examples/stable_diffusion/client.py --prompt "a cute cat playing with a red ball" --output cat.jpg --port 8000

You will get an image named "cat.jpg" in the current directory.
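
For reference, the client script does roughly the following. This is a minimal sketch assuming mosec's default /inference route, the msgpack protocol used by the server above, and that the msgpack and requests packages are installed:

import msgpack  # type: ignore
import requests  # type: ignore

# pack the prompt with msgpack, matching the server's MsgpackMixin
resp = requests.post(
    "http://127.0.0.1:8000/inference",
    data=msgpack.packb("a cute cat playing with a red ball"),
)
resp.raise_for_status()
# the response body is the msgpack-encoded JPEG bytes
with open("cat.jpg", "wb") as f:
    f.write(msgpack.unpackb(resp.content))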

You can check the metrics:

curl http://127.0.0.1:8000/metrics

That's it! You have just hosted your stable-diffusion model as a service! 😉

Examples

More ready-to-use examples can be found in the Example section.

Configuration

  • Dynamic batching
    • max_batch_size and max_wait_time (in milliseconds) are configured when you call append_worker.
    • Make sure inference with the max_batch_size value won't cause out-of-memory errors on the GPU.
    • Normally, max_wait_time should be less than the batch inference time.
    • If enabled, the service collects a batch either when the number of accumulated requests reaches max_batch_size or when max_wait_time has elapsed. The service benefits from this feature when traffic is high. See the sketch after this list for how the batch size changes the handler's input.
  • Check the arguments doc for other configurations.
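
A hedged illustration of the two handler shapes (the worker names here are hypothetical):

from typing import List

from mosec import Worker

class Echo(Worker):
    # appended with max_batch_size=1 (the default): one item in, one item out
    def forward(self, data: dict) -> dict:
        return data

class BatchEcho(Worker):
    # appended with max_batch_size > 1: a list in, a list of equal length out
    def forward(self, data: List[dict]) -> List[dict]:
        return data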

Deployment

  • If you're looking for a GPU base image with mosec installed, check the official image mosecorg/mosec. For more complex use cases, check out envd.
  • This service doesn't need Gunicorn or NGINX, but you can certainly use the ingress controller when necessary.
  • This service should be the PID 1 process in the container since it controls multiple processes. If you need to run multiple processes in one container, you will need a supervisor. You may choose Supervisor or Horust.
  • Remember to collect the metrics.
    • mosec_service_batch_size_bucket shows the batch size distribution.
    • mosec_service_batch_duration_second_bucket shows the duration of dynamic batching for each connection in each stage (starts from receiving the first task).
    • mosec_service_process_duration_second_bucket shows the duration of processing for each connection in each stage (including the IPC time but excluding the mosec_service_batch_duration_second_bucket).
    • mosec_service_remaining_task shows the number of currently processing tasks.
    • mosec_service_throughput shows the service throughput.
  • Stop the service with SIGINT (CTRL+C) or SIGTERM (kill {PID}) since it has the graceful shutdown logic.

Performance tuning

  • Find out the best max_batch_size and max_wait_time for your inference service. The metrics will show the histograms of the real batch size and batch duration. Those are the key information to adjust these two parameters.
  • Try to split the whole inference process into separate CPU and GPU stages (ref DistilBERT). Different stages will be run in a data pipeline, which will keep the GPU busy.
  • You can also adjust the number of workers in each stage. For example, if your pipeline consists of a CPU stage for preprocessing and a GPU stage for model inference, increasing the number of CPU-stage workers can help to produce more data to be batched for model inference at the GPU stage; increasing the GPU-stage workers can fully utilize the GPU memory and computation power. Both ways may contribute to higher GPU utilization, which consequently results in higher service throughput.
  • For multi-stage services, note that the data passing between stages is serialized/deserialized by the serialize_ipc/deserialize_ipc methods, so extremely large payloads might slow down the whole pipeline. The serialized data is passed to the next stage through Rust by default; you can enable shared memory to potentially reduce the latency (ref RedisShmIPCMixin).
  • Choose appropriate serialize/deserialize methods, which are used to decode the user request and encode the response. By default, both use JSON. However, images and embeddings are not well supported by JSON; you can choose msgpack, which is faster and supports binary payloads (ref Stable Diffusion).
  • Configure the number of threads for OpenBLAS or MKL, since these libraries may not pick the thread count best suited to the CPUs available to the current Python process. You can configure this for each worker via the env argument (ref custom GPU allocation); see the sketch below.
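
A hedged sketch of per-worker environment variables through env, following the custom GPU allocation example; the worker and the variable values are illustrative:

from typing import List

from mosec import Server, Worker

class Inference(Worker):
    def forward(self, data: List[str]) -> List[str]:
        return data  # placeholder for real inference

if __name__ == "__main__":
    server = Server()
    server.append_worker(
        Inference,
        num=2,
        max_batch_size=8,
        # one dict per spawned worker process (assumed `env` usage):
        # pin thread counts and assign one GPU per worker
        env=[
            {"OMP_NUM_THREADS": "4", "CUDA_VISIBLE_DEVICES": "0"},
            {"OMP_NUM_THREADS": "4", "CUDA_VISIBLE_DEVICES": "1"},
        ],
    )
    server.run()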

Adopters

Several companies and individual users are using Mosec in production.

Citation

If you find this software useful for your research, please consider citing

@software{yang2021mosec,
  title = {{MOSEC: Model Serving made Efficient in the Cloud}},
  author = {Yang, Keming and Liu, Zichen and Cheng, Philip},
  url = {https://github.com/mosecorg/mosec},
  year = {2021}
}

Contributing

We welcome any kind of contribution. Please give us feedback by raising issues or discussing on Discord. You can also contribute code directly by opening a pull request!

To start developing, you can use envd to create an isolated and clean Python & Rust environment. Check the envd docs or build.envd for more information.
