MOSEC

Model Serving made Efficient in the Cloud.

Introduction


Mosec is a high-performance and flexible model serving framework for building ML-model-enabled backends and microservices. It bridges the gap between the machine learning model you just trained and an efficient online service API.

  • Highly performant: web layer and task coordination built with Rust 🦀, which offers blazing speed in addition to efficient CPU utilization powered by async I/O
  • Ease of use: user interface purely in Python 🐍, by which users can serve their models in an ML framework-agnostic manner using the same code as they do for offline testing
  • Dynamic batching: aggregate requests from different users for batched inference and distribute results back
  • Pipelined stages: spawn multiple processes for pipelined stages to handle CPU/GPU/IO mixed workloads
  • Cloud friendly: designed to run in the cloud, with the model warmup, graceful shutdown, and Prometheus monitoring metrics, easily managed by Kubernetes or any container orchestration systems
  • Do one thing well: focus on the online serving part, so users can concentrate on model optimization and business logic

Installation

Mosec requires Python 3.7 or above. Install the latest PyPI package for Linux x86_64 or macOS x86_64 with:

pip install -U mosec

To build from the source code, install Rust and run the following command:

make package

You will get a mosec wheel file in the dist folder.

Usage

We demonstrate how Mosec can help you easily host a pre-trained stable diffusion model as a service. You need to install diffusers and transformers as prerequisites:

pip install --upgrade diffusers[torch] transformers

Write the server

Server code with explanations:

First, we import the libraries and set up a basic logger so we can better observe what happens.

from io import BytesIO
from typing import List

import torch  # type: ignore
from diffusers import StableDiffusionPipeline  # type: ignore

from mosec import Server, Worker, get_logger
from mosec.mixin import MsgpackMixin

logger = get_logger()

Then, we build an API that lets clients submit a text prompt and receive an image, based on the stable-diffusion-v1-5 model, in just 3 steps.

  1. Define your service as a class which inherits mosec.Worker. Here we also inherit MsgpackMixin to employ the msgpack serialization format(a).

  2. Inside the __init__ method, initialize your model and put it onto the corresponding device. Optionally you can assign self.example with some data to warm up(b) the model. Note that the data should be compatible with your handler's input format, which we detail next.

  3. Override the forward method to write your service handler(c), with the signature forward(self, data: Any | List[Any]) -> Any | List[Any]. Whether it receives and returns a single item or a list depends on whether dynamic batching(d) is configured.

class StableDiffusion(MsgpackMixin, Worker):
    def __init__(self):
        self.pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
        )
        device = "cuda" if torch.cuda.is_available() else "cpu"
        self.pipe = self.pipe.to(device)
        self.example = ["useless example prompt"] * 4  # warmup (batch_size=4)

    def forward(self, data: List[str]) -> List[memoryview]:
        logger.debug("generate images for %s", data)
        res = self.pipe(data)
        logger.debug("NSFW: %s", res[1])
        images = []
        for img in res[0]:
            dummy_file = BytesIO()
            img.save(dummy_file, format="JPEG")
            images.append(dummy_file.getbuffer())
        return images

[!NOTE]

(a) In this example we return an image in the binary format, which JSON does not support (unless encoded with base64 that makes the payload larger). Hence, msgpack suits our need better. If we do not inherit MsgpackMixin, JSON will be used by default. In other words, the protocol of the service request/response can be either msgpack, JSON, or any other format (check our mixins).
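The size penalty of base64-in-JSON mentioned in (a) is easy to quantify with the standard library alone (a rough sketch; the roughly 33% overhead is inherent to base64 encoding):

```python
import base64
import json

payload = bytes(range(256)) * 400  # mock 100 KiB binary image payload

# JSON cannot carry raw bytes, so the usual workaround is base64 text.
encoded = base64.b64encode(payload).decode("ascii")
json_size = len(json.dumps({"image": encoded}).encode("utf-8"))

print(f"raw: {len(payload)} B, JSON+base64: {json_size} B "
      f"({json_size / len(payload):.2f}x)")
```

A binary-friendly format like msgpack ships the bytes as-is, avoiding this inflation.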

(b) Warm-up usually helps to allocate GPU memory in advance. If a warm-up example is specified, the service will only become ready after the example has been forwarded through the handler. If no example is given, the first request's latency is expected to be longer. The example should be a single item or a list, depending on what forward expects to receive. Moreover, if you want to warm up with multiple different examples, you may set multi_examples (demo here).
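The motivation for warm-up in (b) can be mimicked with a toy stand-in (illustration only; the class and timings below are made up and say nothing about mosec's internals):

```python
import time

class LazyModel:
    """Toy model whose first use pays a one-time setup cost."""

    def __init__(self):
        self._weights = None

    def _load(self):
        time.sleep(0.05)  # pretend to load weights / allocate GPU memory
        self._weights = [0.0] * 1024

    def forward(self, batch):
        if self._weights is None:  # lazy init happens on the first call
            self._load()
        return [len(prompt) for prompt in batch]

model = LazyModel()
model.forward(["useless example prompt"] * 4)  # warm-up before serving

start = time.perf_counter()
model.forward(["real request"])  # setup cost already paid
print(f"first real request: {(time.perf_counter() - start) * 1000:.2f} ms")
```

Without the warm-up call, the first client would absorb the 50 ms setup cost instead.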

(c) This example shows a single-stage service, where the StableDiffusion worker directly takes in client's prompt request and responds the image. Thus the forward can be considered as a complete service handler. However, we can also design a multi-stage service with workers doing different jobs (e.g., downloading images, model inference, post-processing) in a pipeline. In this case, the whole pipeline is considered as the service handler, with the first worker taking in the request and the last worker sending out the response. The data flow between workers is done by inter-process communication.
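The multi-stage pipeline described in (c) can be sketched with plain threads and queues (a toy illustration only; mosec runs stages as separate processes with its own inter-process communication, not Python queues):

```python
import queue
import threading

def preprocess(inbox, outbox):
    # CPU stage: normalize the prompt
    while (item := inbox.get()) is not None:
        outbox.put(item.strip().lower())
    outbox.put(None)  # forward the shutdown sentinel

def inference(inbox, results):
    # "GPU" stage: pretend to run the model
    while (item := inbox.get()) is not None:
        results.put(f"image-for:{item}")

q1, q2, out = queue.Queue(), queue.Queue(), queue.Queue()
stages = [threading.Thread(target=preprocess, args=(q1, q2)),
          threading.Thread(target=inference, args=(q2, out))]
for t in stages:
    t.start()

q1.put("  A Cute Cat ")
q1.put(None)  # sentinel shuts the pipeline down stage by stage
for t in stages:
    t.join()
result = out.get()
print(result)  # → image-for:a cute cat
```

The first worker takes in the request, the last one emits the response, and each stage can be scaled independently.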

(d) Since dynamic batching is enabled in this example, the forward method will receive a list of strings, e.g., ['a cute cat playing with a red ball', 'a man sitting in front of a computer', ...], aggregated from different clients for batch inference, improving the system throughput.
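The flush rule behind (d) — send a batch as soon as it is full, or when the wait budget runs out — can be sketched as follows (a simplified, single-threaded illustration, not mosec's actual Rust implementation):

```python
import time
from collections import deque

def collect_batch(pending, max_batch_size=4, max_wait_time=0.010):
    """Drain requests until the batch is full or max_wait_time elapses."""
    batch = []
    deadline = time.monotonic() + max_wait_time
    while len(batch) < max_batch_size and time.monotonic() < deadline:
        if pending:
            batch.append(pending.popleft())
        # a real coordinator would block on the queue instead of spinning
    return batch

pending = deque([f"prompt-{i}" for i in range(1, 6)])
print(collect_batch(pending))  # full batch of 4, returned immediately
print(collect_batch(pending))  # leftover batch of 1, after the 10 ms timeout
```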

Finally, we append the worker to the server to construct a single-stage workflow (multiple stages can be pipelined to further boost the throughput; see this example). We specify the number of processes to run in parallel (num=1) and the maximum batch size (max_batch_size=4), i.e., the maximum number of requests dynamic batching will accumulate before the timeout. The timeout itself is set by max_wait_time=10 in milliseconds: the longest time Mosec waits before sending the batch to the Worker.

if __name__ == "__main__":
    server = Server()
    # 1) `num` specifies the number of processes that will be spawned to run in parallel.
    # 2) By configuring the `max_batch_size` with the value > 1, the input data in your
    # `forward` function will be a list (batch); otherwise, it's a single item.
    server.append_worker(StableDiffusion, num=1, max_batch_size=4, max_wait_time=10)
    server.run()

Run the server

How to run and query the server:

The above snippets are merged in our example file. You can run it directly at the project root. We first have a look at the command line arguments (explanations here):

python examples/stable_diffusion/server.py --help

Then let's start the server with debug logs:

python examples/stable_diffusion/server.py --log-level debug --timeout 30000

Open http://127.0.0.1:8000/openapi/swagger/ in your browser to get the OpenAPI doc.

And in another terminal, test it:

python examples/stable_diffusion/client.py --prompt "a cute cat playing with a red ball" --output cat.jpg --port 8000

You will get an image named "cat.jpg" in the current directory.

You can check the metrics:

curl http://127.0.0.1:8000/metrics

That's it! You have just hosted your stable-diffusion model as a service! 😉

Examples

More ready-to-use examples can be found in the Example section.

Configuration

  • Dynamic batching
    • max_batch_size and max_wait_time (millisecond) are configured when you call append_worker.
    • Make sure inference with a max_batch_size batch won't cause GPU out-of-memory.
    • Normally, max_wait_time should be less than the batch inference time.
    • If enabled, it will collect a batch either when the number of accumulated requests reaches max_batch_size or when max_wait_time has elapsed. The service will benefit from this feature when the traffic is high.
  • Check the arguments doc for other configurations.

Deployment

  • If you're looking for a GPU base image with mosec installed, you can check the official image mosecorg/mosec. For more complex use cases, check out envd.
  • This service doesn't need Gunicorn or NGINX, but you can certainly use the ingress controller when necessary.
  • This service should be the PID 1 process in the container since it controls multiple processes. If you need to run multiple processes in one container, you will need a supervisor. You may choose Supervisor or Horust.
  • Remember to collect the metrics.
    • mosec_service_batch_size_bucket shows the batch size distribution.
    • mosec_service_batch_duration_second_bucket shows the duration of dynamic batching for each connection in each stage (starts from receiving the first task).
    • mosec_service_process_duration_second_bucket shows the duration of processing for each connection in each stage (including the IPC time but excluding the mosec_service_batch_duration_second_bucket).
    • mosec_service_remaining_task shows the number of currently processing tasks.
    • mosec_service_throughput shows the service throughput.
  • Stop the service with SIGINT (CTRL+C) or SIGTERM (kill {PID}) since it has the graceful shutdown logic.
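The graceful-shutdown contract in the last bullet boils down to trapping the signal and draining in-flight work before exiting; a minimal POSIX sketch of the pattern (illustrative only — mosec implements this internally):

```python
import os
import signal
import time

shutting_down = False

def handle_term(signum, frame):
    # flip a flag so in-flight requests can finish before the process exits
    global shutting_down
    shutting_down = True

signal.signal(signal.SIGTERM, handle_term)
signal.signal(signal.SIGINT, handle_term)  # CTRL+C takes the same path

os.kill(os.getpid(), signal.SIGTERM)  # simulate `kill {PID}`
time.sleep(0.01)  # let the handler run
print("draining in-flight requests..." if shutting_down else "running")
```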

Performance tuning

  • Find out the best max_batch_size and max_wait_time for your inference service. The metrics will show the histograms of the real batch size and batch duration. Those are the key information to adjust these two parameters.
  • Try to split the whole inference process into separate CPU and GPU stages (ref DistilBERT). Different stages will be run in a data pipeline, which will keep the GPU busy.
  • You can also adjust the number of workers in each stage. For example, if your pipeline consists of a CPU stage for preprocessing and a GPU stage for model inference, increasing the number of CPU-stage workers can help to produce more data to be batched for model inference at the GPU stage; increasing the GPU-stage workers can fully utilize the GPU memory and computation power. Both ways may contribute to higher GPU utilization, which consequently results in higher service throughput.
  • For multi-stage services, note that the data passing through different stages will be serialized/deserialized by the serialize_ipc/deserialize_ipc methods, so extremely large data might slow down the whole pipeline. The serialized data is passed to the next stage through Rust by default; you can enable shared memory to potentially reduce the latency (ref RedisShmIPCMixin).
  • You should choose appropriate serialize/deserialize methods, which are used to decode the user request and encode the response. By default, both are using JSON. However, images and embeddings are not well supported by JSON. You can choose msgpack which is faster and binary compatible (ref Stable Diffusion).
  • Configure the thread count for OpenBLAS or MKL. These libraries might not pick the most suitable number of threads for the current Python process. You can configure it for each worker using the env option (ref custom GPU allocation).
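For the last bullet, the thread-count variables must be present in a worker's environment before the BLAS library is loaded, so they are best passed per worker rather than set after import. A small helper for building such an env dict (the helper name is hypothetical; the resulting dicts would be passed through the per-worker env option mentioned above):

```python
def blas_env(num_threads: int) -> dict:
    """Env vars that cap BLAS thread pools; must be set before BLAS loads."""
    return {
        "OMP_NUM_THREADS": str(num_threads),
        "OPENBLAS_NUM_THREADS": str(num_threads),
        "MKL_NUM_THREADS": str(num_threads),
    }

# hypothetical: two preprocessing workers, 4 BLAS threads each
worker_envs = [blas_env(4) for _ in range(2)]
print(worker_envs[0]["OMP_NUM_THREADS"])  # → 4
```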

Adopters

Here are some of the companies and individual users that are using Mosec:

Citation

If you find this software useful for your research, please consider citing

@software{yang2021mosec,
  title = {{MOSEC: Model Serving made Efficient in the Cloud}},
  author = {Yang, Keming and Liu, Zichen and Cheng, Philip},
  url = {https://github.com/mosecorg/mosec},
  year = {2021}
}

Contributing

We welcome any kind of contribution. Please give us feedback by raising issues or discussing on Discord. You can also contribute code directly by opening a pull request!

To start developing, you can use envd to create an isolated and clean Python & Rust environment. Check the envd-docs or build.envd for more information.

