
Model Serving made Efficient in the Cloud

Reason this release was yanked: deadlock in nested function call

Project description

MOSEC


Model Serving made Efficient in the Cloud.

Introduction

Mosec is a high-performance and flexible model serving framework for building ML model-enabled backends and microservices. It bridges the gap between the machine learning models you have just trained and an efficient online service API.

  • Highly performant: web layer and task coordination built with Rust 🦀, which offers blazing speed in addition to efficient CPU utilization powered by async I/O
  • Ease of use: user interface purely in Python 🐍, by which users can serve their models in an ML framework-agnostic manner using the same code as they do for offline testing
  • Dynamic batching: aggregate requests from different users for batched inference and distribute results back
  • Pipelined stages: spawn multiple processes for pipelined stages to handle CPU/GPU/IO mixed workloads
  • Cloud friendly: designed to run in the cloud, with model warmup, graceful shutdown, and Prometheus monitoring metrics, easily managed by Kubernetes or any container orchestration system
  • Do one thing well: focus on the online serving part so that users can concentrate on model optimization and business logic

Installation

Mosec requires Python 3.7 or above. Install the latest PyPI package with:

pip install -U mosec

Usage

We demonstrate how Mosec can help you easily host a pre-trained stable diffusion model as a service. You need to install diffusers and transformers as prerequisites:

pip install --upgrade diffusers[torch] transformers

Write the server

First, we import the libraries and set up a basic logger so we can better observe what happens.

from io import BytesIO
from typing import List

import torch  # type: ignore
from diffusers import StableDiffusionPipeline  # type: ignore

from mosec import Server, Worker, get_logger
from mosec.mixin import MsgpackMixin

logger = get_logger()

Then, we build an API that lets clients send a text prompt and obtain an image based on the stable-diffusion-v1-5 model, in just 3 steps.

  1. Define your service as a class that inherits mosec.Worker. Here we also inherit MsgpackMixin to employ the msgpack serialization format (a).

  2. Inside the __init__ method, initialize your model and put it onto the corresponding device. Optionally, you can assign self.example with some data to warm up (b) the model. Note that the data should be compatible with your handler's input format, which we detail next.

  3. Override the forward method to write your service handler (c), with the signature forward(self, data: Any | List[Any]) -> Any | List[Any]. Whether it receives/returns a single item or a list (batch) depends on whether dynamic batching (d) is configured.

class StableDiffusion(MsgpackMixin, Worker):
    def __init__(self):
        self.pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
        )
        device = "cuda" if torch.cuda.is_available() else "cpu"
        self.pipe = self.pipe.to(device)
        self.example = ["useless example prompt"] * 4  # warmup (bs=4)

    def forward(self, data: List[str]) -> List[memoryview]:
        logger.debug("generate images for %s", data)
        res = self.pipe(data)
        logger.debug("NSFW: %s", res[1])
        images = []
        for img in res[0]:
            dummy_file = BytesIO()
            img.save(dummy_file, format="JPEG")
            images.append(dummy_file.getbuffer())
        return images

Note

(a) In this example we return an image in binary format, which JSON does not support (unless it is base64-encoded, which makes the payload larger). Hence, msgpack suits our needs better. If we do not inherit MsgpackMixin, JSON is used by default. In other words, the service request/response protocol can be either msgpack or JSON.

(b) Warm-up usually helps allocate GPU memory in advance. If a warm-up example is specified, the service will be ready only after the example has been forwarded through the handler. If no example is given, the first request's latency is expected to be longer. The example should be set as a single item or a list, depending on what forward expects to receive. Moreover, if you want to warm up with multiple different examples, you may set multi_examples (demo here).

(c) This example shows a single-stage service, where the StableDiffusion worker directly takes in the client's prompt request and responds with the image. Thus forward can be considered a complete service handler. However, we can also design a multi-stage service with workers doing different jobs (e.g., downloading images, model inference, post-processing) in a pipeline. In that case, the whole pipeline is considered the service handler, with the first worker taking in the request and the last worker sending out the response. The data flow between workers is handled by inter-process communication (a sketch follows these notes).

(d) Since dynamic batching is enabled in this example, the forward method will ideally receive a list of strings, e.g., ['a cute cat playing with a red ball', 'a man sitting in front of a computer', ...], aggregated from different clients for batched inference, improving the system throughput.
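As a reference for note (c), here is a minimal sketch of how two stages could be chained into a pipeline. It is not part of the stable-diffusion example: the worker names, request schema, and logic below are hypothetical placeholders; only the Server/Worker API usage mirrors the code above.

from typing import Any, List

from mosec import Server, Worker


class Preprocess(Worker):
    """Stage 1: turn the raw request into a model-ready prompt."""

    def forward(self, data: Any) -> str:
        # `data` is the deserialized request, e.g. {"prompt": "..."} (hypothetical schema)
        return data["prompt"].strip()


class Inference(Worker):
    """Stage 2: batched inference on the preprocessed prompts."""

    def forward(self, data: List[str]) -> List[str]:
        # placeholder for the real model call
        return [f"result for: {prompt}" for prompt in data]


if __name__ == "__main__":
    server = Server()
    # stage 1: CPU-bound, so spawn more processes
    server.append_worker(Preprocess, num=2)
    # stage 2: batched; outputs of stage 1 arrive here via inter-process communication
    server.append_worker(Inference, num=1, max_batch_size=8)
    server.run()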

Finally, we append the worker to the server to construct a single-stage workflow (multiple stages can be pipelined to further boost the throughput; see this example), and specify the number of processes we want it to run in parallel (num=1) as well as the maximum batch size (max_batch_size=4, the maximum number of requests dynamic batching will accumulate before the timeout; the timeout is defined with the --wait flag in milliseconds, i.e., the longest time Mosec waits before sending a batch to the Worker).

if __name__ == "__main__":
    server = Server()
    # 1) `num` specifies the number of processes that will be spawned to run in parallel.
    # 2) By configuring `max_batch_size` with a value > 1, the input data in your
    #    `forward` function will be a list (batch); otherwise, it's a single item.
    server.append_worker(StableDiffusion, num=1, max_batch_size=4, max_wait_time=10)
    server.run()

Run the server

The above snippets are merged in our example file. You can run it directly at the project root. Let's first have a look at the command line arguments (explanations here):

python examples/stable_diffusion/server.py --help

Then let's start the server with debug logs:

python examples/stable_diffusion/server.py --debug

And in another terminal, test it:

python examples/stable_diffusion/client.py --prompt "a cute cat playing with a red ball" --output cat.jpg --port 8000

You will get an image named "cat.jpg" in the current directory.
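The example client script handles the request encoding for you. If you would rather write your own client, a minimal sketch could look like the following; it assumes the server is listening locally on port 8000, exposes the default /inference endpoint, and speaks msgpack (matching the MsgpackMixin used above):

import msgpack   # pip install msgpack
import requests  # pip install requests

prompt = "a cute cat playing with a red ball"
resp = requests.post(
    "http://127.0.0.1:8000/inference",
    data=msgpack.packb(prompt),
)
resp.raise_for_status()

# the worker returns raw JPEG bytes, so write them straight to disk
with open("cat.jpg", "wb") as f:
    f.write(msgpack.unpackb(resp.content))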

You can check the metrics:

curl http://127.0.0.1:8000/metrics

That's it! You have just hosted your stable-diffusion model as a service! 😉

Examples

More ready-to-use examples can be found in the Example section.

Configuration

  • Dynamic batching
    • max_batch_size is configured when you append_worker (make sure inference with the maximum batch size won't cause GPU out-of-memory).
    • --wait (default=10 ms) is configured through the CLI arguments (it should usually be no longer than the duration of one batch inference; see the example command after this list).
    • If enabled, the service collects a batch either when it reaches max_batch_size or when the wait time elapses, whichever comes first.
  • Check the arguments doc.
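For example, the wait time is passed on the command line when the server starts (the script name and the value below are only illustrative; --wait is in milliseconds):

python your_server.py --wait 20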

Deployment

  • Mosec may require some shared memory, so remember to set the --shm-size flag if you are using Docker (see the example after this list).
  • This service doesn't require Gunicorn or NGINX, but you can use an ingress controller if necessary. Note that Mosec should be the PID 1 process in the container since it controls multiple processes.
  • Remember to collect the metrics.
    • mosec_service_batch_size_bucket shows the batch size distribution.
    • mosec_service_batch_duration_second_bucket shows the duration of dynamic batching for each connection in each stage (starts from receiving the first task).
    • mosec_service_process_duration_second_bucket shows the duration of processing for each connection in each stage (including the IPC time but excluding the mosec_service_batch_duration_second_bucket).
    • mosec_service_remaining_task shows the number of currently processing tasks.
    • mosec_service_throughput shows the service throughput.
  • Stop the service with SIGINT or SIGTERM, since it implements graceful shutdown logic.
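For instance, a containerized deployment might reserve shared memory like this (the image name and size are placeholders to adjust for your workload); docker stop then sends SIGTERM, which triggers the graceful shutdown:

docker run --rm --shm-size=1g -p 8000:8000 your-mosec-image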

Adopters

Here are some of the companies and individual users that are using Mosec:

  • MOSS: An open-source conversational language model similar to ChatGPT.
  • TensorChord: A platform for building and deploying AI models.

Citation

If you find this software useful for your research, please consider citing:

@software{yang2021mosec,
  title = {{MOSEC: Model Serving made Efficient in the Cloud}},
  author = {Yang, Keming and Liu, Zichen and Cheng, Philip},
  url = {https://github.com/mosecorg/mosec},
  year = {2021}
}

Contributing

We welcome any kind of contribution. Please give us feedback by raising issues or discussing on Discord. You can also contribute code directly by opening a pull request!

To start developing, you can use envd to create an isolated and clean Python & Rust environment. Check the envd-docs or build.envd for more information.
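If you have not used envd before, the setup is roughly as follows (it assumes a working Docker installation; see the envd docs for details):

pip install envd
envd bootstrap
envd up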

Qualitative Comparison*

             I/O Format(1)   Framework(2)   Backend
TF Serving   Limited(a)      Heavily TF     C++
Triton       Limited         Multiple       C++
MMS          Limited         Heavily MX     Java
BentoML      Limited(b)      Multiple       Python
Streamer     Customizable    Agnostic       Python
Flask(3)     Customizable    Agnostic       Python
Mosec        Customizable    Agnostic       Rust

*As accessed on 08 Oct 2021. By no means does this comparison suggest that other frameworks are inferior; it is only meant to illustrate the trade-offs. The information is not guaranteed to be absolutely accurate. Please let us know if you find anything that may be incorrect.

(1): Data format of the service's request and response. "Limited" in the sense that the framework has pre-defined requirements on the format.

(2): Supported machine learning frameworks. "Heavily" means the serving framework is designed towards a specific ML framework. Thus it is hard, if not impossible, to adapt it to others. "Multiple" means the serving framework provides adaptation to several existing ML frameworks. "Agnostic" means the serving framework does not necessarily care about the ML framework. Hence it supports all ML frameworks (in Python).

(3): Flask is a representative of general-purpose web frameworks to host ML models.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mosec-0.6.3.tar.gz (63.2 kB view details)

Uploaded Source

Built Distributions

mosec-0.6.3-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.4 MB view details)

Uploaded PyPy manylinux: glibc 2.17+ x86-64

mosec-0.6.3-pp39-pypy39_pp73-macosx_10_9_x86_64.whl (1.4 MB view details)

Uploaded PyPy macOS 10.9+ x86-64

mosec-0.6.3-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.4 MB view details)

Uploaded PyPy manylinux: glibc 2.17+ x86-64

mosec-0.6.3-pp38-pypy38_pp73-macosx_10_9_x86_64.whl (1.4 MB view details)

Uploaded PyPy macOS 10.9+ x86-64

mosec-0.6.3-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.4 MB view details)

Uploaded PyPy manylinux: glibc 2.17+ x86-64

mosec-0.6.3-pp37-pypy37_pp73-macosx_10_9_x86_64.whl (1.4 MB view details)

Uploaded PyPy macOS 10.9+ x86-64

mosec-0.6.3-cp311-cp311-musllinux_1_1_x86_64.whl (2.5 MB view details)

Uploaded CPython 3.11 musllinux: musl 1.1+ x86-64

mosec-0.6.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.4 MB view details)

Uploaded CPython 3.11 manylinux: glibc 2.17+ x86-64

mosec-0.6.3-cp311-cp311-macosx_10_9_x86_64.whl (1.4 MB view details)

Uploaded CPython 3.11 macOS 10.9+ x86-64

mosec-0.6.3-cp310-cp310-musllinux_1_1_x86_64.whl (2.5 MB view details)

Uploaded CPython 3.10 musllinux: musl 1.1+ x86-64

mosec-0.6.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.4 MB view details)

Uploaded CPython 3.10 manylinux: glibc 2.17+ x86-64

mosec-0.6.3-cp310-cp310-macosx_10_9_x86_64.whl (1.4 MB view details)

Uploaded CPython 3.10 macOS 10.9+ x86-64

mosec-0.6.3-cp39-cp39-musllinux_1_1_x86_64.whl (2.5 MB view details)

Uploaded CPython 3.9 musllinux: musl 1.1+ x86-64

mosec-0.6.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.4 MB view details)

Uploaded CPython 3.9 manylinux: glibc 2.17+ x86-64

mosec-0.6.3-cp39-cp39-macosx_10_9_x86_64.whl (1.4 MB view details)

Uploaded CPython 3.9 macOS 10.9+ x86-64

mosec-0.6.3-cp38-cp38-musllinux_1_1_x86_64.whl (2.5 MB view details)

Uploaded CPython 3.8 musllinux: musl 1.1+ x86-64

mosec-0.6.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.4 MB view details)

Uploaded CPython 3.8 manylinux: glibc 2.17+ x86-64

mosec-0.6.3-cp38-cp38-macosx_10_9_x86_64.whl (1.4 MB view details)

Uploaded CPython 3.8 macOS 10.9+ x86-64

mosec-0.6.3-cp37-cp37m-musllinux_1_1_x86_64.whl (2.5 MB view details)

Uploaded CPython 3.7m musllinux: musl 1.1+ x86-64

mosec-0.6.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.4 MB view details)

Uploaded CPython 3.7m manylinux: glibc 2.17+ x86-64

mosec-0.6.3-cp37-cp37m-macosx_10_9_x86_64.whl (1.4 MB view details)

Uploaded CPython 3.7m macOS 10.9+ x86-64

File details

Details for the file mosec-0.6.3.tar.gz.

File metadata

  • Download URL: mosec-0.6.3.tar.gz
  • Upload date:
  • Size: 63.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for mosec-0.6.3.tar.gz
Algorithm Hash digest
SHA256 b5d177e3625ff2d4206614735dfb63740c5aee4701f2a674dec8bcb79203fd15
MD5 89aeede56c4e3d4dc62dd41a087285df
BLAKE2b-256 26e370eb9e2e39a5fa6dc57d8c054f9dba8f9fd29c52fc132ebaaeba2975d08d

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 59b8f5ccbcf4c86b1590f30d39ee144d17cbbb703440f817ab92f4af569a53e3
MD5 e501a933bb16c00c5a0ee739ecd0ae02
BLAKE2b-256 dcc098bdb1e0befe0420ab8aad955e65dcf3b5fa8d8ecd77e2a33294e67d9462

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-pp39-pypy39_pp73-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-pp39-pypy39_pp73-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 5a1cdc9b19a6343ea954cf580c7a36888dcd737699ca3e4433e3132b30ea9339
MD5 152410d42a7936fd8fa80f7eee66ee04
BLAKE2b-256 0bd37360e5d91ceae24fbecd41d80d256854e88bdd919b3b1e654f9cd7412718

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 72f74c0d5546746f7d20649f961a1c8c4d395209f849c3b14b3af64dac8bda52
MD5 0ccb3e4074e5034cedac425a61066b2a
BLAKE2b-256 2ca6ef9f58d7aa2a694bac4cf61606d51026ba77c8e7ca9f3635d9176b573d09

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-pp38-pypy38_pp73-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-pp38-pypy38_pp73-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 24e20509311d935e32f1b12f5221a4b4957b5e4bd388f9261e6fda5ee2ff0518
MD5 5fde1c9911629a92d4c0da97b24fa785
BLAKE2b-256 ab4ad64a7968c2b5831f1e947275a14f5003ba4f22058dcc46d51d5aa396a4da

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 635df62a55abb73cebad273e4452422e97383f4497841ca431ba3767c44b1688
MD5 b252dbcdcd0abff7e38673e64e426e36
BLAKE2b-256 3c2a48d915e2cfc75d5bd3e4b4b30aa584aebbe1b32962feb6bb62fb6cdce999

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-pp37-pypy37_pp73-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-pp37-pypy37_pp73-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 28d6e6c6d641cc3fb763c7d1fc3c4683c4f8e854eeb8d94f9e8d6c93a771a983
MD5 e194d491e51d86fe528c7c41c0e940fb
BLAKE2b-256 3cb4658ac6da8b8fc69feebc78aaaf6c0cfde309691c024853f733c78b02bdd1

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp311-cp311-musllinux_1_1_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp311-cp311-musllinux_1_1_x86_64.whl
Algorithm Hash digest
SHA256 fecce09fbc3e4b674b6ec93443cc4ba28a36a70c23aafd2517d01001ad9bfb56
MD5 7f957a32de84cf6b8c3ecffce72c6666
BLAKE2b-256 52b428641e92bd698da2a6c2aea27c16cff52757359c7a8a9f3bcb76994f4b17

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 a17b4d2b9d528a99859f6d0181355f3f0948360f074eab9a038248b15d97a0eb
MD5 8f3b60ea086f1a5ac07e0fe26930de38
BLAKE2b-256 60bd2d94f1c7c70f312b2053d313691a8f890ba0f6046fc3361299b54f54ae40

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp311-cp311-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp311-cp311-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 c7c40a47cd13b7bb015dca61e74738bde1e4176ce1594860f2481617b7cc39c1
MD5 1047b2db167171f70b687e5bf9222be6
BLAKE2b-256 73c0e10a25caedeb2413c8630df40eaea73f6e6cd41c10d47160663b0db148e1

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp310-cp310-musllinux_1_1_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp310-cp310-musllinux_1_1_x86_64.whl
Algorithm Hash digest
SHA256 268fed10e68702e4216cd9033d089ad6831eee27c7666ef047a6db66f59615e5
MD5 121a19e6bcc5255a33f16fd6593460f8
BLAKE2b-256 86e4da1b8664266ab6948c6649b8156178036c6350a68de74876b371b3c5ee75

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 f9a6632e478fb7637d861a999d99e883ee6c8549530fc6c55ecc2a112bd6caa8
MD5 f87db1f80cb8cbcecf66b1fe8f5509af
BLAKE2b-256 cea02c0dbd6a3a826b9a6acdb8bc8955e98794d304b84dff70485dcb1c44a6ed

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp310-cp310-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp310-cp310-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 8fa21b647c8bd5e618a571c2ff76505c57eef269c62dd8b9e81eea4501e0c999
MD5 c4e1286dd1875513f373106905188851
BLAKE2b-256 06760bad84c4a0d647677f9299c8d60813a9d4ba6816bea5f1dbd998c3382810

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp39-cp39-musllinux_1_1_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp39-cp39-musllinux_1_1_x86_64.whl
Algorithm Hash digest
SHA256 d83fcf5c830ac0dc690c26beb478edc606ede995c6b7172d2aafe112b12a8ee9
MD5 241191b2b33891ed191e4a145a4aa8e2
BLAKE2b-256 12b959b1107b582a362409a02fa3ca7ba8de44952dfcdce7a82abbf5053e8c63

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 d3235439e1d55761fd4e0bcf53f1a072e983d2dfcbbb6520aeb269e2ac2f488b
MD5 d0c58880a05b978a3029d9edbaca03ca
BLAKE2b-256 2339bd7c09c857d001f3cd049d9ac3472f6e1bb4eb8a43dbe0bdf0ab92acc755

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp39-cp39-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp39-cp39-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 6e2951a9b430084738d448886d6ab013b782e32a26c1db1fa10533cf826acc07
MD5 1ff608452edefcf9567e87c142b4b406
BLAKE2b-256 379e103ec46ac5bbc3c73cf7a29a3dceb422d9b11ca5b95bd98b233bd61d9536

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp38-cp38-musllinux_1_1_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp38-cp38-musllinux_1_1_x86_64.whl
Algorithm Hash digest
SHA256 d82daceab3aec43b800a0a47b0ce96076c007d5555069c01ba0c71bd1f7a3933
MD5 a5f59c5e1cb3ef3351bb47ef7b97eeec
BLAKE2b-256 a26382529f69770ea4fce57582363ecad390bcc8ac92c14ad92568d3f705730a

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 fe96cb50c884aa494d774d46049f61f334eaea3c9e91508e7778487386c4f3c8
MD5 226c4e9ff267fa55441c578509333ceb
BLAKE2b-256 a0f7521cbdff2a7f484a2f2ba8dab9d31b4adea1956fef86b77fc7fc9b85b04e

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp38-cp38-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp38-cp38-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 d9767b9c7b107d4e9e211a528577047539dfd8216d1b991a3327b966138658de
MD5 24669f9d298f07f5e8390d3cb4e2ab1e
BLAKE2b-256 0e29fd33b5c2fb0f29ea336859e57e0914dacd127c5f4be18a03100cc69761b7

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp37-cp37m-musllinux_1_1_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp37-cp37m-musllinux_1_1_x86_64.whl
Algorithm Hash digest
SHA256 c1d9c4db870fa7fb588a71954cbb0a2fdd8a7064434c3480d7f326f968186fd6
MD5 79e45ce7a23d3da37a71c553da3983a4
BLAKE2b-256 7d45914733d9f2b20bcd2b3fc1474efb7f7db45ba1bea3c19bb4c46700777bab

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 d6e9f559f0f402c5f78b8546b1159b8fa45bd0159bd15882328466a66b814cfb
MD5 40d29a88bbd3ec651e752eee37754270
BLAKE2b-256 30c193bbf275e9532ed0d7b2e41fd70f4ae928f61fd8792c0cb31a2de06d3411

See more details on using hashes here.

File details

Details for the file mosec-0.6.3-cp37-cp37m-macosx_10_9_x86_64.whl.

File metadata

File hashes

Hashes for mosec-0.6.3-cp37-cp37m-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 8430f0b2dd1026038dcf86fb55bfd7dcd6eef9ed2d416e2431ff2d411789cde9
MD5 ea6c6af7f8c1e3fcb95db239892d3831
BLAKE2b-256 94958962f8eb01d994e4d529cf397afa18dada820a4736a7e2d9cbacb82eb671

See more details on using hashes here.
