
MOSEC


Model Serving made Efficient in the Cloud.

Introduction

Mosec is a high-performance and flexible model serving framework for building ML model-enabled backends and microservices. It bridges the gap between the machine learning models you have just trained and an efficient online service API.

  • Highly performant: web layer and task coordination built with Rust 🦀, which offers blazing speed in addition to efficient CPU utilization powered by async I/O
  • Ease of use: user interface purely in Python 🐍, by which users can serve their models in an ML framework-agnostic manner using the same code as they do for offline testing
  • Dynamic batching: aggregate requests from different users for batched inference and distribute results back
  • Pipelined stages: spawn multiple processes for pipelined stages to handle CPU/GPU/IO mixed workloads
  • Cloud friendly: designed to run in the cloud, with model warmup, graceful shutdown, and Prometheus monitoring metrics, easily managed by Kubernetes or any container orchestration system
  • Do one thing well: focus on online serving, so users can concentrate on model optimization and business logic

Installation

Mosec requires Python 3.7 or above. Install the latest PyPI package for Linux x86_64 or macOS x86_64/ARM64 with:

pip install -U mosec
# or install with conda
conda install conda-forge::mosec

To build from the source code, install Rust and run the following command:

make package

You will get a mosec wheel file in the dist folder.

Usage

We demonstrate how Mosec can help you easily host a pre-trained stable diffusion model as a service. You need to install diffusers and transformers as prerequisites:

pip install --upgrade diffusers[torch] transformers

Write the server


Firstly, we import the libraries and set up a basic logger to better observe what happens.

from io import BytesIO
from typing import List

import torch  # type: ignore
from diffusers import StableDiffusionPipeline  # type: ignore

from mosec import Server, Worker, get_logger
from mosec.mixin import MsgpackMixin

logger = get_logger()

Then, we build an API that lets clients send a text prompt and receive an image, based on the stable-diffusion-v1-5 model, in just three steps.

  1. Define your service as a class that inherits mosec.Worker. Here we also inherit MsgpackMixin to employ the msgpack serialization format (a).

  2. Inside the __init__ method, initialize your model and put it onto the corresponding device. Optionally, you can assign self.example with some data to warm up (b) the model. Note that the data should be compatible with your handler's input format, which we detail next.

  3. Override the forward method to write your service handler (c), with the signature forward(self, data: Any | List[Any]) -> Any | List[Any]. Receiving/returning a single item or a list depends on whether dynamic batching (d) is configured.

class StableDiffusion(MsgpackMixin, Worker):
    def __init__(self):
        self.pipe = StableDiffusionPipeline.from_pretrained(
            "sd-legacy/stable-diffusion-v1-5", torch_dtype=torch.float16
        )
        self.pipe.enable_model_cpu_offload()
        self.example = ["useless example prompt"] * 4  # warmup (batch_size=4)

    def forward(self, data: List[str]) -> List[memoryview]:
        logger.debug("generate images for %s", data)
        res = self.pipe(data)
        logger.debug("NSFW: %s", res[1])
        images = []
        for img in res[0]:
            dummy_file = BytesIO()
            img.save(dummy_file, format="JPEG")
            images.append(dummy_file.getbuffer())
        return images

[!NOTE]

(a) In this example we return an image in binary format, which JSON does not support (unless base64-encoded, which makes the payload larger). Hence, msgpack suits our needs better. If we do not inherit MsgpackMixin, JSON will be used by default. In other words, the protocol of the service request/response can be msgpack, JSON, or any other format (check our mixins).

(b) Warm-up usually helps to allocate GPU memory in advance. If a warm-up example is specified, the service will be ready only after the example has been forwarded through the handler. If no example is given, the first request's latency is expected to be longer. The example should be set as a single item or a list, depending on what forward expects to receive. Moreover, if you want to warm up with multiple different examples, you may set multi_examples (demo here).

(c) This example shows a single-stage service, where the StableDiffusion worker directly takes in the client's prompt request and responds with the image. Thus the forward can be considered a complete service handler. However, we can also design a multi-stage service with workers doing different jobs (e.g., downloading images, model inference, post-processing) in a pipeline. In this case, the whole pipeline is considered the service handler, with the first worker taking in the request and the last worker sending out the response. The data flow between workers is handled by inter-process communication.

(d) Since dynamic batching is enabled in this example, the forward method is expected to receive a list of strings, e.g., ['a cute cat playing with a red ball', 'a man sitting in front of a computer', ...], aggregated from different clients for batch inference, improving the system throughput.
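The multi-stage design described in note (c) can be sketched without mosec at all. The toy pipeline below uses threads and in-memory queues purely for illustration (mosec itself wires separate processes together via IPC); the stage names and string transforms are made up:

```python
import queue
import threading

SENTINEL = None  # signals end-of-stream between stages

def preprocess_stage(inbox: queue.Queue, outbox: queue.Queue) -> None:
    # Stage 1: CPU-style work (normalize the prompt), then forward downstream.
    while (item := inbox.get()) is not SENTINEL:
        outbox.put(item.strip().lower())
    outbox.put(SENTINEL)

def inference_stage(inbox: queue.Queue, results: list) -> None:
    # Stage 2: pretend model inference on the preprocessed prompt.
    while (item := inbox.get()) is not SENTINEL:
        results.append(f"image-for({item})")

q1, q2, results = queue.Queue(), queue.Queue(), []
threads = [
    threading.Thread(target=preprocess_stage, args=(q1, q2)),
    threading.Thread(target=inference_stage, args=(q2, results)),
]
for t in threads:
    t.start()

for prompt in ["  A cute CAT ", "a red BALL  "]:
    q1.put(prompt)
q1.put(SENTINEL)
for t in threads:
    t.join()

print(results)  # → ['image-for(a cute cat)', 'image-for(a red ball)']
```

Each worker in a real mosec pipeline plays one such stage role, and append_worker is called once per stage.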

Finally, we append the worker to the server to construct a single-stage workflow (multiple stages can be pipelined to further boost the throughput; see this example). We specify the number of processes we want to run in parallel (num=1) and the maximum batch size (max_batch_size=4, the maximum number of requests dynamic batching will accumulate before the timeout; the timeout is defined by max_wait_time=10 in milliseconds, meaning the longest time Mosec waits before sending the batch to the Worker).

if __name__ == "__main__":
    server = Server()
    # 1) `num` specifies the number of processes that will be spawned to run in parallel.
    # 2) By configuring the `max_batch_size` with the value > 1, the input data in your
    # `forward` function will be a list (batch); otherwise, it's a single item.
    server.append_worker(StableDiffusion, num=1, max_batch_size=4, max_wait_time=10)
    server.run()

Run the server


The above snippets are merged into our example file. You can run it directly at the project root. First, have a look at the command-line arguments (explanations here):

python examples/stable_diffusion/server.py --help

Then let's start the server with debug logs:

python examples/stable_diffusion/server.py --log-level debug --timeout 30000

Open http://127.0.0.1:8000/openapi/swagger/ in your browser to get the OpenAPI doc.

And in another terminal, test it:

python examples/stable_diffusion/client.py --prompt "a cute cat playing with a red ball" --output cat.jpg --port 8000

You will get an image named "cat.jpg" in the current directory.

You can check the metrics:

curl http://127.0.0.1:8000/metrics

That's it! You have just hosted your stable-diffusion model as a service! 😉

Examples

More ready-to-use examples can be found in the Example section.

Configuration

  • Dynamic batching
    • max_batch_size and max_wait_time (millisecond) are configured when you call append_worker.
    • Make sure inference with the max_batch_size value won't cause GPU out-of-memory.
    • Normally, max_wait_time should be less than the batch inference time.
    • If enabled, it will collect a batch either when the number of accumulated requests reaches max_batch_size or when max_wait_time has elapsed. The service will benefit from this feature when the traffic is high.
  • Check the arguments doc for other configurations.
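The collection rule above (dispatch on whichever comes first: a full batch or an expired wait window) can be sketched in a few lines. The collect_batch helper here is hypothetical, not mosec's actual implementation, which lives in the Rust layer:

```python
import queue
import time

def collect_batch(tasks: queue.Queue, max_batch_size: int, max_wait_time: float) -> list:
    """Block for the first task, then keep adding tasks until the batch is
    full or max_wait_time seconds have elapsed since the first task arrived."""
    batch = [tasks.get()]  # the wait window starts at the first task
    deadline = time.monotonic() + max_wait_time
    while len(batch) < max_batch_size:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break  # window expired: dispatch a partial batch
        try:
            batch.append(tasks.get(timeout=remaining))
        except queue.Empty:
            break  # nothing else arrived in time
    return batch

q = queue.Queue()
for i in range(6):
    q.put(i)
print(collect_batch(q, max_batch_size=4, max_wait_time=0.01))  # → [0, 1, 2, 3]
print(collect_batch(q, max_batch_size=4, max_wait_time=0.01))  # → [4, 5]
```

The first call returns immediately with a full batch; the second waits out the 10 ms window and dispatches a partial one.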

Deployment

  • If you're looking for a GPU base image with mosec installed, check the official image mosecorg/mosec. For more complex use cases, check out envd.
  • This service doesn't need Gunicorn or NGINX, but you can certainly use the ingress controller when necessary.
  • This service should be the PID 1 process in the container since it controls multiple processes. If you need to run multiple processes in one container, you will need a supervisor. You may choose Supervisor or Horust.
  • Remember to collect the metrics.
    • mosec_service_batch_size_bucket shows the batch size distribution.
    • mosec_service_batch_duration_second_bucket shows the duration of dynamic batching for each connection in each stage (starts from receiving the first task).
    • mosec_service_process_duration_second_bucket shows the duration of processing for each connection in each stage (including the IPC time but excluding the mosec_service_batch_duration_second_bucket).
    • mosec_service_remaining_task shows the number of currently processing tasks.
    • mosec_service_throughput shows the service throughput.
  • Stop the service with SIGINT (CTRL+C) or SIGTERM (kill {PID}) since it has the graceful shutdown logic.
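Mosec installs its own signal handlers; purely to illustrate why SIGINT/SIGTERM enable a graceful shutdown, here is a minimal sketch of how a process traps them to drain in-flight work before exiting (the handle_stop function and shutting_down flag are made up for this sketch):

```python
import os
import signal

shutting_down = False

def handle_stop(signum, frame):
    # Flip a flag instead of dying immediately, so the serving loop can
    # finish in-flight requests and then exit cleanly.
    global shutting_down
    shutting_down = True

signal.signal(signal.SIGTERM, handle_stop)
signal.signal(signal.SIGINT, handle_stop)

os.kill(os.getpid(), signal.SIGTERM)  # simulate `kill {PID}` (POSIX only)
print(shutting_down)  # → True
```

An untrapped SIGKILL, by contrast, would drop in-flight requests, which is why orchestrators should send SIGTERM first.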

Performance tuning

  • Find the best max_batch_size and max_wait_time for your inference service. The metrics show histograms of the real batch size and batch duration, which are the key information for adjusting these two parameters.
  • Try to split the whole inference process into separate CPU and GPU stages (ref DistilBERT). Different stages will be run in a data pipeline, which will keep the GPU busy.
  • You can also adjust the number of workers in each stage. For example, if your pipeline consists of a CPU stage for preprocessing and a GPU stage for model inference, increasing the number of CPU-stage workers can help to produce more data to be batched for model inference at the GPU stage; increasing the GPU-stage workers can fully utilize the GPU memory and computation power. Both ways may contribute to higher GPU utilization, which consequently results in higher service throughput.
  • For multi-stage services, note that the data passing through different stages will be serialized/deserialized by the serialize_ipc/deserialize_ipc methods, so extremely large data might make the whole pipeline slow. The serialized data is passed to the next stage through Rust by default; you can enable shared memory to potentially reduce the latency (ref RedisShmIPCMixin).
  • Choose appropriate serialize/deserialize methods, which are used to decode the user request and encode the response. By default, both use JSON. However, images and embeddings are not well supported by JSON. You can choose msgpack, which is faster and supports binary data (ref Stable Diffusion).
  • Configure the threads for OpenBLAS or MKL. These libraries may not pick the most suitable number of CPUs for the current Python process. You can configure this for each worker via the env parameter (ref custom GPU allocation).
  • Enable HTTP/2 on the client side. mosec automatically adapts to the user's protocol (e.g., HTTP/2) since v0.8.8.
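To make the JSON-vs-binary point above concrete: JSON can only carry bytes after base64 encoding, which inflates the payload by roughly a third. The sizes below use a made-up 76.8 kB blob standing in for a JPEG:

```python
import base64
import json

payload = bytes(range(256)) * 300  # 76,800 bytes standing in for a JPEG

# JSON route: base64-encode the raw bytes, then wrap them in a JSON object.
json_body = json.dumps({"image": base64.b64encode(payload).decode()})

print(len(payload))    # → 76800
print(len(json_body))  # → 102413 (~33% larger than the raw bytes)
```

A binary-safe format such as msgpack carries the raw bytes directly, avoiding this overhead, and its encode/decode is typically faster than JSON's as well.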

Adopters

Here are some of the companies and individual users that are using Mosec.

Citation

If you find this software useful for your research, please consider citing:

@software{yang2021mosec,
  title = {{MOSEC: Model Serving made Efficient in the Cloud}},
  author = {Yang, Keming and Liu, Zichen and Cheng, Philip},
  url = {https://github.com/mosecorg/mosec},
  year = {2021}
}

Contributing

We welcome any kind of contribution. Please give us feedback by raising issues or discussing on Discord. You can also contribute code directly by opening a pull request!

To start developing, you can use envd to create an isolated and clean Python & Rust environment. Check the envd docs or build.envd for more information.
