MOSEC

Model Serving made Efficient in the Cloud.

Introduction

Mosec is a high-performance and flexible model serving framework for building ML model-enabled backends and microservices. It bridges the gap between the machine learning model you just trained and an efficient online service API.

  • Highly performant: web layer and task coordination built with Rust 🦀, which offers blazing speed in addition to efficient CPU utilization powered by async I/O
  • Ease of use: user interface purely in Python 🐍, by which users can serve their models in an ML framework-agnostic manner using the same code as they do for offline testing
  • Dynamic batching: aggregate requests from different users for batched inference and distribute results back
  • Pipelined stages: spawn multiple processes for pipelined stages to handle CPU/GPU/IO mixed workloads
  • Cloud friendly: designed to run in the cloud, with model warmup, graceful shutdown, and Prometheus monitoring metrics, easily managed by Kubernetes or any container orchestration system
  • Do one thing well: focus on the online serving part, so users can concentrate on model optimization and business logic

Installation

Mosec requires Python 3.8 or above. Install the latest PyPI package for Linux x86_64 or macOS x86_64/ARM64 with:

pip install -U mosec
# or install with conda
conda install conda-forge::mosec

To build from the source code, install Rust and run the following command:

make package

You will get a mosec wheel file in the dist folder.

Usage

We demonstrate how Mosec can help you easily host a pre-trained stable diffusion model as a service. You need to install diffusers and transformers as prerequisites:

pip install --upgrade diffusers[torch] transformers

Write the server

Click me for the server code with explanations.

Firstly, we import the libraries and set up a basic logger to better observe what happens.

from io import BytesIO
from typing import List

import torch  # type: ignore
from diffusers import StableDiffusionPipeline  # type: ignore

from mosec import Server, Worker, get_logger
from mosec.mixin import MsgpackMixin

logger = get_logger()

Then, we build an API for clients to submit a text prompt and obtain an image, based on the stable-diffusion-v1-5 model, in just 3 steps.

  1. Define your service as a class that inherits mosec.Worker. Here we also inherit MsgpackMixin to employ the msgpack serialization format (a).

  2. Inside the __init__ method, initialize your model and put it onto the corresponding device. Optionally you can assign self.example with some data to warm up(b) the model. Note that the data should be compatible with your handler's input format, which we detail next.

  3. Override the forward method to write your service handler (c), with the signature forward(self, data: Any | List[Any]) -> Any | List[Any]. Whether it receives/returns a single item or a list depends on whether dynamic batching (d) is configured.

class StableDiffusion(MsgpackMixin, Worker):
    def __init__(self):
        self.pipe = StableDiffusionPipeline.from_pretrained(
            "sd-legacy/stable-diffusion-v1-5", torch_dtype=torch.float16
        )
        self.pipe.enable_model_cpu_offload()
        self.example = ["useless example prompt"] * 4  # warmup (batch_size=4)

    def forward(self, data: List[str]) -> List[memoryview]:
        logger.debug("generate images for %s", data)
        res = self.pipe(data)
        logger.debug("NSFW: %s", res[1])
        images = []
        for img in res[0]:
            dummy_file = BytesIO()
            img.save(dummy_file, format="JPEG")
            images.append(dummy_file.getbuffer())
        return images

[!NOTE]

(a) In this example we return an image in binary format, which JSON does not support (unless encoded with base64, which makes the payload larger). Hence, msgpack suits our needs better. If we do not inherit MsgpackMixin, JSON will be used by default. In other words, the protocol of the service request/response can be msgpack, JSON, or any other format (check our mixins).

(b) Warm-up usually helps to allocate GPU memory in advance. If the warm-up example is specified, the service will only be ready after the example has been forwarded through the handler. However, if no example is given, the first request's latency is expected to be longer. The example should be set as a single item or a list depending on what forward expects to receive. Moreover, if you want to warm up with multiple different examples, you may set multi_examples (demo here).

(c) This example shows a single-stage service, where the StableDiffusion worker directly takes in the client's prompt request and responds with the image. Thus the forward method can be considered a complete service handler. However, we can also design a multi-stage service with workers doing different jobs (e.g., downloading images, model inference, post-processing) in a pipeline. In this case, the whole pipeline is considered the service handler, with the first worker taking in the request and the last worker sending out the response. The data flow between workers is handled by inter-process communication.

(d) Since dynamic batching is enabled in this example, the forward method will receive a list of strings, e.g., ['a cute cat playing with a red ball', 'a man sitting in front of a computer', ...], aggregated from different clients for batch inference, improving the system throughput.
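Note (a)'s claim about base64 overhead is easy to verify with the standard library alone (this snippet is independent of Mosec):

```python
import base64

# Simulate a 1 MiB binary image payload.
raw = bytes(range(256)) * 4096  # 1,048,576 bytes

# JSON cannot carry raw bytes, so it would have to ship base64 text instead.
encoded = base64.b64encode(raw)

# base64 maps every 3 input bytes to 4 output characters (~33% larger).
print(f"raw: {len(raw)} bytes, base64: {len(encoded)} bytes, "
      f"ratio: {len(encoded) / len(raw):.2f}")
```

A binary protocol like msgpack ships the bytes as-is and avoids this inflation entirely.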

Finally, we append the worker to the server to construct a single-stage workflow (multiple stages can be pipelined to further boost the throughput; see this example). We also specify the number of processes to run in parallel (num=1) and the maximum batch size (max_batch_size=4, the maximum number of requests dynamic batching will accumulate before the timeout). The timeout is defined by max_wait_time=10 in milliseconds: the longest time Mosec waits before sending the batch to the Worker.

if __name__ == "__main__":
    server = Server()
    # 1) `num` specifies the number of processes that will be spawned to run in parallel.
    # 2) By configuring the `max_batch_size` with the value > 1, the input data in your
    # `forward` function will be a list (batch); otherwise, it's a single item.
    server.append_worker(StableDiffusion, num=1, max_batch_size=4, max_wait_time=10)
    server.run()

Run the server

Click me to see how to run and query the server.

The above snippets are merged in our example file. You may run it directly at the project root. First, have a look at the command line arguments (explanations here):

python examples/stable_diffusion/server.py --help

Then let's start the server with debug logs:

python examples/stable_diffusion/server.py --log-level debug --timeout 30000

Open http://127.0.0.1:8000/openapi/swagger/ in your browser to get the OpenAPI doc.

And in another terminal, test it:

python examples/stable_diffusion/client.py --prompt "a cute cat playing with a red ball" --output cat.jpg --port 8000

You will get an image named "cat.jpg" in the current directory.

You can check the metrics:

curl http://127.0.0.1:8000/metrics

That's it! You have just hosted your stable-diffusion model as a service! 😉

Examples

More ready-to-use examples can be found in the Example section.

Configuration

  • Dynamic batching
    • max_batch_size and max_wait_time (in milliseconds) are configured when you call append_worker.
    • Make sure inference with the max_batch_size value won't cause GPU out-of-memory errors.
    • Normally, max_wait_time should be less than the batch inference time.
    • If enabled, the service collects a batch either when the number of accumulated requests reaches max_batch_size or when max_wait_time has elapsed. The service benefits from this feature when traffic is high.
  • Check the arguments doc for other configurations.
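The collection rule above (flush when the batch is full or when the wait budget is spent, whichever comes first) can be sketched in plain Python; collect_batch is an illustrative name, not a Mosec internal:

```python
import time
from queue import Empty, Queue
from typing import List

def collect_batch(q: Queue, max_batch_size: int, max_wait_time_ms: float) -> List[str]:
    """Gather tasks until the batch is full or the wait budget is spent."""
    batch = [q.get()]  # block for the first task; the wait clock starts here
    deadline = time.monotonic() + max_wait_time_ms / 1000
    while len(batch) < max_batch_size:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break  # max_wait_time elapsed: flush a partial batch
        try:
            batch.append(q.get(timeout=remaining))
        except Empty:
            break
    return batch

q: Queue = Queue()
for prompt in ("a cute cat", "a red ball", "a computer"):
    q.put(prompt)
batch = collect_batch(q, max_batch_size=4, max_wait_time_ms=10)
print(batch)  # all three queued tasks fit within the wait window
```

Tuning is the trade-off this sketch makes visible: a larger max_wait_time yields fuller batches at the cost of per-request latency.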

Deployment

  • If you're looking for a GPU base image with mosec installed, check the official image mosecorg/mosec. For more complex use cases, check out envd.
  • This service doesn't need Gunicorn or NGINX, but you can certainly use an ingress controller when necessary.
  • This service should be the PID 1 process in the container since it controls multiple processes. If you need to run multiple processes in one container, you will need a supervisor. You may choose Supervisor or Horust.
  • Remember to collect the metrics.
    • mosec_service_batch_size_bucket shows the batch size distribution.
    • mosec_service_batch_duration_second_bucket shows the duration of dynamic batching for each connection in each stage (starts from receiving the first task).
    • mosec_service_process_duration_second_bucket shows the duration of processing for each connection in each stage (including the IPC time but excluding the mosec_service_batch_duration_second_bucket).
    • mosec_service_remaining_task shows the number of currently processing tasks.
    • mosec_service_throughput shows the service throughput.
  • Stop the service with SIGINT (CTRL+C) or SIGTERM (kill {PID}) since it has the graceful shutdown logic.
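Acting on those metrics means parsing the Prometheus text format that /metrics returns. The sample below is illustrative (the label set is an assumption, not real server output):

```python
import re

# Illustrative /metrics excerpt in the Prometheus text format.
sample = """\
mosec_service_batch_size_bucket{stage="StableDiffusion",le="1"} 3
mosec_service_batch_size_bucket{stage="StableDiffusion",le="2"} 10
mosec_service_batch_size_bucket{stage="StableDiffusion",le="4"} 25
mosec_service_batch_size_bucket{stage="StableDiffusion",le="+Inf"} 25
"""

def histogram_buckets(text: str, metric: str) -> dict:
    """Map each cumulative bucket's `le` upper bound to its count."""
    pattern = re.compile(re.escape(metric) + r'\{[^}]*le="([^"]+)"[^}]*\} (\d+)')
    return {le: int(count) for le, count in pattern.findall(text)}

buckets = histogram_buckets(sample, "mosec_service_batch_size_bucket")
print(buckets)  # {'1': 3, '2': 10, '4': 25, '+Inf': 25}
```

In practice you would let a Prometheus server scrape /metrics; this parser is only for quick manual inspection.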

Performance tuning

  • Find out the best max_batch_size and max_wait_time for your inference service. The metrics show histograms of the real batch size and batch duration; those histograms are the key information for adjusting these two parameters.
  • Try to split the whole inference process into separate CPU and GPU stages (ref DistilBERT). Different stages will run in a data pipeline, which keeps the GPU busy.
  • You can also adjust the number of workers in each stage. For example, if your pipeline consists of a CPU stage for preprocessing and a GPU stage for model inference, increasing the number of CPU-stage workers can help to produce more data to be batched for model inference at the GPU stage, while increasing the GPU-stage workers can fully utilize the GPU memory and computation power. Both ways may contribute to higher GPU utilization, which consequently results in higher service throughput.
  • For multi-stage services, note that the data passing through different stages will be serialized/deserialized by the serialize_ipc/deserialize_ipc methods, so extremely large data may make the whole pipeline slow. The serialized data is passed to the next stage through Rust by default; you could enable shared memory to potentially reduce the latency (ref RedisShmIPCMixin).
  • Choose appropriate serialize/deserialize methods, which are used to decode the user request and encode the response. By default, both use JSON. However, images and embeddings are not well supported by JSON. You can choose msgpack, which is faster and supports binary data (ref Stable Diffusion).
  • Configure the number of threads for OpenBLAS or MKL, since these libraries may not choose the most suitable thread count for the current Python process. You can configure it for each worker via the env (ref custom GPU allocation).
  • Enable HTTP/2 on the client side. Mosec automatically adapts to the user's protocol (e.g., HTTP/2) since v0.8.8.
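For the OpenBLAS/MKL point, the usual knobs are environment variables read at library load time, so they must be set before the math libraries are imported. The variable names below are standard OpenMP/OpenBLAS/MKL conventions, not Mosec-specific:

```python
import os

# Pin each math library's thread pool before importing numpy/torch
# in the worker; these variables are read once at library load time.
for var in ("OMP_NUM_THREADS", "OPENBLAS_NUM_THREADS", "MKL_NUM_THREADS"):
    os.environ[var] = "1"  # e.g., one BLAS thread per worker process

print({var: os.environ[var]
       for var in ("OMP_NUM_THREADS", "OPENBLAS_NUM_THREADS", "MKL_NUM_THREADS")})
```

With Mosec you can instead pass per-worker environment variables through append_worker's env argument, as the custom GPU allocation example referenced above does.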

Adopters

Here are some of the companies and individual users that are using Mosec.

Citation

If you find this software useful for your research, please consider citing

@software{yang2021mosec,
  title = {{MOSEC: Model Serving made Efficient in the Cloud}},
  author = {Yang, Keming and Liu, Zichen and Cheng, Philip},
  url = {https://github.com/mosecorg/mosec},
  year = {2021}
}

Contributing

We welcome any kind of contribution. Please give us feedback by raising issues or discussing on Discord. You can also contribute directly by opening a pull request!

To start developing, you can use envd to create an isolated and clean Python & Rust environment. Check the envd docs or build.envd for more information.


Download files


Source Distribution

  • mosec-0.8.9.tar.gz (85.1 kB): Source

Built Distributions

  • mosec-0.8.9-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.13, manylinux glibc 2.17+, x86-64
  • mosec-0.8.9-cp313-cp313-macosx_11_0_arm64.whl (4.9 MB): CPython 3.13, macOS 11.0+, ARM64
  • mosec-0.8.9-cp313-cp313-macosx_10_13_x86_64.whl (4.9 MB): CPython 3.13, macOS 10.13+, x86-64
  • mosec-0.8.9-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.12, manylinux glibc 2.17+, x86-64
  • mosec-0.8.9-cp312-cp312-macosx_11_0_arm64.whl (4.9 MB): CPython 3.12, macOS 11.0+, ARM64
  • mosec-0.8.9-cp312-cp312-macosx_10_13_x86_64.whl (4.9 MB): CPython 3.12, macOS 10.13+, x86-64
  • mosec-0.8.9-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.11, manylinux glibc 2.17+, x86-64
  • mosec-0.8.9-cp311-cp311-macosx_11_0_arm64.whl (4.9 MB): CPython 3.11, macOS 11.0+, ARM64
  • mosec-0.8.9-cp311-cp311-macosx_10_9_x86_64.whl (4.9 MB): CPython 3.11, macOS 10.9+, x86-64
  • mosec-0.8.9-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.10, manylinux glibc 2.17+, x86-64
  • mosec-0.8.9-cp310-cp310-macosx_11_0_arm64.whl (4.9 MB): CPython 3.10, macOS 11.0+, ARM64
  • mosec-0.8.9-cp310-cp310-macosx_10_9_x86_64.whl (4.9 MB): CPython 3.10, macOS 10.9+, x86-64
  • mosec-0.8.9-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.9, manylinux glibc 2.17+, x86-64
  • mosec-0.8.9-cp39-cp39-macosx_11_0_arm64.whl (4.9 MB): CPython 3.9, macOS 11.0+, ARM64
  • mosec-0.8.9-cp39-cp39-macosx_10_9_x86_64.whl (4.9 MB): CPython 3.9, macOS 10.9+, x86-64
  • mosec-0.8.9-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.0 MB): CPython 3.8, manylinux glibc 2.17+, x86-64
  • mosec-0.8.9-cp38-cp38-macosx_11_0_arm64.whl (4.9 MB): CPython 3.8, macOS 11.0+, ARM64
  • mosec-0.8.9-cp38-cp38-macosx_10_9_x86_64.whl (4.9 MB): CPython 3.8, macOS 10.9+, x86-64

File details

Details for the file mosec-0.8.9.tar.gz.

File metadata

  • Download URL: mosec-0.8.9.tar.gz
  • Upload date:
  • Size: 85.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for mosec-0.8.9.tar.gz:

  • SHA256: 90db7b7333e7df0ff40718b4b4d76074df1b4a9468b7b68da52648a24d64fa38
  • MD5: c6865116a8c6f2f67cfb2160ce62af43
  • BLAKE2b-256: 8e26650b6f5c5a7c83fdbdcf5ea1b0821a4774dafb2e680f3efd0fc9b87da093

See more details on using hashes here.

