moriarty


Moriarty is a set of components for building asynchronous inference clusters.

By relying on a cloud vendor's queue service or a self-hosted global queue, an asynchronous inference cluster can be built without exposing any ports to the public internet.

Why asynchronous inference, why moriarty?

  • Prevent client timeouts.
  • Avoid HTTP disconnections caused by network or other issues.
  • Reduce the number of HTTP requests by using queues.
  • Deploy on multi-cloud, hybrid-cloud, or private-cloud environments, or even on bare metal.

Alternatives

This project grew out of my extensive use of Asynchronous Inference on AWS SageMaker, and as far as I know, only AWS and Aliyun provide asynchronous inference support.

Among open source projects there are many deployment solutions, but most of them target synchronous inference (based on HTTP or RPC); I have not found an alternative for asynchronous inference. A Kubeflow pipeline could perhaps be used for asynchronous inference, but without serving support (keeping the model on the GPU as a long-lived service rather than loading it per job), the GPU memory churn and model load time add significant overhead.
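
As a rough illustration of that overhead (this is not moriarty code; load_model and run_inference below are simulated stand-ins), compare a pipeline-style worker that reloads the model for every job with a long-lived serving process that loads it once:

import time

def load_model(path: str) -> str:
    # Stand-in for a real model load: disk I/O plus transfer into GPU memory.
    time.sleep(5)
    return f"model({path})"

def run_inference(model: str, job: dict) -> dict:
    return {"model": model, "output": job}

def handle_job_per_load(job: dict) -> dict:
    # Pipeline-style: the load cost is paid again for every single job.
    model = load_model("weights.bin")
    return run_inference(model, job)

# Serving-style: load once at startup, then reuse the resident model per job.
MODEL = load_model("weights.bin")

def handle_job_serving(job: dict) -> dict:
    return run_inference(MODEL, job)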

Architecture Overview


Key Components:

  • Matrix: single producer, multiple consumers. The Connector is the producer: it provides an HTTP API for the backend service and pushes invocation requests to the global job queue. The Operator is a consumer: it pulls tasks from the job queue and pushes them to a local queue, deciding whether to pull based on the current load of the inference cluster; it also autoscales inference containers when needed.
  • Endpoint: Deploy a function as an HTTP service.
  • Sidecar: Proxy that transforms queue messages into HTTP requests (see the sketch after this list).
  • Init: Initialization script for the inference container.
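
As a rough sketch of the Sidecar's role (not moriarty's actual implementation; the local endpoint address, the message layout, and the use of the requests library are assumptions), a pulled queue message could be forwarded to the inference endpoint served inside the same container roughly like this:

import requests  # third-party HTTP client, used here purely for illustration

def forward_message(message: dict) -> dict:
    # Illustrative only: turn a pulled queue message into an HTTP request
    # against the inference endpoint served in the same container.
    response = requests.post(
        "http://127.0.0.1:8080/invoke",   # hypothetical local endpoint address
        json=message.get("payload", {}),  # hypothetical message layout
        timeout=600,                      # generous timeout: inference can be slow
    )
    response.raise_for_status()
    return response.json()

Because the HTTP hop stays inside the container, the inference service never needs a publicly reachable port; only the queue is shared.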

CLIs:

  • moriarty-matrix: Manage the Matrix components
  • moriarty-operator: Start the Operator component
  • moriarty-connector: Start the Connector component
  • moriarty-sidecar: Start the Sidecar component
  • moriarty-deploy: Call the Matrix Operator's API or database to deploy an inference endpoint

Install

Install all components:

pip install moriarty[matrix]

Or use the Docker image:

docker pull wh1isper/moriarty

For the development version:

docker pull wh1isper/moriarty:dev

Develop

Install pre-commit before committing:

pip install pre-commit
pre-commit install

Install the package locally with test dependencies:

pip install -e .[test]

Run tests with pytest

pytest -v tests/


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

moriarty-0.0.6.tar.gz (182.5 kB)

Built Distribution

moriarty-0.0.6-py3-none-any.whl (54.2 kB)

File details

Details for the file moriarty-0.0.6.tar.gz.

File metadata

  • Download URL: moriarty-0.0.6.tar.gz
  • Upload date:
  • Size: 182.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.4

File hashes

Hashes for moriarty-0.0.6.tar.gz:

  • SHA256: d899384cd1414fa993a4349fd1206241484b3f4ba5d56f359d883db4f27b4e04
  • MD5: 128b1041da8c2323b3f18a86b60abdf9
  • BLAKE2b-256: 99081388cc8e90f63497922dd7115cb828020d0ffe9657a14315d103e64bbee7

See more details on using hashes here.
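
For example, the sdist can be checked against the SHA256 digest above with a few lines of Python (assuming the archive has been downloaded to the current directory):

import hashlib

EXPECTED_SHA256 = "d899384cd1414fa993a4349fd1206241484b3f4ba5d56f359d883db4f27b4e04"

# Hash the downloaded archive and compare it with the digest published above.
with open("moriarty-0.0.6.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "MISMATCH")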

File details

Details for the file moriarty-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: moriarty-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 54.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.4

File hashes

Hashes for moriarty-0.0.6-py3-none-any.whl:

  • SHA256: 715cbda0e6f9c423308b34f1682ccb29c5bb85f9ef046beda829e92e003e47a2
  • MD5: 10b373486206e960ae8d8f7c50b02bbd
  • BLAKE2b-256: fa55d87281470176640a789d736f79bbc57b467b4abb1c9b894d292cf151e4db

See more details on using hashes here.
