moriarty

Moriarty is a set of components for building asynchronous inference clusters.

By relying on a cloud vendor's global queue service (or a self-hosted one), you can build an asynchronous inference cluster without exposing any ports to the public internet.

Why asynchronous inference, why moriarty?

  • Prevent client timeouts.
  • Avoid HTTP disconnections caused by network or other issues.
  • Reduce HTTP traffic by routing work through queues.
  • Deploy on multi-cloud, hybrid-cloud, or private-cloud environments, even on bare metal.

Alternatives

This project grew out of my extensive use of Asynchronous Inference on AWS SageMaker; as far as I know, only AWS and Aliyun provide managed asynchronous inference support.

Among open-source projects there are many deployment solutions, but most of them perform synchronous inference (over HTTP or RPC); I have not found an open-source alternative for asynchronous inference. Kubeflow Pipelines could perhaps be used for asynchronous inference, but without serving support (keeping the model loaded on the GPU as a service rather than loading it for every job), the GPU memory and model-loading overhead is significant.
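To make the serving-versus-per-job distinction concrete, here is a minimal, framework-agnostic sketch; load_model and predict are hypothetical placeholders, not moriarty or Kubeflow APIs:

import time

def load_model():
    # Hypothetical placeholder: real model loading reads large weight
    # files from disk into GPU memory and can take tens of seconds.
    time.sleep(5)
    return object()

def predict(model, payload):
    # Hypothetical placeholder for a single inference call.
    return {"input": payload, "output": "..."}

def run_job_without_serving(payload):
    # Pipeline-style execution: every job pays the full load cost again.
    model = load_model()
    return predict(model, payload)

# Serving-style execution: the model is loaded once, kept on the GPU,
# and reused, so each request only pays the inference cost.
MODEL = load_model()

def run_job_with_serving(payload):
    return predict(MODEL, payload)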

Architecture Overview

(architecture diagram)

Key Components:

  • Matrix: a single producer with multiple consumers. The Connector acts as the producer: it exposes an HTTP API to the backend service and pushes invocation requests to the global Job Queue. The Operator acts as a consumer: it pulls tasks from the Job Queue and pushes them to a local queue, deciding whether to pull based on the load of the inference cluster; it also autoscales inference containers when needed. See the request-flow sketch after this list.
  • Endpoint: Deploys a function as an HTTP service.
  • Sidecar: Proxies and transforms queue messages into HTTP requests for the inference container.
  • Init: Initialization script for the inference container.
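The sketch below illustrates this flow from the backend service's point of view. The Connector address, route, and payload schema are assumptions for illustration only, not the documented moriarty API:

import json
from urllib import request

# Assumed values for illustration only; the real Connector address,
# route, and payload schema depend on your deployment.
CONNECTOR_URL = "http://localhost:8080/invoke"  # hypothetical route
ENDPOINT_NAME = "my-model"                      # hypothetical endpoint name

def submit_invocation(payload: dict) -> dict:
    # Push an invocation request to the Connector, which enqueues it on
    # the global Job Queue; an Operator later pulls it and forwards it
    # to an inference container through the Sidecar.
    body = json.dumps({"endpoint": ENDPOINT_NAME, "payload": payload}).encode()
    req = request.Request(
        CONNECTOR_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        # Asynchronous inference: the response is an acknowledgement
        # (e.g. a job identifier), not the inference result itself.
        return json.loads(resp.read())

if __name__ == "__main__":
    print(submit_invocation({"prompt": "hello"}))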

CLIs:

  • moriarty-matrix: Manage matrix components
  • moriarty-operator: Start the operator component
  • moriarty-connector: Start the connector component
  • moriarty-sidecar: Start the sidecar component
  • moriarty-deploy: Call the operator's API or database to deploy an inference endpoint.

Install

Run pip install moriarty[matrix] to install all components.

Or use the Docker image

docker pull wh1isper/moriarty

docker pull wh1isper/moriarty:dev for the development version
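After a pip-based install, you can confirm the package is present using only the standard library (no moriarty-specific API is assumed):

from importlib.metadata import version

# Prints the installed moriarty version, e.g. "0.0.8".
print(version("moriarty"))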

Develop

Install pre-commit before committing

pip install pre-commit
pre-commit install

Install the package locally with test dependencies

pip install -e .[test]

Run tests with pytest

pytest -v tests/


