
moriarty

Moriarty is a set of components for building asynchronous inference clusters.

By relying on a cloud vendor's queue service or a self-hosted global queue service, you can build an asynchronous inference cluster without exposing any ports to the public internet.

Why asynchronous inference, why moriarty?

  • Prevent client timeouts.
  • Avoid HTTP disconnections caused by network or other issues.
  • Reduce the number of HTTP requests by using queues.
  • Deploy on multi-cloud, hybrid-cloud, or private-cloud environments, or even on bare metal.

Alternatives

This project grew out of my heavy use of Asynchronous Inference on AWS SageMaker; as far as I know, only AWS and Aliyun provide asynchronous inference support.

Among open source projects there are many deployment solutions, but most of them target synchronous inference (based on HTTP or RPC). I have not found an alternative for asynchronous inference. Kubeflow Pipelines could perhaps be used for asynchronous inference, but without serving support (keeping the model loaded on the GPU as a long-lived service instead of loading it for every job), there is significant overhead from GPU memory caching and model load time, as sketched below.
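
To make that overhead concrete, here is a minimal, purely illustrative Python sketch contrasting load-per-job execution with a resident, serving-style model. load_model, run_job_cold, and run_job_warm are hypothetical placeholders, not moriarty or Kubeflow APIs.

# Conceptual sketch only: why "load per job" is expensive compared to keeping
# the model resident as a service. Everything here is a placeholder.
import time

def load_model():
    # Stand-in for loading weights into GPU memory; for large models this can
    # take tens of seconds and several GiB of GPU memory.
    time.sleep(2)
    return lambda x: x  # dummy "model"

# Load-per-job (pipeline-style): the load cost is paid on every job.
def run_job_cold(payload):
    model = load_model()      # load overhead on each job
    return model(payload)

# Serving-style (what moriarty targets): load once, reuse for every job.
MODEL = load_model()          # paid once at container startup

def run_job_warm(payload):
    return MODEL(payload)     # only inference time per job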

Architecture Overview


Key Components:

  • Matrix: single producer, multiple consumers. The Connector acts as the producer: it provides an HTTP API for the backend service and pushes invoke requests to the global Job Queue. The Operator acts as a consumer: it pulls tasks from the Job Queue and pushes them to a local queue, pulling only when the inference cluster has spare capacity. The Operator also autoscales inference containers when needed (see the sketch after this list).
  • Endpoint: Deploys a function as an HTTP service.
  • Sidecar: A proxy that transforms queue messages into HTTP requests.
  • Init: The init script for the inference container.
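
The sketch below illustrates the Matrix flow using nothing but the Python standard library: a single producer enqueues invoke requests to a global job queue, and consumers pull only while they have spare capacity. All names are illustrative; this is not moriarty's actual API, and the real global queue would be a cloud or self-hosted queue service rather than queue.Queue.

# Conceptual sketch of the single-producer / multiple-consumer pattern.
import queue
import threading
import time

global_job_queue = queue.Queue()  # stands in for the global Job Queue service

def connector(n_jobs):
    """Producer (Connector): accept invoke requests (e.g. via HTTP) and enqueue them."""
    for i in range(n_jobs):
        global_job_queue.put(f"invoke-request-{i}")

def operator(name, capacity):
    """Consumer (Operator): pull jobs only while the local cluster has spare capacity."""
    local_queue = queue.Queue()
    while True:
        if local_queue.qsize() >= capacity:   # back-pressure: cluster is busy, stop pulling
            time.sleep(0.1)
            continue
        try:
            job = global_job_queue.get(timeout=1)
        except queue.Empty:
            break                             # nothing left to pull
        local_queue.put(job)                  # in moriarty, the sidecar turns this into an HTTP request
        print(f"{name} accepted {job}")
        local_queue.get()                     # pretend the inference finished immediately

connector(5)
operators = [threading.Thread(target=operator, args=(f"operator-{i}", 2)) for i in range(2)]
for t in operators:
    t.start()
for t in operators:
    t.join()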

CLIs:

  • moriarty-matrix: Manage matrix components
  • moriarty-operator: Start the operator component
  • moriarty-connector: Start the connector component
  • moriarty-sidecar: Start the sidecar component
  • moriarty-deploy: Call the operator's API or database to deploy an inference endpoint.

Install

pip install moriarty[matrix] to install all components.

Or use the Docker image

docker pull wh1isper/moriarty

docker pull wh1isper/moriarty:dev for the development version

Develop

Install pre-commit before committing

pip install pre-commit
pre-commit install

Install the package locally with test dependencies

pip install -e .[test]

Run tests with pytest

pytest -v tests/

Download files

Download the file for your platform.

Source Distribution

moriarty-0.0.12.tar.gz (185.7 kB)

Uploaded Source

Built Distribution

moriarty-0.0.12-py3-none-any.whl (57.6 kB)

Uploaded Python 3

File details

Details for the file moriarty-0.0.12.tar.gz.

File metadata

  • Download URL: moriarty-0.0.12.tar.gz
  • Upload date:
  • Size: 185.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.4

File hashes

Hashes for moriarty-0.0.12.tar.gz

  • SHA256: f689f02bfa0ec5f6a0d157e4c9cd229d9b624fb46251d297db62f3fd18487f4f
  • MD5: f8254326bcc18da02f3eab7b60757839
  • BLAKE2b-256: 690bf6311366a0353c97041c9c4feb6fd2b0f2e3e22e4130dfca80f44a12c1fc
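
If you want to check a downloaded file against the digests above, a short Python snippet is enough; this assumes moriarty-0.0.12.tar.gz is already in the current directory.

# Verify a downloaded release file against the published SHA256 digest.
import hashlib

EXPECTED = "f689f02bfa0ec5f6a0d157e4c9cd229d9b624fb46251d297db62f3fd18487f4f"

with open("moriarty-0.0.12.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, f"hash mismatch: {digest}"
print("sha256 OK")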


File details

Details for the file moriarty-0.0.12-py3-none-any.whl.

File metadata

  • Download URL: moriarty-0.0.12-py3-none-any.whl
  • Upload date:
  • Size: 57.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.4

File hashes

Hashes for moriarty-0.0.12-py3-none-any.whl

  • SHA256: c5e7c99361269e3e26eb4e72ebe24986ec80f33425fc4b74e7319ced5b0769f8
  • MD5: f7cd25d50beff74262574fe63a3e55fa
  • BLAKE2b-256: 8e8111e11e6940e8a3b9ba2f098e4ee0fa4edb2063ffbc06788c61c32b5a4ed7

