
A tool to deploy MLOps tooling at the click of a button.

Project description


Open source MLOps infrastructure deployment on Public Cloud providers

Open source MLOps: Open source tools for different stages in an MLOps lifecycle.
Public Cloud Providers: Supporting all major cloud providers including AWS, GCP, Azure and Oracle Cloud


mlinfra is the Swiss Army knife for deploying MLOps tooling anywhere. It aims to make MLOps infrastructure deployment easy and accessible to every ML team by decoupling the IaC logic for creating MLOps stacks from any particular framework.

Contribute to the project by opening an issue, or join the roadmap and design discussions on Discord. The complete roadmap will be released soon!

🚀 Installation

Requirements

mlinfra requires the following to run:

  • terraform >= 1.10.2 installed on the system.

mlinfra can be installed by creating a Python virtual environment and installing the mlinfra pip package:

python -m venv .venv
source .venv/bin/activate
pip install mlinfra

Copy a deployment config from the examples folder, set your AWS account in the config file, configure your AWS credentials, and deploy the configuration with:

mlinfra terraform apply --config-file <path-to-your-config>

For more information, read the mlinfra user guide.

Deployment Config

mlinfra deploys infrastructure using a declarative approach. It requires resources to be defined in a YAML file with the following format:

name: aws-mlops-stack
provider:
  name: aws
  account-id: xxxxxxxxx
  region: eu-central-1
deployment:
  type: cloud_vm # (this would create EC2 instances and then deploy the applications on them)
stack:
  data_versioning:
    - lakefs # can also be pachyderm or neptune and so on
  experiment_tracker:
    - mlflow # can be weights and biases or determined, or neptune or clearml and so on...
  orchestrator:
    - zenml # can also be argo, or luigi, or airflow, or dagster, or prefect or flyte or kubeflow or ray and so on...
  model_inference:
    - bentoml # can also be ray or KFServing or Seldon Core or TF Serving
  monitoring:
    - nannyML # can be grafana or alibi or evidently or neptune or prometheus or weaveworks and so on...
  alerting:
    - mlflow # can be mlflow or neptune or determined or weaveworks or prometheus or grafana and so on...
  • For examples, check out the documentation.

  • NOTE: This is a minimal spec for AWS cloud as infra with custom applications. Other stacks such as feature_store, event streamers, loggers or cost dashboards can be added via community requests. For more information, please check out the docs.
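As an illustration of the format above, a quick structural sanity check of a config can be sketched with plain Python (this mirrors the example config as a dict; it is not mlinfra's own validation logic, just a minimal sketch using the key names shown above):

```python
# Minimal structural check for a deployment config, expressed as a Python
# dict mirroring the YAML example. Illustrative only; not part of mlinfra.
REQUIRED_TOP_LEVEL = {"name", "provider", "deployment", "stack"}

def validate(config: dict) -> list[str]:
    """Return a list of human-readable problems; empty means the shape looks OK."""
    errors = []
    missing = REQUIRED_TOP_LEVEL - config.keys()
    if missing:
        errors.append(f"missing top-level keys: {sorted(missing)}")
    provider = config.get("provider", {})
    for key in ("name", "account-id", "region"):
        if key not in provider:
            errors.append(f"provider is missing '{key}'")
    return errors

config = {
    "name": "aws-mlops-stack",
    "provider": {"name": "aws", "account-id": "xxxxxxxxx", "region": "eu-central-1"},
    "deployment": {"type": "cloud_vm"},
    "stack": {"experiment_tracker": ["mlflow"]},
}
print(validate(config))  # an empty list: the example config is well-formed
```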

Supported Providers

The core purpose is to support all major cloud and deployment platforms. A user should be able to change the cloud provider or runtime environment (whether Linux or Windows) and still deploy the same tools.

mlinfra will support the following providers:

Local machine (for development)

Cloud providers (for production-ready deployments)

Supported deployment types

When deploying on managed cloud providers, users can deploy their infrastructure on top of any of the supported deployment types.

Supported MLOps Tools

mlinfra aims to support as many MLOps tools as possible, deployable both standalone and in high-availability configurations, across the different layers of an MLOps stack:

  • data_ingestion
  • data_versioning
  • data_processing
  • vector_database
  • experiment_tracker
  • orchestrator
  • model_inference
  • monitoring
  • alerting
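These layers correspond to the keys under stack: in the deployment config. Conceptually, a stack is a mapping from layer to the tools deployed in it, which can be sketched like this (tool names taken from the example config above, purely illustrative):

```python
# A stack as a layer -> tools mapping, mirroring the example config.
# Illustrative only; mlinfra itself reads this from the YAML file.
stack = {
    "data_versioning": ["lakefs"],
    "experiment_tracker": ["mlflow"],
    "orchestrator": ["zenml"],
    "model_inference": ["bentoml"],
    "monitoring": ["nannyML"],
    "alerting": ["mlflow"],
}

def tools_for(stack: dict, layer: str) -> list[str]:
    """Tools configured for a layer; empty list if the layer is not in the stack."""
    return stack.get(layer, [])

print(tools_for(stack, "orchestrator"))  # ['zenml']
```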

Development

  • This project relies on terraform for the IaC layer and Python to glue it all together.
  • To get started, install terraform and Python.
  • Install the required Python packages by running uv sync.
  • You can run any of the available examples from the examples/ folder by running python src/mlinfra/cli/cli.py terraform <action> --config-file examples/<deployment-type>/<file>.yaml from the root directory, where <action> corresponds to terraform actions such as plan, apply and destroy.

For more information on the different components of the project and how they work together, please refer to the project's Engineering Wiki (https://mlinfra.io/user_guide/).

Contributions

  • Contributions are welcome! Help us onboard all of the available MLOps tools onto the currently supported cloud providers.
  • For major changes, please open an issue first to discuss what you would like to change. A team member will get back to you soon.
  • For information on the general development workflow, see the contribution guide.

License

The mlinfra library is distributed under the Apache-2.0 license.

Project details


Download files

Download the file for your platform.

Source Distribution

mlinfra-0.0.29.tar.gz (8.1 MB)

Uploaded Source

Built Distribution


mlinfra-0.0.29-py3-none-any.whl (151.6 kB)

Uploaded Python 3

File details

Details for the file mlinfra-0.0.29.tar.gz.

File metadata

  • Download URL: mlinfra-0.0.29.tar.gz
  • Size: 8.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mlinfra-0.0.29.tar.gz
  • SHA256: 0bb50c69188677b8c6d42520f36683613fcba8742ae6b4244b7298c910e6acba
  • MD5: 10b697d4c53efce8796016ad5105373d
  • BLAKE2b-256: 249c598125434e78e6ee4a8d9ffb4c3ed9e9249211f9ffc88e39f6e98709ff97
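To check a downloaded archive against the published digest, the file can be hashed in chunks with the standard library (a minimal sketch; the filename assumes the sdist was downloaded into the current directory):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Stream the file in chunks so large archives are not loaded fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Published SHA256 for mlinfra-0.0.29.tar.gz (from the table above):
expected = "0bb50c69188677b8c6d42520f36683613fcba8742ae6b4244b7298c910e6acba"
# print(sha256_of("mlinfra-0.0.29.tar.gz") == expected)
```

Alternatively, pip's hash-checking mode (pip install --require-hashes with a requirements file) performs the same verification automatically.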


File details

Details for the file mlinfra-0.0.29-py3-none-any.whl.

File metadata

  • Download URL: mlinfra-0.0.29-py3-none-any.whl
  • Size: 151.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mlinfra-0.0.29-py3-none-any.whl
  • SHA256: d40aa324a56148b7a79800f0a9c42be842dc79b324edddba54fa488d9ff4c007
  • MD5: 253a56d4d4ca0c5eaf774c1b395aab2b
  • BLAKE2b-256: 19adfaae4a28cffd55957e46fef26ae5394fd18f4aa84dcecffb6c8e320dccf7

