Machine Learning Operations Toolkit

Project description

⏳ Tempo: The MLOps Software Development Kit

Vision

Enable data scientists to see a productionised machine learning model within moments, not months. Tempo is easy to work with locally and on Kubernetes, whatever your preferred data science tools.

Documentation

Highlights

Tempo provides a unified interface to multiple MLOps projects that enable data scientists to deploy and productionise machine learning systems.

  • Package your trained model artifacts for deployment to optimized server runtimes (TensorFlow, PyTorch, scikit-learn, XGBoost, etc.).
  • Package custom business logic into production servers.
  • Build an inference pipeline of models and orchestration steps.
  • Include any custom Python components as needed. Examples:
    • Outlier detectors with Alibi-Detect.
    • Explainers with Alibi-Explain.
  • Test Locally - Deploy to Production
    • Run with local unit tests.
    • Deploy locally to Docker to test with Docker runtimes.
    • Deploy to production on Kubernetes.
    • Extract declarative Kubernetes YAML to follow GitOps workflows.
  • Support a wide range of production runtimes:
    • Seldon Core open source
    • KFServing open source
    • Seldon Deploy enterprise
  • Create stateful services. Examples:
    • Multi-Armed Bandits.

Workflow

  1. Develop locally.
  2. Test locally on Docker with production artifacts.
  3. Push artifacts to remote bucket store and launch remotely (on Kubernetes).

[Overview diagram of the Tempo workflow]

Motivating Synopsis

Data scientists can easily test their models locally and orchestrate them into pipelines.

Below we define two Models (one SKLearn, one XGBoost) and a function-decorated pipeline that calls both.

from typing import Tuple

import numpy as np

# Tempo imports (module layout as used by the intro demo).
from tempo.serve.metadata import ModelFramework
from tempo.serve.model import Model
from tempo.serve.pipeline import Pipeline, PipelineModels
from tempo.serve.utils import pipeline

# SKLearnFolder, XGBoostFolder, PipelineFolder, SKLearnTag and XGBoostTag are
# constants defined in the intro demo source.

def get_tempo_artifacts(artifacts_folder: str) -> Tuple[Pipeline, Model, Model]:

    # Wrap the pre-trained sklearn artifact for an optimized server runtime.
    sklearn_model = Model(
        name="test-iris-sklearn",
        platform=ModelFramework.SKLearn,
        local_folder=f"{artifacts_folder}/{SKLearnFolder}",
        uri="s3://tempo/basic/sklearn",
    )

    # Wrap the pre-trained xgboost artifact.
    xgboost_model = Model(
        name="test-iris-xgboost",
        platform=ModelFramework.XGBoost,
        local_folder=f"{artifacts_folder}/{XGBoostFolder}",
        uri="s3://tempo/basic/xgboost",
    )

    # Custom business logic: fall back to the xgboost model whenever the
    # sklearn prediction is not class 1.
    @pipeline(
        name="classifier",
        uri="s3://tempo/basic/pipeline",
        local_folder=f"{artifacts_folder}/{PipelineFolder}",
        models=PipelineModels(sklearn=sklearn_model, xgboost=xgboost_model),
    )
    def classifier(payload: np.ndarray) -> Tuple[np.ndarray, str]:
        res1 = classifier.models.sklearn(input=payload)

        if res1[0] == 1:
            return res1, SKLearnTag
        else:
            return classifier.models.xgboost(input=payload), XGBoostTag

    return classifier, sklearn_model, xgboost_model
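
Because the pipeline is a decorated Python function, its routing logic can be exercised with local unit tests before anything is deployed, by stubbing out the wrapped models. A minimal sketch (the test name and stub values are illustrative, and it assumes the PipelineModels attributes can be reassigned as in the demo's tests):

import numpy as np

def test_sklearn_branch_is_used():
    # The artifacts folder is irrelevant here because the wrapped models are stubbed out.
    classifier, _, _ = get_tempo_artifacts("")

    # Replace the wrapped models with stubs so only the orchestration logic runs.
    classifier.models.sklearn = lambda input: np.array([[1]])
    classifier.models.xgboost = lambda input: np.array([[0.1]])

    res, tag = classifier(payload=np.array([[1, 2, 3, 4]]))
    assert tag == SKLearnTag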

Save the pipeline code.

from tempo.serve.loader import save
save(classifier)

Deploy to Docker.

from tempo.seldon.docker import SeldonDockerRuntime

docker_runtime = SeldonDockerRuntime()
docker_runtime.deploy(classifier)      # launch the pipeline and its models as local Docker containers
docker_runtime.wait_ready(classifier)  # block until the containers are ready to serve

Make predictions on containerized servers that would be used in production.

classifier.remote(payload=np.array([[1, 2, 3, 4]]))
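
Per step 3 of the workflow, the saved artifacts are then pushed to the remote bucket store referenced by each model's uri before launching remotely. A minimal sketch, assuming the upload helper in tempo.serve.loader and bucket credentials (e.g. MinIO/S3) already configured:

from tempo.serve.loader import upload

# Push each artifact to the bucket location given by its uri (s3://tempo/basic/...).
upload(sklearn_model)
upload(xgboost_model)
upload(classifier)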

Deploy to Kubernetes for production.

from tempo.serve.metadata import RuntimeOptions, KubernetesOptions

runtime_options = RuntimeOptions(
    k8s_options=KubernetesOptions(
        namespace="production",          # Kubernetes namespace to deploy into
        authSecretName="minio-secret",   # secret with the bucket credentials used to pull the artifacts
    )
)

from tempo.seldon.k8s import SeldonKubernetesRuntime

k8s_runtime = SeldonKubernetesRuntime(runtime_options)
k8s_runtime.deploy(classifier)
k8s_runtime.wait_ready(classifier)
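
Alternatively, to follow the GitOps workflow mentioned in the highlights, the declarative Kubernetes YAML can be extracted rather than applied directly. A sketch, assuming the Kubernetes runtime exposes a to_k8s_yaml helper (as in the Tempo GitOps example):

# Hypothetical extraction of the declarative manifest for GitOps.
yaml_manifest = k8s_runtime.to_k8s_yaml(classifier)
with open("classifier.yaml", "w") as f:
    f.write(yaml_manifest)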

This is an extract from the introduction demo.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mlops-tempo-0.1.0.dev11.tar.gz (30.6 kB)

Uploaded Source

Built Distribution

mlops_tempo-0.1.0.dev11-py3-none-any.whl (54.3 kB)

Uploaded Python 3

File details

Details for the file mlops-tempo-0.1.0.dev11.tar.gz.

File metadata

  • Download URL: mlops-tempo-0.1.0.dev11.tar.gz
  • Upload date:
  • Size: 30.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.7.10

File hashes

Hashes for mlops-tempo-0.1.0.dev11.tar.gz
  • SHA256: ee1cd8bc2eb1de8e677cfa182ad2aa04e754e459e0189cb4a20f56a43d068707
  • MD5: bfc7d869b0f0cf30e1c1512f73c21752
  • BLAKE2b-256: 0949e1da69d8e6c47f8a8be92533c6b582745dc8397b83a26cb902d09fec5e5d

See more details on using hashes here.

File details

Details for the file mlops_tempo-0.1.0.dev11-py3-none-any.whl.

File metadata

  • Download URL: mlops_tempo-0.1.0.dev11-py3-none-any.whl
  • Upload date:
  • Size: 54.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.7.10

File hashes

Hashes for mlops_tempo-0.1.0.dev11-py3-none-any.whl
  • SHA256: e7eb17f4a9969417065724c461043106633457b9906f20ca3313d5fc377c7b12
  • MD5: 4f0f055a1a069494568386312e6cd5b0
  • BLAKE2b-256: e74939797f5dde52c1994fd6a0e1de81e52bc8557062ec563ec6a0897249f7a1

See more details on using hashes here.
