Machine Learning Operations Toolkit
Project description
⏳ Tempo: The MLOps Software Development Kit
Vision
Enable data scientists to see a productionised machine learning model within moments, not months. Tempo is easy to work with locally and in Kubernetes, whatever your preferred data science tools.
Overview
Tempo provides a unified interface to multiple MLOps projects that enable data scientists to deploy and productionise machine learning systems.
Motivating Example
Tempo allows you to interact with scalable orchestration engines such as Seldon Core and KFServing, and to leverage a broad range of machine learning serving runtimes such as TensorFlow Serving, Triton, and MLflow.
```python
import numpy as np

from tempo import Model, ModelFramework, pipeline, predictmethod

sklearn_model = Model(
    name="test-iris-sklearn",
    platform=ModelFramework.SKLearn,
    uri="gs://seldon-models/sklearn/iris")

xgboost_model = Model(
    name="test-iris-xgboost",
    platform=ModelFramework.XGBoost,
    uri="gs://seldon-models/xgboost/iris")

@pipeline(name="mypipeline",
          uri="gs://seldon-models/custom",
          models=[sklearn_model, xgboost_model])
class MyPipeline(object):

    @predictmethod
    def predict(self, payload: np.ndarray) -> np.ndarray:
        res1 = sklearn_model(payload)
        if res1[0][0] > 0.7:
            return res1
        else:
            return xgboost_model(payload)

my_pipeline = MyPipeline()

# Deploy only the models into Kubernetes
my_pipeline.deploy_models()
my_pipeline.wait_ready()

# Run the request using the local pipeline function, calling out to the remote models
my_pipeline.predict(np.array([[4.9, 3.1, 1.5, 0.2]]))
```
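The branching logic above can be exercised locally without any deployment by standing in plain functions for the two models. A minimal sketch (the stub outputs and helper names below are illustrative, not real model predictions or Tempo APIs):

```python
import numpy as np

# Illustrative stand-ins for sklearn_model and xgboost_model, used only
# to exercise the routing rule from MyPipeline.predict.
def sklearn_stub(payload: np.ndarray) -> np.ndarray:
    return np.array([[0.9, 0.1]])  # first score clears the 0.7 threshold

def xgboost_stub(payload: np.ndarray) -> np.ndarray:
    return np.array([[0.4, 0.6]])

def route(payload, sklearn_model, xgboost_model):
    # Same decision rule as MyPipeline.predict: fall back to the
    # xgboost model when the sklearn score is not confident enough.
    res1 = sklearn_model(payload)
    if res1[0][0] > 0.7:
        return res1
    return xgboost_model(payload)

result = route(np.array([[4.9, 3.1, 1.5, 0.2]]), sklearn_stub, xgboost_stub)
print(result[0][0])
```

Because the routing rule is ordinary Python, it can be unit-tested like any other function before the models are deployed.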
Productionisation Workflows
Declarative Interface
Although Tempo provides a dynamic, imperative interface, pipelines can also be converted into a declarative representation of their components.
```python
yaml = my_pipeline.to_k8s_yaml()
print(yaml)
```
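The resulting manifest can then be handed to standard Kubernetes tooling. A minimal sketch, where `manifest` stands in for the string returned by `to_k8s_yaml()` (the manifest contents and file name below are illustrative placeholders, not Tempo's real output):

```python
# `manifest` stands in for the YAML string returned by to_k8s_yaml();
# the content here is an illustrative placeholder.
manifest = "apiVersion: machinelearning.seldon.io/v1\nkind: SeldonDeployment\n"

# Write it to disk so it can be version-controlled or applied with
# `kubectl apply -f mypipeline.yaml`.
with open("mypipeline.yaml", "w") as f:
    f.write(manifest)

with open("mypipeline.yaml") as f:
    first_line = f.read().splitlines()[0]
print(first_line)
```

Keeping the generated manifest in version control is one way to get GitOps-style review of pipeline changes on top of the imperative workflow.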
Environment Packaging
You can also manage the environments of your pipelines, making local and production environments reproducible.
```python
@pipeline(name="mypipeline",
          uri="gs://seldon-models/custom",
          conda_env="tempo",
          models=[sklearn_model, xgboost_model])
class MyPipeline(object):
    # ...

my_pipeline = MyPipeline()

# Save the full conda environment of the pipeline
my_pipeline.save()

# Upload the full conda environment
my_pipeline.upload()

# Deploy the full pipeline remotely
my_pipeline.deploy()
my_pipeline.wait_ready()

# Run the request against the remotely deployed pipeline
my_pipeline.remote(np.array([[4.9, 3.1, 1.5, 0.2]]))
```
Examples
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file mlops-tempo-0.1.0.dev6.tar.gz.
File metadata
- Download URL: mlops-tempo-0.1.0.dev6.tar.gz
- Upload date:
- Size: 24.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.57.0 CPython/3.7.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4aa3130018c9e01a55e39af1921ee6ea489975510b396981e20dc9732bb0cae8
MD5 | 276bcc7ef269f72f0b1b56817df63242
BLAKE2b-256 | 467b8a0bb513a59f2602be019f3f0188712b9b56491006c77166e6d979d88c98
File details
Details for the file mlops_tempo-0.1.0.dev6-py3-none-any.whl.
File metadata
- Download URL: mlops_tempo-0.1.0.dev6-py3-none-any.whl
- Upload date:
- Size: 39.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.57.0 CPython/3.7.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | ab7a79b10270255577576e66897a34ba2db30c5de38a94a8f330df45d1e3d9ca
MD5 | e4bbfd5f7f523000281ed684a4a77259
BLAKE2b-256 | b8e4cf48d7484baba3786cb2968616c77ba300b3162a75365a7dbd43c2f30944