
A minimalistic auto-deployment tool for ML models

Project description

DeployMe


If you have been working on ML models, then you have probably faced the task of deploying these models. Perhaps you are participating in a hackathon or want to show your work to management.

According to our survey, more than 60% of the data scientists we polled have faced this task, and more than 60% of respondents spent over half an hour building such a service.

The most common solution is to wrap the model in some kind of web framework (like Flask).

Our team believes that it can be made even easier!

Our tool automatically collects all the necessary files and dependencies, creates a Docker container, and launches it, all with a single line of source code.

Pipeline

  1. First, we initialize the project directory for the next steps;
  2. Next, we serialize your machine learning models (for example, with Joblib or Pickle);
  3. Then we generate the final .py file, containing the endpoint handlers, from templates. Handlers are chosen based on your models, and templates based on your preferences (templates are also .py files using, for example, Sanic or Flask);
  4. We copy or additionally generate the remaining necessary files (e.g. a Dockerfile);
  5. Next, we compile the API documentation for your project;
  6. Finally, we build a Docker container or a Python package, or simply leave the final directory as-is, so you can deploy your project to Kubernetes or Heroku.
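To make the first two steps concrete, here is a rough, stdlib-only sketch of what deployme automates; `DummyModel` and the `build/` directory are placeholders, not deployme internals:

```python
import pickle
import pathlib


class DummyModel:
    """Stand-in for a fitted estimator; assumes any picklable
    object with a predict() method works, as in step 2."""

    def predict(self, rows):
        return [0 for _ in rows]


# Step 1: initialize the project directory.
project = pathlib.Path("build")
project.mkdir(exist_ok=True)

# Step 2: serialize the model (deployme uses Joblib or Pickle).
(project / "model.pkl").write_bytes(pickle.dumps(DummyModel()))

# Steps 3-6, sketched as comments: deployme then renders a template .py
# file with endpoint handlers (e.g. Flask or Sanic), copies a Dockerfile,
# compiles the API docs, and builds the Docker image or Python package.
restored = pickle.loads((project / "model.pkl").read_bytes())
print(restored.predict([[5.8, 2.7, 3.9, 1.2]]))
```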

Prerequisites

For local runs, your machine must have Docker and Python >= 3.8.

Installation

Install deployme with pip:

pip install deployme

or with your favorite package manager.

Example

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

from deployme import cook

X, y = load_iris(return_X_y=True, as_frame=True)

clf = RandomForestClassifier()
clf.fit(X, y)

cook(strategy="docker", model=clf, port=5010)

After running the script, you will see a new Docker container. To interact with the service, simply open the URL that is logged when the script runs.

On that page you will find a Swagger UI where you can test simple requests (examples included). For direct POST requests you can use curl:

curl -X POST "http://127.0.0.1:5010/predict" \
  -H "accept: application/json" \
  -H "Content-Type: application/json" \
  -d '{"data":[{"sepal length (cm)":5.8,"sepal width (cm)":2.7,"petal length (cm)":3.9,"petal width (cm)":1.2}]}'
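The same request can also be issued from Python. A small sketch follows; the `requests.post` call is commented out because it assumes the service started by `cook` above is actually listening (the port should match the one passed to `cook`):

```python
import json

# The same request body as the curl example, built as a Python dict.
# The feature names match the iris columns used when fitting the model.
payload = {
    "data": [
        {
            "sepal length (cm)": 5.8,
            "sepal width (cm)": 2.7,
            "petal length (cm)": 3.9,
            "petal width (cm)": 1.2,
        }
    ]
}
body = json.dumps(payload)
print(body)

# To actually send it (requires the service to be running locally):
# import requests
# resp = requests.post(
#     "http://127.0.0.1:5010/predict",
#     headers={"Content-Type": "application/json"},
#     data=body,
# )
# print(resp.json())
```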

Models support

Currently, we support models from the following libraries:

  • sklearn
  • xgboost
  • catboost
  • lightgbm

RoadMap

  1. Deployment to Heroku & clusters
  2. Basic model visualization
  3. Tighter integration with LightAutoML
  4. Support for many popular ML frameworks, such as XGBoost, TensorFlow, CatBoost, etc.
  5. Your ideas!

Contribution

We are always open to your contributions! Please check our issues and open a PR.
