MLchain Python Library

Project description



MLChain: Auto-magically deploy AI models at large scale, with high performance, easy to use

Explore the docs »

Our Website · Examples in Python

MLChain is a simple, easy-to-use library that lets you deploy your Machine Learning model to a hosting server quickly and efficiently, drastically reducing the time required to build an API that supports an end-to-end AI product.

Key features

  • Fast: MLChain prioritizes speed above all other criteria.

  • Fast to code: Starting from a finished Machine Learning model, it takes about 4 minutes on average to deploy a fully functioning API with MLChain.

  • Flexible: MLChain supports adaptive, end-to-end development, with your choice of serializer and hosting framework.

  • Less debugging: We get it. Humans make mistakes. MLChain's configuration makes debugging far easier, and often unnecessary.

  • Easy to code: a piece of cake!

  • Standards-based: Built on the open standards for APIs: OpenAPI (previously known as Swagger), along with JSON Schema and other options.

Installation

MLChain requires Python 3.6 or above.
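If you are not sure which interpreter pip will use, a quick check with the standard library (nothing MLChain-specific) can save a failed install:

```python
import sys

# MLChain needs Python 3.6+; bail out early with a clear message otherwise.
if sys.version_info < (3, 6):
    raise RuntimeError("MLChain requires Python 3.6 or above")
print("Python %d.%d - OK" % (sys.version_info.major, sys.version_info.minor))
```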

PyPI

To install the latest stable version of MLChain, simply run:

pip install mlchain

Build from source

If you can't wait for the next release, install the most up-to-date code from the master branch by running the following commands:

git clone https://github.com/Techainer/mlchain-python
cd mlchain-python
pip install -r requirements.txt
python setup.py install

Or install directly from GitHub with pip:

pip install git+https://github.com/Techainer/mlchain-python@master --upgrade

Documentation

Read our documentation here

Demo

Here's a minimal example of serving a dummy Python class.

Create a server.py file:

import cv2
import numpy as np
from mlchain.base import ServeModel


class Model():
    """ Just a dummy model """

    def predict(self, image: np.ndarray):
        """
        Resize input to 100 by 100.
        Args:
            image (numpy.ndarray): An input image.
        Returns:
            The image (np.ndarray) resized to 100 by 100.
        """
        image = cv2.resize(image, (100, 100))
        return image


# Define model
model = Model()

# Serve model
serve_model = ServeModel(model)

# Deploy model
if __name__ == '__main__':
    from mlchain.server import FlaskServer
    # Run the Flask server with up to 12 threads
    FlaskServer(serve_model).run(port=5000, threads=12)

Now run:

python server.py

And you should see something like this:

[mlchain-logger]:[7895] [2020-08-18 09:53:02 +0700]-[INFO]-[flask_server.py:424]---------------------------------------------------------------------------------
[mlchain-logger]:[7895] [2020-08-18 09:53:02 +0700]-[INFO]-[flask_server.py:425]-Served model with Flask at host=127.0.0.1, port=5000
[mlchain-logger]:[7895] [2020-08-18 09:53:02 +0700]-[INFO]-[flask_server.py:426]-Debug = False
[mlchain-logger]:[7895] [2020-08-18 09:53:02 +0700]-[INFO]-[flask_server.py:427]---------------------------------------------------------------------------------

Now you can access your API at http://localhost:5000

You can open Swagger UI at http://localhost:5000/swagger and try your API out right away

(Screenshot: Swagger UI listing the generated API endpoints)

After exploring all your API endpoints there, create a client.py file:

import numpy as np
from mlchain.client import Client

model = Client(api_address='http://localhost:5000').model()
# Create a dummy input with shape (200, 200)
input_image = np.ones((200, 200), dtype=np.uint8)
# Then pass it through our client just like a normal Python call
result_image = model.predict(input_image)
print(result_image.shape)  # And the result should be (100, 100)

Now you have a super simple Client to work with. Sooo easy :D
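For intuition about what the server does to that array, here is a plain-NumPy sketch of a nearest-neighbour downsample from (200, 200) to (100, 100). This is an illustration only, not MLChain code, and it approximates rather than reproduces cv2.resize's interpolation:

```python
import numpy as np

# Illustration only: shrink a (200, 200) array to (100, 100) by
# nearest-neighbour sampling -- a rough stand-in for cv2.resize.
image = np.ones((200, 200), dtype=np.uint8)
rows = np.linspace(0, 199, 100).round().astype(int)
cols = np.linspace(0, 199, 100).round().astype(int)
small = image[np.ix_(rows, cols)]
print(small.shape)  # (100, 100)
```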

Asking for help

Welcome to the MLChain community!

If you have any questions, please feel free to:

  1. Read the docs
  2. Open an issue

We are happy to help.

