
Real-time inference pipelines

Project description


Just give me the dataz!

Welcome! Leapy is a library for real-time, sub-millisecond inference: it provides customizable machine learning pipeline export for fast model serving. These pipelines are built around Dask's scalable machine learning, which follows the Scikit-Learn API, but you can use the framework directly with Scikit-Learn as well.

Leapy is inspired by MLeap.

Benefits

Dask is a distributed computing library for Python with a burgeoning machine learning component, compatible with Scikit-Learn's API. Using Leapy's framework, we can serve these pipelines in real time!

This means:

  • No JVM: No reliance on the JVM, as there is when serving Spark pipelines.
  • Python: All Python development and custom transformers -- no Scala & Java needed!
  • Scale: Scikit-Learn logic and pipelines scaled to clusters.
  • Fast: You're in control of how fast your transformers are!
  • Simple: Easily build and deploy models with Docker.
  • Reliable: Encourages a test-driven approach.

Examples

  • Simple -- Super simple example of creating, testing, and using custom transformers.
  • XGBoost -- Advanced example of using XGBoost with Dask, saving, and serving the model.

Benchmarks

A simple example of what we're going for -- computing a one-hot encoding, with ~200K labels, of a single data point (dask array x_da and numpy array x = x_da.compute()):

[Benchmark plot: transform times for ohe_dml, ohe, ohe_sk, and ohe_runtime on a single data point]

Here, ohe_dml (from dask_ml) and ohe (from leapy) perform essentially the same; ohe_sk is from scikit-learn, and ohe_runtime comes from ohe.to_runtime(). Calling compute() on the Dask transforms above bumps the time up to about 1 second.

With the time saved by using ohe_runtime, we can squeeze in many more transformations and an estimator and still come in under 1 ms.
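As a rough sketch of how such a comparison might run (the data construction, the fit-on-dask-array call, and the timing loop are illustrative assumptions, not the project's benchmark code):

import dask.array as da
from timeit import timeit

from sklearn.preprocessing import OneHotEncoder as SkOneHotEncoder
from leapy.dask.transformers import OneHotEncoder

# ~200K distinct labels in a single column (sizes are illustrative)
x_da = da.arange(200_000, chunks=50_000)[:, None]
x = x_da.compute()

ohe = OneHotEncoder(sparse=False).fit(x_da)
ohe_runtime = ohe.to_runtime()      # runtime export, as described above
ohe_sk = SkOneHotEncoder().fit(x)

x0 = x[:1]  # a single data point
print(timeit(lambda: ohe_runtime.transform(x0), number=100) / 100)
print(timeit(lambda: ohe_sk.transform(x0), number=100) / 100)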

Example Usage

Start with a dataset in dask arrays, X and y, and create a Scikit-Learn pipeline:

from sklearn.pipeline import Pipeline  # assumed: the text above says "a Scikit-Learn pipeline"
from dask_ml.linear_model import LogisticRegression

from leapy.dask.transformers import OneHotEncoder
from leapy.dask.pipeline import FeaturePipeline

# One-hot encode feature columns 0 and 1, then fit a classifier
pipe = Pipeline([
    ('fp', FeaturePipeline([
        ('ohe', OneHotEncoder(sparse=False), [0, 1]),
    ])),
    ('clf', LogisticRegression()),
])

pipe.fit(X, y)
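Here X and y are assumed to already exist; for a self-contained run, toy dask arrays along these lines would do (shapes, chunk sizes, and values are purely illustrative):

import dask.array as da

# Two integer-coded feature columns and a binary target
X = da.random.randint(0, 10, size=(1000, 2), chunks=(250, 2))
y = da.random.randint(0, 2, size=(1000,), chunks=(250,))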

Then we export to a runtime pipeline and save it:

import pickle

pipe_runtime = pipe.to_runtime()

with open('pipe_runtime.pkl', 'wb') as f:
    pickle.dump(pipe_runtime, f)

This model is ready to be served -- for example, from a Docker container, as in the XGBoost example above.
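In the serving process, the saved pipeline can then be loaded and scored per request -- a minimal sketch, assuming the runtime pipeline keeps the familiar predict interface:

import pickle

import numpy as np

with open('pipe_runtime.pkl', 'rb') as f:
    pipe_runtime = pickle.load(f)

# Score a single incoming data point (values are illustrative)
x = np.array([[3, 7]])
print(pipe_runtime.predict(x))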

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
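Alternatively, the package installs from PyPI with pip (shown pinned to this release):

pip install leapy==0.0.3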

Source Distribution

leapy-0.0.3.tar.gz (13.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

leapy-0.0.3-py3-none-any.whl (20.2 kB)

Uploaded Python 3

File details

Details for the file leapy-0.0.3.tar.gz.

File metadata

  • Download URL: leapy-0.0.3.tar.gz
  • Upload date:
  • Size: 13.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.0.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.2

File hashes

Hashes for leapy-0.0.3.tar.gz
  • SHA256: 0f023f28ba4e546b2e9066a1c2d3ba68fe119b3f2b2854ff41fc44161bc79ac9
  • MD5: 64288db85895bf62b1d1fe611e833617
  • BLAKE2b-256: 0f8c86407433706e84e22eaeb71ec9a60787f75b7ddc7da8f9d023c626126727

See more details on using hashes here.
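For example, the sdist's SHA256 can be checked locally with Python's hashlib against the digest listed above:

import hashlib

with open('leapy-0.0.3.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Should match the SHA256 published on this page
assert digest == '0f023f28ba4e546b2e9066a1c2d3ba68fe119b3f2b2854ff41fc44161bc79ac9'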

File details

Details for the file leapy-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: leapy-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 20.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.0.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.2

File hashes

Hashes for leapy-0.0.3-py3-none-any.whl
  • SHA256: a4a64cec1458298dfd49748ed85e394ae524f808aea79c37da746511d4bcb373
  • MD5: 74077f375afd35cdd335f9d9c513848a
  • BLAKE2b-256: 5288198f3fe2a29c83d7276920fde78aba6aa5471a324e12037bccda8a328fb9

See more details on using hashes here.
