
modelib

A minimalist framework for online deployment of sklearn-like models


Installation

pip install modelib

Usage

The modelib package provides a simple interface for deploying and serving models online. It is designed to be used with the fastapi package and supports serving models compatible with the sklearn package.

First, you will need to create a model that is compatible with the sklearn package. For example, let's create a simple RandomForestClassifier model with a StandardScaler preprocessor:

from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

MODEL = Pipeline(
    [
        ("scaler", StandardScaler()),
        ("clf", RandomForestClassifier(random_state=42)),
    ]
).set_output(transform="pandas")

Next, you need a request model that describes the input features. Assuming the training dataset has the following columns, you can define it as a list of feature specifications:

request_model = [
    {"name": "sepal length (cm)", "dtype": "float64"},
    {"name": "sepal width (cm)", "dtype": "float64"},
    {"name": "petal length (cm)", "dtype": "float64"},
    {"name": "petal width (cm)", "dtype": "float64"},
]

Alternatively, you can use a pydantic model to define the request model, where the alias field maps each attribute name to the corresponding column name in the training dataset:

import pydantic

class InputData(pydantic.BaseModel):
    sepal_length: float = pydantic.Field(alias="sepal length (cm)")
    sepal_width: float = pydantic.Field(alias="sepal width (cm)")
    petal_length: float = pydantic.Field(alias="petal length (cm)")
    petal_width: float = pydantic.Field(alias="petal width (cm)")

request_model = InputData
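To sanity-check the alias mapping (a standalone illustration, not part of modelib; assumes pydantic v2), you can instantiate the model directly from a payload keyed by column names:

```python
import pydantic

class InputData(pydantic.BaseModel):
    sepal_length: float = pydantic.Field(alias="sepal length (cm)")
    sepal_width: float = pydantic.Field(alias="sepal width (cm)")
    petal_length: float = pydantic.Field(alias="petal length (cm)")
    petal_width: float = pydantic.Field(alias="petal width (cm)")

# Instantiate using the aliased (column) names, as an API payload would
sample = InputData(**{
    "sepal length (cm)": 5.1,
    "sepal width (cm)": 3.5,
    "petal length (cm)": 1.4,
    "petal width (cm)": 0.2,
})

# Dump with by_alias=True to recover the original column names
print(sample.model_dump(by_alias=True))
```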

After the model is created and trained, you can create a modelib runner for this model as follows:

import modelib as ml

simple_runner = ml.SklearnRunner(
    name="my simple model",
    predictor=MODEL,
    method_name="predict",
    request_model=request_model,
)

Another option is to use the SklearnPipelineRunner class, which lets you capture the output of each step in the pipeline:

pipeline_runner = ml.SklearnPipelineRunner(
    "Pipeline Model",
    predictor=MODEL,
    method_names=["transform", "predict"],
    request_model=request_model,
)

Now you can create a FastAPI app with the runners:

app = ml.init_app(runners=[simple_runner, pipeline_runner])

You can also pass an existing FastAPI app to the init_app function:

import fastapi

app = fastapi.FastAPI()

app = ml.init_app(app=app, runners=[simple_runner, pipeline_runner])

The init_app function will add the necessary routes to the FastAPI app to serve the models. You can now start the app with:

uvicorn <replace-with-the-script-filename>:app --reload

After the app is running, you can inspect the generated routes in the Swagger UI at the /docs endpoint.


The created routes expect a JSON payload whose keys are the feature names and whose values are the model inputs. For example, to get a prediction from the simple model runner, send a POST request to the /my-simple-model endpoint with the following payload:

{
  "sepal length (cm)": 5.1,
  "sepal width (cm)": 3.5,
  "petal length (cm)": 1.4,
  "petal width (cm)": 0.2
}
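Conceptually, such a payload corresponds to a single-row DataFrame whose columns line up with the training data. The following is a sketch of that idea, not modelib's actual implementation:

```python
import pandas as pd

payload = {
    "sepal length (cm)": 5.1,
    "sepal width (cm)": 3.5,
    "petal length (cm)": 1.4,
    "petal width (cm)": 0.2,
}

# One request becomes one row; column names must match the training data
row = pd.DataFrame([payload])
print(row.shape)  # (1, 4)
```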

The response is a JSON object containing the prediction:

{
  "result": 0
}

Contributing

If you want to contribute to the project, please read the CONTRIBUTING.md file for more information.
