
Deploying machine learning for Heartex or Label Studio

Project description


Python interface for running an ML backend server and using it for active learning, pre-labeling, and prediction within the Heartex platform.


First, make sure you have a Redis server running (otherwise only prediction is available, not active learning).

Install Heartex SDK:

git clone
cd pyheartex/
pip install -r requirements.txt
pip install -e .

Finally, start the RQ workers in the background:

rq worker default

Using Docker

Here is an example of how to start serving an image classifier:

cd examples/docker
docker-compose up

To serve your own model instead, all you need to change are the loading, inference, and training scripts in this file.

Quick start

This quick start guide shows how to use popular machine learning frameworks within the Heartex platform.


Let's serve scikit-learn model for text classification.

You can simply launch

python examples/

The script looks like this:

from htx.adapters.sklearn import serve

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

if __name__ == "__main__":

    # Create an sklearn-compatible model
    my_model = make_pipeline(TfidfVectorizer(), LogisticRegression())

    # Start serving this model
    serve(my_model)
It starts serving at http://localhost:16118, listening for Heartex events. To connect your model, go to Heartex -> Settings -> Machine learning page and choose "Add custom model".
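Independently of the Heartex server, the underlying scikit-learn pipeline can be sanity-checked locally before connecting it. The toy texts and labels below are made up purely for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny made-up dataset, just to confirm the pipeline trains and predicts
texts = ["great product", "awful service", "really great", "awful, bad"]
labels = ["pos", "neg", "pos", "neg"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Predictions come back as an array of the original string labels
pred = model.predict(["great"])[0]
```

If this runs cleanly, the same pipeline object can be handed to `serve` as shown above.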

Or you can use Heartex API to activate your model:

curl -X POST -H 'Content-Type: application/json' \
-H 'Authorization: Token <PUT-YOUR-TOKEN-HERE>' \
-d '[{"url": "$HOST:$PORT", "name": "my_model", "title": "My model", "description": "My new model deployed on Heartex"}]' \
{project-id}/backends/

where $HOST:$PORT is your server's URL, which must be reachable from the outside.
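The same registration payload can be built from Python before posting it with any HTTP client. The host, token, and project id below are placeholders, mirroring the curl example above:

```python
import json

# Backend descriptor mirroring the curl example; the URL is a placeholder
# and must be replaced with your server's externally reachable address
backend = [{
    "url": "http://my-host:16118",
    "name": "my_model",
    "title": "My model",
    "description": "My new model deployed on Heartex",
}]

payload = json.dumps(backend)
headers = {
    "Content-Type": "application/json",
    "Authorization": "Token <PUT-YOUR-TOKEN-HERE>",
}

# The request itself would then be, e.g. with the `requests` library:
#   requests.post(f"{api_base}/{project_id}/backends/",
#                 data=payload, headers=headers)
```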


You can integrate FastAI models similarly to scikit-learn. Check this example to learn how to plug in an updatable image classifier.

Advanced usage

When you want to go beyond the sklearn-compatible API, you can build your own model by manually implementing the input/output interface conversion. Subclass the Heartex base model as follows:

from htx.base_model import BaseModel

# This class exposes the methods needed to handle the model at runtime
# (loading it into memory, running predictions)
class MyModel(BaseModel):

    def get_input(self, task):
        """Extract the input from a serialized task"""

    def get_output(self, task):
        """Extract the output from a serialized task"""

    def load(self, train_output):
        """Load the model into memory. The `train_output` dict is the output of the `train` method (see below)"""

    def predict(self, tasks):
        """Take a list of tasks, already processed by `get_input`, and return completions in Heartex format"""

# This function handles model retraining
def train(input_tasks, output_model_dir, **kwargs):
    """
    :param input_tasks: list of tasks already processed by `get_input`
    :param output_model_dir: output directory where you can optionally store model resources
    :param kwargs: any additional kwargs taken from `train_kwargs`
    :return: `train_output` dict for subsequent model loading
    """
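To make the interface concrete, here is a self-contained sketch of such a subclass. `BaseModel` below is a minimal stand-in for `htx.base_model.BaseModel`, and the task dictionary layout and keyword-based classifier are hypothetical examples; the real task schema depends on your Heartex project configuration:

```python
class BaseModel:
    """Minimal stand-in for htx.base_model.BaseModel, for illustration only."""


class SentimentModel(BaseModel):

    def get_input(self, task):
        # Hypothetical task layout: the text sits under task["data"]["text"]
        return task["data"]["text"]

    def get_output(self, task):
        # Hypothetical layout: the annotated label sits under task["result"]["label"]
        return task["result"]["label"]

    def load(self, train_output):
        # `train_output` is whatever the `train` function returned
        self.labels = train_output.get("labels", ["neg", "pos"])

    def predict(self, tasks):
        # `tasks` have already been processed by `get_input`, so each item
        # is plain text here; a trivial keyword rule stands in for a model
        completions = []
        for text in tasks:
            label = "pos" if "good" in text.lower() else "neg"
            completions.append({"result": {"label": label}})
        return completions


model = SentimentModel()
model.load({"labels": ["neg", "pos"]})
preds = model.predict(["good service", "terrible"])
```

The same shape carries over to any framework: `get_input`/`get_output` isolate your serialization format, while `load` and `predict` wrap the actual model.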
