
Put your model into **a bottle** and you get a working server and more.

Project description

abottle

Triton/TensorRT/ONNX Runtime/PyTorch Python server wrapper


Demo

import numpy as np
from transformers import AutoTokenizer


class MiniLM:
    def __init__(self):
        self.tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

    def predict(self, X):
        encode_dict = self.tokenizer(
            X, padding="max_length", max_length=128, truncation=True
        )
        input_ids = np.array(encode_dict["input_ids"], dtype=np.int32)
        attention_mask = np.array(encode_dict["attention_mask"], dtype=np.int32)

        # self.model is injected by abottle: a backend client wrapper whose
        # infer() takes a dict of named inputs and a list of output names
        outputs = self.model.infer(
            {"input_ids": input_ids, "attention_mask": attention_mask}, ["y"]
        )

        return outputs["y"]


    # you can write the config in the class, or provide it as a YAML file or YAML string
    class Config:
        class model:
            name = "minilm"
            version = "2"
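The wiring above can be sketched without a running backend. In this sketch, `FakeModel` is a hypothetical stand-in for the Triton client wrapper that abottle attaches to your instance as `self.model` (the attribute name and the `infer(inputs, output_names)` signature are taken from the demo; everything else is illustrative):

```python
import numpy as np


class FakeModel:
    """Stand-in for the backend client wrapper abottle injects as self.model."""

    def infer(self, inputs, output_names):
        # echo back one zero-filled embedding per batch row, per output name
        batch = np.asarray(inputs["input_ids"])
        return {
            name: np.zeros((batch.shape[0], 384), dtype=np.float32)
            for name in output_names
        }


class Echo:
    def predict(self, X):
        out = self.model.infer({"input_ids": np.array(X, dtype=np.int32)}, ["y"])
        return out["y"]


bottle = Echo()
bottle.model = FakeModel()  # abottle would do this wiring for you
y = bottle.predict([[1, 2, 3], [4, 5, 6]])
assert y.shape == (2, 384)
```

This also makes the class unit-testable on its own: swap in any object with a matching `infer` method.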

You can write a class like this, then start it with abottle:

abottle main.MiniLM

config with shell

abottle main.MiniLM file_path=test_data.txt batch_size=100 --as tester --config """TritonModel:
    triton_url: localhost
    name: minilm
    version: 2
"""

config with file

abottle main.MiniLM file_path=test_data.txt batch_size=100 --as tester --config <config yaml file path>
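The file's contents would mirror the inline `--config` string above; a minimal sketch (the filename `config.yaml` is hypothetical, the keys are copied from the shell example):

```yaml
TritonModel:
  triton_url: localhost
  name: minilm
  version: 2
```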

You get an HTTP server running at localhost:8081 with a POST endpoint /infer, where your predict function is called. X is the JSON-decoded request body. self.model in your class is a Triton client wrapper with an infer function that takes a dictionary of named inputs and a list of output names.
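A minimal sketch of that decoding step, assuming (as described above) that the raw POST body is JSON and becomes the X argument of predict:

```python
import json

# hypothetical JSON body a client might POST to localhost:8081/infer
body = '["put your model into a bottle", "and you get a server"]'

# abottle JSON-decodes the body; the result is the X passed to predict(X)
X = json.loads(body)
assert X == ["put your model into a bottle", "and you get a server"]
```

So a client could exercise the endpoint with something like `curl -X POST localhost:8081/infer -d "$body"` (exact headers and response shape are not specified here).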

This code is rough; use it carefully.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

abottle-0.0.3.tar.gz (5.6 kB view details)

Uploaded Source

File details

Details for the file abottle-0.0.3.tar.gz.

File metadata

  • Download URL: abottle-0.0.3.tar.gz
  • Upload date:
  • Size: 5.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.63.0 importlib-metadata/4.8.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for abottle-0.0.3.tar.gz
  • SHA256: 28b3ba0c3ffc5c8e868777f491bda92ddd51802a1ee5788e7e5ef799850d1873
  • MD5: 66879f8114b97754a05c6491607080f7
  • BLAKE2b-256: 9e4cb8d4da4c0926cabec123d95cac430f272e0be9bf2efd1d576aededdfedac

See more details on using hashes here.
