Inference service package for IAPARC
iaparc_inference
The IA Parc inference plugin allows developers to easily integrate their inference pipeline into IA Parc's production module.
Installation

```shell
pip install iaparc-inference
```
Usage
- If your inference pipeline supports batching:

```python
from typing import Optional

from iaparc_inference import IAPListener


# Define a callback to query your inference pipeline.
# To load your model only once, it is recommended to use a class:
class MyModel:
    def __init__(self, model_path: str):
        # Load your model with PyTorch, TensorFlow or any other backend
        ...

    def batch_query(self, batch: list, parameters: Optional[list] = None) -> list:
        '''
        Execute your pipeline on a batch of inputs.

        Note: "parameters" is an optional argument. It can be used to
        handle URL query parameters. It is a list of
        key (string) / value (string) dictionaries, one per input.
        '''
        ...


if __name__ == '__main__':
    # Instantiate your model class
    my_model = MyModel("path/to/my/model")
    # Instantiate the IA Parc listener
    listener = IAPListener(my_model.batch_query)
    # Start the listener
    listener.run()
```
- If your inference pipeline does not support batching:

```python
from typing import Optional

from iaparc_inference import IAPListener


# Define a callback to query your inference pipeline.
# To load your model only once, it is recommended to use a class:
class MyModel:
    def __init__(self, model_path: str):
        # Load your model with PyTorch, TensorFlow or any other backend
        ...

    def single_query(self, one_input, parameters: Optional[dict] = None):
        '''
        Execute your pipeline on a single input.

        Note: "parameters" is an optional argument. It can be used to
        handle URL query parameters. It is a
        key (string) / value (string) dictionary.
        '''
        ...


if __name__ == '__main__':
    # Instantiate your model class
    my_model = MyModel("path/to/my/model")
    # Instantiate the IA Parc listener
    # Note: the batch size is forced to 1 here
    listener = IAPListener(my_model.single_query, batch=1)
    # Start the listener
    listener.run()
```
Features
- Dynamic batching
- Autoscaling
- Supports both synchronous and asynchronous queries
- Data agnostic
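The dynamic-batching contract can be illustrated with a minimal standalone callback (a sketch only; `EchoModel` and the `suffix` parameter are illustrative and not part of the library): `batch_query` receives a list of inputs and must return a list of the same length, with results in the same order, while `parameters` optionally carries one query-parameter dictionary per input.

```python
from typing import Optional


class EchoModel:
    """Illustrative stand-in model showing the batched-callback contract."""

    def batch_query(self, batch: list, parameters: Optional[list] = None) -> list:
        # Return one result per input, in the order the inputs arrived.
        # When provided, "parameters" holds one key/value dict per input,
        # carrying that request's URL query parameters.
        results = []
        for i, item in enumerate(batch):
            params = parameters[i] if parameters else {}
            # "suffix" is a made-up query parameter used only for this demo.
            results.append(f"{item}{params.get('suffix', '')}")
        return results


model = EchoModel()
out = model.batch_query(["a", "b"], [{"suffix": "!"}, {}])
```

A callback of this shape can then be passed to `IAPListener` exactly as in the batching example above.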
License
This project is licensed under the Apache License, Version 2.0 - see the LICENSE file for details.
File details
Details for the file iaparc_inference-0.5.6.tar.gz.
File metadata
- Download URL: iaparc_inference-0.5.6.tar.gz
- Upload date:
- Size: 16.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.15
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0f996950eda436d1070e8292376f82ac3e4f8d98e1c22791da82f884bb046984 |
| MD5 | e69e6b3ffbf218654a621c5b4633c328 |
| BLAKE2b-256 | d60ae895fc1b127fa7ad19bc5a53cfdf9f5e65917b14966cf54c191649049872 |
File details
Details for the file iaparc_inference-0.5.6-py3-none-any.whl.
File metadata
- Download URL: iaparc_inference-0.5.6-py3-none-any.whl
- Upload date:
- Size: 18.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.15
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e653bf63147e43774592e1aa3736e5c28f18cfcd23f0098eef1932ca705a6438 |
| MD5 | 6ad224bf8f50f88a7bd427b9c4333404 |
| BLAKE2b-256 | 70e05847c7cd1c950791ffe79da88e1a0f627c6d019e09c66e951593ec55cda0 |