
Inference service package for IAPARC

Project description

iaparc_inference


The IAParc inference plugin allows developers to easily integrate their inference pipeline into the IAParc production module.

Installation

pip install iaparc-inference

Usage

  • If your inference pipeline supports batching:

    from iaparc_inference import IAPListener
    
    # Define a callback to query your inference pipeline.
    # To load your model only once, it is recommended to wrap it in a class:
    class MyModel:
        def __init__(self, model_path: str):
            # Load your model with PyTorch, TensorFlow, or any other backend
            ...
    
        def batch_query(self, batch: list) -> list:
            # Run your pipeline on a batch of inputs and
            # return the results in the same order
            ...
    
    if __name__ == '__main__':
        # Instantiate your model class
        my_model = MyModel("path/to/my/model")
    
        # Instantiate the IAParc listener
        listener = IAPListener(my_model.batch_query)
        # Start the listener
        listener.run()
    
  • If your inference pipeline does not support batching:

    from iaparc_inference import IAPListener
    
    # Define a callback to query your inference pipeline.
    # To load your model only once, it is recommended to wrap it in a class:
    class MyModel:
        def __init__(self, model_path: str):
            # Load your model with PyTorch, TensorFlow, or any other backend
            ...
    
        def single_query(self, one_input):
            # Run your pipeline on a single input
            ...
    
    if __name__ == '__main__':
        # Instantiate your model class
        my_model = MyModel("path/to/my/model")
    
        # Instantiate the IAParc listener with batching disabled
        listener = IAPListener(my_model.single_query, batch=1)
        # Start the listener
        listener.run()
    

Features

  • Dynamic batching
  • Autoscaling
  • Supports both synchronous and asynchronous queries
  • Data agnostic
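"Dynamic batching" here presumably means grouping pending requests into a batch, up to a size limit, before invoking your callback once for the whole group. A rough, library-independent sketch of that idea (the queue, function names, and limits are illustrative, not IAParc internals):

```python
import queue


def drain_batch(requests: "queue.Queue", max_batch: int) -> list:
    """Pull up to max_batch pending items without blocking."""
    batch = []
    while len(batch) < max_batch:
        try:
            batch.append(requests.get_nowait())
        except queue.Empty:
            break  # no more pending requests: process what we have
    return batch


def serve_once(requests: "queue.Queue", callback, max_batch: int = 8) -> list:
    """Run the callback once on whatever is currently queued."""
    batch = drain_batch(requests, max_batch)
    return callback(batch) if batch else []
```

A production listener would run a loop like this continuously (typically with a short timeout to wait for stragglers), so low traffic yields small batches and bursts are processed efficiently in larger ones.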

License

This project is licensed under the Apache License, Version 2.0 - see the LICENSE file for details.



Download files

Download the file for your platform.

Source Distribution

iaparc_inference-0.0.2.tar.gz (11.6 kB)

Uploaded: Source

Built Distribution


iaparc_inference-0.0.2-py3-none-any.whl (212.8 kB)

Uploaded: Python 3

File details

Details for the file iaparc_inference-0.0.2.tar.gz.

File metadata

  • Download URL: iaparc_inference-0.0.2.tar.gz
  • Upload date:
  • Size: 11.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.13

File hashes

Hashes for iaparc_inference-0.0.2.tar.gz
  • SHA256: 81919b105fd25014d92eb2523029a099e81c9078d47b9f6337313e5977a352cf
  • MD5: 1bc006a74573547476dd60ff524d8101
  • BLAKE2b-256: 75b2eae333db0871128e78f40efb8b7845c4659e4f23ac70da6094348cd780fb
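To verify a downloaded archive against the SHA256 digest above, you can hash it locally; here is a small sketch using Python's standard hashlib (the file path is whatever you downloaded):

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Stream the file in chunks so large archives need not fit in RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the published digest, e.g.:
# sha256_of_file("iaparc_inference-0.0.2.tar.gz")
```

If the hex string matches the SHA256 value listed for the file, the download is intact.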


File details

Details for the file iaparc_inference-0.0.2-py3-none-any.whl.


File hashes

Hashes for iaparc_inference-0.0.2-py3-none-any.whl
  • SHA256: 27bb57c8cff8e030f49321e4184a2f870ec6780fc570764da8af961aab06b97a
  • MD5: 3f198feb3839f94352481ddede97424b
  • BLAKE2b-256: c7b80536ded9538351d90fdd63cc8e568ec22be34c48e6fa7d4c6f610814b793

