
A unified interface to run inference on machine learning libraries.


Why x.infer?

If you'd like to run many models from different libraries without having to rewrite your inference code, x.infer is for you. It has a simple API and is easy to extend. Currently supports Transformers, Ultralytics, and TIMM.

Have a custom model? Create a class that implements the BaseModel interface and register it with x.infer. See Adding New Models for more details.

Key Features

  • Unified Interface: Interact with different machine learning models through a single, consistent API.
  • Modular Design: Integrate and swap out models without altering the core framework.
  • Ease of Use: Simplifies model loading, input preprocessing, inference execution, and output postprocessing.
  • Extensibility: Add support for new models and libraries with minimal code changes.

Quickstart

Here's a quick example demonstrating how to use x.infer with a Transformers model:

import xinfer

model = xinfer.create_model("vikhyatk/moondream2")

image = "https://raw.githubusercontent.com/vikhyat/moondream/main/assets/demo-1.jpg"
prompt = "Describe this image. "

model.infer(image, prompt)

>>> An animated character with long hair and a serious expression is eating a large burger at a table, with other characters in the background.
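Batched inference uses the same interface (every model implements infer_batch; see Adding New Models below). Here is a minimal sketch, assuming infer_batch takes parallel lists of images and prompts and returns one result per pair:

import xinfer

model = xinfer.create_model("vikhyatk/moondream2")

# Parallel lists of inputs; the demo image from the example above is reused here
images = [
    "https://raw.githubusercontent.com/vikhyat/moondream/main/assets/demo-1.jpg",
    "https://raw.githubusercontent.com/vikhyat/moondream/main/assets/demo-1.jpg",
]
prompts = [
    "Describe this image.",
    "What is the character in this image eating?",
]

# Assumes infer_batch returns one result per (image, prompt) pair
results = model.infer_batch(images, prompts)
for result in results:
    print(result)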

Supported Libraries

  • Hugging Face Transformers: Vision-language models such as BLIP2 and Moondream2 for image-text-to-text tasks like captioning and visual question answering.
  • TIMM: Image classification models such as the EVA02 series.
  • Ultralytics: State-of-the-art real-time object detection models such as YOLOv8, YOLOv10, and YOLOv11.
  • Custom Models: Support for your own machine learning models and architectures.

Prerequisites

Install PyTorch.

Installation

Install x.infer using pip:

pip install xinfer

With specific libraries:

pip install "xinfer[transformers]"
pip install "xinfer[ultralytics]"
pip install "xinfer[timm]"

Install all optional dependencies:

pip install "xinfer[all]"

Or install from a local clone of the repository in editable mode:

pip install -e .

With specific libraries (local installation):

pip install -e ".[transformers]"
pip install -e ".[ultralytics]"
pip install -e ".[timm]"

Install all optional dependencies (local installation):

pip install -e ".[all]"

See example.ipynb for more examples.

Usage

Supported Models

Transformers:

  • BLIP2 Series
model = xinfer.create_model("Salesforce/blip2-opt-2.7b")
  • Moondream2
model = xinfer.create_model("vikhyatk/moondream2")

[!NOTE] Wish to load an unlisted model? You can load any Vision2Seq model from Transformers by using the Vision2SeqModel class.

from xinfer.transformers import Vision2SeqModel

model = Vision2SeqModel("facebook/chameleon-7b")
model = xinfer.create_model(model)

TIMM:

  • EVA02 Series
model = xinfer.create_model("eva02_small_patch14_336.mim_in22k_ft_in1k")

[!NOTE] Wish to load an unlisted model? You can load any model from TIMM by using the TimmModel class.

from xinfer.timm import TimmModel

model = TimmModel("resnet18")
model = xinfer.create_model(model)

Ultralytics:

  • YOLOv8 Series
model = xinfer.create_model("yolov8n")
  • YOLOv10 Series
model = xinfer.create_model("yolov10x")
  • YOLOv11 Series
model = xinfer.create_model("yolov11s")

[!NOTE] Wish to load an unlisted model? You can load any model from Ultralytics by using the UltralyticsModel class.

from xinfer.ultralytics import UltralyticsModel

model = UltralyticsModel("yolov5n6u")
model = xinfer.create_model(model)

List Models

import xinfer

xinfer.list_models()
Available Models
Implementation  Model ID                                          Input --> Output
timm            eva02_large_patch14_448.mim_m38m_ft_in22k_in1k    image --> class
timm            eva02_large_patch14_448.mim_m38m_ft_in1k          image --> class
timm            eva02_large_patch14_448.mim_in22k_ft_in22k_in1k   image --> class
timm            eva02_large_patch14_448.mim_in22k_ft_in1k         image --> class
timm            eva02_base_patch14_448.mim_in22k_ft_in22k_in1k    image --> class
timm            eva02_base_patch14_448.mim_in22k_ft_in1k          image --> class
timm            eva02_small_patch14_336.mim_in22k_ft_in1k         image --> class
timm            eva02_tiny_patch14_336.mim_in22k_ft_in1k          image --> class
transformers    Salesforce/blip2-opt-6.7b-coco                    image-text --> text
transformers    Salesforce/blip2-flan-t5-xxl                      image-text --> text
transformers    Salesforce/blip2-opt-6.7b                         image-text --> text
transformers    Salesforce/blip2-opt-2.7b                         image-text --> text
transformers    vikhyatk/moondream2                               image-text --> text
ultralytics     yolov8x                                           image --> objects
ultralytics     yolov8m                                           image --> objects
ultralytics     yolov8l                                           image --> objects
ultralytics     yolov8s                                           image --> objects
ultralytics     yolov8n                                           image --> objects
ultralytics     yolov10x                                          image --> objects
ultralytics     yolov10m                                          image --> objects
...

Adding New Models

  • Step 1: Create a new model class that implements the BaseModel interface.

  • Step 2: Implement the required abstract methods load_model, infer, and infer_batch.

  • Step 3: Decorate your class with the register_model decorator, specifying the model ID, implementation, and input/output.

For example:

import xinfer
from xinfer import BaseModel, ModelInputOutput  # exact import paths may vary by version

@xinfer.register_model("my-model", "custom", ModelInputOutput.IMAGE_TEXT_TO_TEXT)
class MyModel(BaseModel):
    def load_model(self):
        # Load your model here
        pass

    def infer(self, image, prompt):
        # Run single inference here
        pass

    def infer_batch(self, images, prompts):
        # Run batch inference here
        pass
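
Once registered, the custom model should be usable like any built-in one. A short sketch, assuming create_model resolves the ID passed to register_model (the image path below is a placeholder):

import xinfer

# "my-model" is the ID registered via the decorator above
model = xinfer.create_model("my-model")

image = "path/to/image.jpg"
prompt = "Describe this image."

model.infer(image, prompt)            # single inference
model.infer_batch([image], [prompt])  # batched inference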

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

xinfer-0.0.4.tar.gz (27.0 MB)


Built Distribution

xinfer-0.0.4-py2.py3-none-any.whl (31.5 kB)


File details

Details for the file xinfer-0.0.4.tar.gz.

File metadata

  • Download URL: xinfer-0.0.4.tar.gz
  • Upload date:
  • Size: 27.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for xinfer-0.0.4.tar.gz
Algorithm    Hash digest
SHA256       33096c3faef72881fe96bb652d1af5b6239a5112a6751de96e226dd7f3d092b3
MD5          8ee32dcddcf0f12ccb534c7be0e44578
BLAKE2b-256  1137cd7f3c7cd6bc523be4b6837e3722928bf8fbbe80d5024463fd892d3c57df


File details

Details for the file xinfer-0.0.4-py2.py3-none-any.whl.

File metadata

  • Download URL: xinfer-0.0.4-py2.py3-none-any.whl
  • Upload date:
  • Size: 31.5 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for xinfer-0.0.4-py2.py3-none-any.whl
Algorithm    Hash digest
SHA256       792942ab25f21111b7f5cda0d130a643d1f473d0a275f4bdb578d8b3572fedac
MD5          03f1d1e3bda0596247971fbcc004c678
BLAKE2b-256  0d44a8abcfa1e115f84e0a5deb93c083e24c239ce964bbe9d524b334255fc865

