A unified interface to run inference on machine learning libraries.

Project description


🤔 Why x.infer?

If you'd like to run many models from different libraries without having to rewrite your inference code, x.infer is for you. It has a simple API and is easy to extend.

Models supported:

Transformers, TIMM, Ultralytics, vLLM

Run any supported model using the following 4 lines of code:

import xinfer

model = xinfer.create_model("vikhyatk/moondream2")
model.infer(image, prompt)         # Run single inference
model.infer_batch(images, prompts) # Run batch inference
model.launch_gradio()              # Launch Gradio interface

Have a custom model? Create a class that implements the BaseModel interface and register it with x.infer. See Adding New Models for more details.
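The BaseModel interface boils down to three methods: load_model, infer, and infer_batch. As a rough, hypothetical sketch of what that contract looks like (a stand-in class with no real model weights, not x.infer source code):

```python
# Hypothetical stand-in, not x.infer source: a class shaped like the
# BaseModel contract, where infer_batch naively falls back to infer().
class EchoCaptioner:
    def load_model(self):
        self.ready = True  # a real model would load weights here

    def infer(self, image, prompt):
        # a real model would run the network; we echo the inputs instead
        return f"{prompt} -> caption of {image}"

    def infer_batch(self, images, prompts):
        # simplest possible batching: one infer() call per item
        return [self.infer(img, p) for img, p in zip(images, prompts)]

model = EchoCaptioner()
model.load_model()
print(model.infer_batch(["a.jpg", "b.jpg"], ["Describe.", "Describe."]))
```

A real subclass would replace the echoed strings with actual preprocessing and a forward pass, and could override infer_batch with true batched execution for speed.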

🌟 Key Features

x.infer
  • Unified Interface: Interact with different machine learning models through a single, consistent API.
  • Modular Design: Integrate and swap out models without altering the core framework.
  • Ease of Use: Simplifies model loading, input preprocessing, inference execution, and output postprocessing.
  • Extensibility: Add support for new models and libraries with minimal code changes.

🚀 Quickstart

Here's a quick example demonstrating how to use x.infer with a Transformers model:


import xinfer

model = xinfer.create_model("vikhyatk/moondream2")

image = "https://raw.githubusercontent.com/vikhyat/moondream/main/assets/demo-1.jpg"
prompt = "Describe this image."

model.infer(image, prompt)

>>> An animated character with long hair and a serious expression is eating a large burger at a table, with other characters in the background.

Get a list of models:

xinfer.list_models()
       Available Models                                      
┏━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━┓
┃ Implementation ┃ Model ID                                        ┃ Input --> Output     ┃
┡━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━┩
│ timm           │ eva02_large_patch14_448.mim_m38m_ft_in22k_in1k  │ image --> categories │
│ timm           │ eva02_large_patch14_448.mim_m38m_ft_in1k        │ image --> categories │
│ timm           │ eva02_large_patch14_448.mim_in22k_ft_in22k_in1k │ image --> categories │
│ timm           │ eva02_large_patch14_448.mim_in22k_ft_in1k       │ image --> categories │
│ timm           │ eva02_base_patch14_448.mim_in22k_ft_in22k_in1k  │ image --> categories │
│ timm           │ eva02_base_patch14_448.mim_in22k_ft_in1k        │ image --> categories │
│ timm           │ eva02_small_patch14_336.mim_in22k_ft_in1k       │ image --> categories │
│ timm           │ eva02_tiny_patch14_336.mim_in22k_ft_in1k        │ image --> categories │
│ transformers   │ Salesforce/blip2-opt-6.7b-coco                  │ image-text --> text  │
│ transformers   │ Salesforce/blip2-flan-t5-xxl                    │ image-text --> text  │
│ transformers   │ Salesforce/blip2-opt-6.7b                       │ image-text --> text  │
│ transformers   │ Salesforce/blip2-opt-2.7b                       │ image-text --> text  │
│ transformers   │ fancyfeast/llama-joycaption-alpha-two-hf-llava  │ image-text --> text  │
│ transformers   │ vikhyatk/moondream2                             │ image-text --> text  │
│ transformers   │ sashakunitsyn/vlrm-blip2-opt-2.7b               │ image-text --> text  │
│ ultralytics    │ yolov8x                                         │ image --> boxes      │
│ ultralytics    │ yolov8m                                         │ image --> boxes      │
│ ultralytics    │ yolov8l                                         │ image --> boxes      │
│ ultralytics    │ yolov8s                                         │ image --> boxes      │
│ ultralytics    │ yolov8n                                         │ image --> boxes      │
│ ...            │ ...                                             │ ...                  │
│ ...            │ ...                                             │ ...                  │
└────────────────┴─────────────────────────────────────────────────┴──────────────────────┘

🖥️ Launch Gradio Interface

model.launch_gradio()

Gradio Interface

📦 Installation

[!IMPORTANT] You must have PyTorch installed to use x.infer.

To install the barebones x.infer (without any optional dependencies), run:

pip install xinfer

x.infer can be used with multiple optional libraries. You'll just need to install one or more of the following:

pip install "xinfer[transformers]"
pip install "xinfer[ultralytics]"
pip install "xinfer[timm]"

To install all libraries, run:

pip install "xinfer[all]"

To install from a local directory, run:

git clone https://github.com/dnth/x.infer.git
cd x.infer
pip install -e .

🛠️ Usage

Supported Models

Transformers
Model          Usage
BLIP2 Series   xinfer.create_model("Salesforce/blip2-opt-2.7b")
Moondream2     xinfer.create_model("vikhyatk/moondream2")
VLRM-BLIP2     xinfer.create_model("sashakunitsyn/vlrm-blip2-opt-2.7b")
JoyCaption     xinfer.create_model("fancyfeast/llama-joycaption-alpha-two-hf-llava")

You can also load any Vision2Seq model from Transformers by using the Vision2SeqModel class.

from xinfer.transformers import Vision2SeqModel

model = Vision2SeqModel("facebook/chameleon-7b")
model = xinfer.create_model(model)

TIMM

All models from TIMM fine-tuned for ImageNet 1k are supported.

For example, to load a resnet18.a1_in1k model:

xinfer.create_model("resnet18.a1_in1k")

You can also load any TIMM model (including a custom one) by using the TimmModel class.

from xinfer.timm import TimmModel

model = TimmModel("resnet18")
model = xinfer.create_model(model)

Ultralytics
Model            Usage
YOLOv8 Series    xinfer.create_model("yolov8n")
YOLOv10 Series   xinfer.create_model("yolov10x")
YOLOv11 Series   xinfer.create_model("yolov11s")

You can also load any model from Ultralytics by using the UltralyticsModel class.

from xinfer.ultralytics import UltralyticsModel

model = UltralyticsModel("yolov5n6u")
model = xinfer.create_model(model)

vLLM
Model        Usage
Molmo-72B    xinfer.create_model("allenai/Molmo-72B-0924")
Molmo-7B-D   xinfer.create_model("allenai/Molmo-7B-D-0924")
Molmo-7B-O   xinfer.create_model("allenai/Molmo-7B-O-0924")

🔧 Adding New Models

  • Step 1: Create a new model class that implements the BaseModel interface.

  • Step 2: Implement the required abstract methods load_model, infer, and infer_batch.

  • Step 3: Decorate your class with the register_model decorator, specifying the model ID, implementation, and input/output.

For example:

@xinfer.register_model("my-model", "custom", ModelInputOutput.IMAGE_TEXT_TO_TEXT)
class MyModel(BaseModel):
    def load_model(self):
        # Load your model here
        pass

    def infer(self, image, prompt):
        # Run single inference 
        pass

    def infer_batch(self, images, prompts):
        # Run batch inference here
        pass


Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

xinfer-0.0.10.tar.gz (33.4 MB)


Built Distribution

xinfer-0.0.10-py2.py3-none-any.whl (37.3 kB)


File details

Details for the file xinfer-0.0.10.tar.gz.

File metadata

  • Download URL: xinfer-0.0.10.tar.gz
  • Size: 33.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for xinfer-0.0.10.tar.gz
Algorithm Hash digest
SHA256 1c635b9dedcec0ffd0ea53adaa4f13c76a63f65fc1ae87be63612bc24cfff365
MD5 e2dc5783b53c1d96e54692c6ec88c459
BLAKE2b-256 56f10f895c9c7853c5b0b0600e089015a3f6ac94dc1634d8750963f6d9af7594


File details

Details for the file xinfer-0.0.10-py2.py3-none-any.whl.

File metadata

  • Download URL: xinfer-0.0.10-py2.py3-none-any.whl
  • Size: 37.3 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for xinfer-0.0.10-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 f691d881cb105790768f4f17e0f503b8bae9a993b1399f28c5902e27c082fcef
MD5 af304a299bb16ce3fa2a7768b23eac95
BLAKE2b-256 acbc8b2b5e5228d57c12f1d2015706677773e5ae93fda61b0f061011a8740784

