CLIP with ONNX Runtime and without PyTorch dependencies.
Project description
onnx_clip
An ONNX-based implementation of CLIP that doesn't depend on torch or torchvision.
It also has a friendlier API than the original implementation.
This works by
- running the text and vision encoders (the ViT-B/32 variant) in ONNX Runtime
- using a pure NumPy version of the tokenizer
- using a pure NumPy+PIL version of the preprocess function.
The PIL dependency could also be removed with minimal code changes - see preprocessor.py.
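For a concrete picture of the first point, the sketch below shows what running an exported CLIP image encoder in ONNX Runtime can look like with nothing but NumPy arrays as input. The model file name, input shape, and variable names are illustrative assumptions, not the exact ones used inside onnx_clip:

```python
# Illustrative sketch only: the .onnx file name and the single-output assumption
# are placeholders, not onnx_clip internals.
import numpy as np
import onnxruntime as ort

# Load an exported CLIP ViT-B/32 image encoder (hypothetical file name).
session = ort.InferenceSession("clip_image_encoder_vitb32.onnx")

# A preprocessed batch of one RGB image at 224x224 as float32 - the kind of
# array a NumPy+PIL preprocessor produces instead of a torch tensor.
pixels = np.zeros((1, 3, 224, 224), dtype=np.float32)

# Feed the array to the graph and read back the image embeddings as NumPy.
input_name = session.get_inputs()[0].name
embeddings = session.run(None, {input_name: pixels})[0]
print(embeddings.shape)
```

In onnx_clip this plumbing is wrapped by the OnnxClip class, so in practice you only use the API shown in the Usage section below.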
Installation
To install, run the following in the root of the repository:
pip install .
Usage
All you need to do is call the OnnxClip model class. An example:
from onnx_clip import OnnxClip, softmax, get_similarity_scores
from PIL import Image
images = [Image.open("onnx_clip/data/franz-kafka.jpg").convert("RGB")]
texts = ["a photo of a man", "a photo of a woman"]
# Your images/texts will get split into batches of this size before being
# passed to CLIP, to limit memory usage
onnx_model = OnnxClip(batch_size=16)
# Unlike the original CLIP, there is no need to run tokenization/preprocessing
# separately - simply run get_image_embeddings directly on PIL images/NumPy
# arrays, and run get_text_embeddings directly on strings.
image_embeddings = onnx_model.get_image_embeddings(images)
text_embeddings = onnx_model.get_text_embeddings(texts)
# To use the embeddings for zero-shot classification, you can use these two
# functions. Here we run on a single image, but any number is supported.
logits = get_similarity_scores(image_embeddings, text_embeddings)
probabilities = softmax(logits)
print("Logits:", logits)
for text, p in zip(texts, probabilities[0]):
print(f"Probability that the image is '{text}': {p:.3f}")
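To go from probabilities to a zero-shot classification decision, take the highest-scoring text for each image. This is a minimal continuation of the example above and assumes probabilities is the NumPy array returned by softmax:

```python
import numpy as np

# probabilities has shape (num_images, num_texts); argmax picks the best label.
best_text_indices = np.argmax(probabilities, axis=1)
for image_index, text_index in enumerate(best_text_indices):
    print(f"Image {image_index} best matches: '{texts[text_index]}'")
```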
Building & developing from source
Note: The following may give timeout errors due to the file sizes. If so, this can be fixed with poetry version 1.1.13 - see this related issue.
Install, run, build and publish with Poetry
Install Poetry
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
To set up the project and create a virtual environment, run the following command from the project's root directory.
poetry install
To build a source and wheel distribution of the library run the following command from the project's root directory.
poetry build
Publishing a new version to PyPI (for project maintainers)
First, remove/move the downloaded LFS files, so that they're not packaged with the code.
Otherwise, this creates a huge .whl file that PyPI refuses, which causes confusing errors.
Then, follow this guide.
tl;dr: go to the PyPI account page, generate an API token, and put it into the $PYPI_PASSWORD environment variable. Then run
poetry publish --build --username lakera --password $PYPI_PASSWORD
Help
Please let us know how we can support you: earlyaccess@lakera.ai.
LICENSE
See the LICENSE file in this repository.
The franz-kafka.jpg image is taken from here.
File details
Details for the file onnx_clip-4.0.1.tar.gz.
File metadata
- Download URL: onnx_clip-4.0.1.tar.gz
- Upload date:
- Size: 1.9 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.6.1 CPython/3.10.12 Darwin/23.0.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 29277b0b1fbc19b7a41a7e6a91dc80fb3e0671300aa1cc10c9d54e5a4255f2d4
MD5 | f281e47c610307d27d1e9a74b33dade1
BLAKE2b-256 | f1c0a7a390be7e8a5cb99fe880aa1a6a164617f48d506f74375a77d468b2f4a5
File details
Details for the file onnx_clip-4.0.1-py3-none-any.whl.
File metadata
- Download URL: onnx_clip-4.0.1-py3-none-any.whl
- Upload date:
- Size: 1.8 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.6.1 CPython/3.10.12 Darwin/23.0.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | f20ab1267f2568b1685643e42f750c59721f1bf1d3ba49deae8f7cf3070cf40e
MD5 | ff2af99c169c44f8f60c0c3eabe64239
BLAKE2b-256 | 9f8533c883e7c1c58dc57aafebdddde1c093fd8f40858f90aef02651813a582f