
Turn unstructured data into vectors

Project description

Radient

Radient is a developer-friendly, lightweight library for unstructured data ETL, i.e. turning audio, graphs, images, molecules, text, and other data types into embeddings. Radient supports simple vectorization as well as complex vector-centric workflows.

$ pip install radient

If you find this project helpful or interesting, please consider giving it a star. :star:

Getting started

Basic vectorization can be performed as follows:

from radient import text_vectorizer
vz = text_vectorizer()
vz.vectorize("Hello, world!")
# Vector([-3.21440510e-02, -5.10351397e-02,  3.69579718e-02, ...])

The above snippet vectorizes the string "Hello, world!" using a default model, namely bge-small-en-v1.5 from sentence-transformers. If your Python environment does not contain the sentence-transformers library, Radient will prompt you to install it:

vz = text_vectorizer()
# Vectorizer requires sentence-transformers. Install? [Y/n]

You can type "Y" to have Radient install it for you automatically.
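
If you prefer to skip the interactive prompt, e.g. in CI or a container build, you can install the dependency ahead of time; the package below is the one Radient asks for in the prompt above:

$ pip install sentence-transformers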

Each vectorizer can take a method parameter along with optional keyword arguments which get passed directly to the underlying vectorization library. For example, we can pick Mixedbread AI's mxbai-embed-large-v1 model using the sentence-transformers library via:

vz_mbai = text_vectorizer(method="sentence-transformers", model_name_or_path="mixedbread-ai/mxbai-embed-large-v1")
vz_mbai.vectorize("Hello, world!")
# Vector([ 0.01729078,  0.04468533,  0.00055427, ...])
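
Note that swapping models also changes the embedding dimensionality. The snippet below is a sketch which assumes the returned Vector is NumPy array-compatible (the np.allclose example later on this page suggests it is); the sizes shown are the published output dimensions of each model.

import numpy as np

from radient import text_vectorizer

vz_small = text_vectorizer()  # default: bge-small-en-v1.5
vz_mbai = text_vectorizer(
    method="sentence-transformers",
    model_name_or_path="mixedbread-ai/mxbai-embed-large-v1",
)

# bge-small-en-v1.5 outputs 384-dimensional embeddings, while
# mxbai-embed-large-v1 outputs 1024-dimensional embeddings, so vectors
# from the two models live in different spaces and are not comparable.
np.asarray(vz_small.vectorize("Hello, world!")).shape  # (384,)
np.asarray(vz_mbai.vectorize("Hello, world!")).shape   # (1024,)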

More than just text

With Radient, you're not limited to text. Audio, graphs, images, and molecules can be vectorized as well:

from pathlib import Path

import networkx as nx

from radient import (
    audio_vectorizer,
    graph_vectorizer,
    image_vectorizer,
    molecule_vectorizer,
)

avec = audio_vectorizer().vectorize(str(Path.home() / "audio.wav"))
gvec = graph_vectorizer().vectorize(nx.karate_club_graph())
ivec = image_vectorizer().vectorize(str(Path.home() / "image.jpg"))
mvec = molecule_vectorizer().vectorize("O=C=O")

A partial list of methods and optional kwargs supported by each modality can be found here.
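
As with text, each of these vectorizers accepts a method argument plus engine-specific keyword arguments that are forwarded to the underlying library. The method names in the sketch below are illustrative assumptions, not confirmed identifiers; substitute one from the supported list.

from pathlib import Path

from radient import image_vectorizer, molecule_vectorizer

# Illustrative only: "timm" and "rdkit" are assumed method names here,
# standing in for whichever engines your Radient version supports.
ivz = image_vectorizer(method="timm")
mvz = molecule_vectorizer(method="rdkit")

ivec = ivz.vectorize(str(Path.home() / "image.jpg"))
mvec = mvz.vectorize("O=C=O")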

For production use cases with large quantities of data, performance is key. Radient also provides an accelerate function to optimize vectorizers on-the-fly:

import numpy as np
vz = text_vectorizer()
vec0 = vz.vectorize("Hello, world!")
vz.accelerate()
vec1 = vz.vectorize("Hello, world!")
np.allclose(vec0, vec1)
# True

On a 2.3 GHz Quad-Core Intel Core i7, the original vectorizer returns in ~32ms, while the accelerated vectorizer returns in ~17ms.
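
To reproduce a comparison like this on your own hardware, a simple wall-clock measurement is enough. The helper below is just a sketch; it only uses the vectorize and accelerate calls shown above, and actual numbers will vary by machine and model.

import time

from radient import text_vectorizer

def mean_latency(vz, text, n_runs=20):
    """Average wall-clock latency of vz.vectorize over n_runs calls."""
    start = time.perf_counter()
    for _ in range(n_runs):
        vz.vectorize(text)
    return (time.perf_counter() - start) / n_runs

vz = text_vectorizer()
baseline = mean_latency(vz, "Hello, world!")
vz.accelerate()
accelerated = mean_latency(vz, "Hello, world!")
print(f"baseline: {baseline * 1e3:.1f} ms, accelerated: {accelerated * 1e3:.1f} ms")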

Building unstructured data ETL

Aside from running experiments, pure vectorization is not particularly useful. Mirroring structured data ETL pipelines, unstructured data ETL workloads often require a combination of four components: a data source where unstructured data is stored, one or more transform modules that perform data conversions and pre-processing, a vectorizer which turns the data into semantically rich embeddings, and a sink to persist the vectors once they have been computed.

Radient provides a Workflow object specifically for building vector-centric ETL applications. With Workflows, you can combine any number of each of these components into a directed graph. For example, a workflow to continuously read text documents from Google Drive, vectorize them with Voyage AI, and load the resulting vectors into Milvus might look like:

from radient import make_operator
from radient import Workflow

extract = make_operator("source", method="google-drive", task_params={"folder": "My Files"})
transform = make_operator("transform", method="read-text", task_params={})
vectorize = make_operator("vectorizer", method="voyage-ai", modality="text", task_params={})
load = make_operator("sink", method="milvus", task_params={"operation": "insert"})

wf = (
    Workflow()
    .add(extract, name="extract")
    .add(transform, name="transform")
    .add(vectorize, name="vectorize")
    .add(load, name="load")
)

You can use accelerated vectorizers and transforms in a Workflow by specifying accelerate=True for all supported operators.
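
As a sketch, the vectorizer operator above could be built with acceleration enabled at construction time. This assumes accelerate is passed as a keyword argument to make_operator and that acceleration applies to locally hosted models (the sentence-transformers method here is a stand-in; API-backed vectorizers such as Voyage AI run remotely).

# Assumption: accelerate=True is forwarded through make_operator to the
# underlying vectorizer; check the operator docs for the exact placement.
vectorize_local = make_operator(
    "vectorizer",
    method="sentence-transformers",
    modality="text",
    accelerate=True,
    task_params={},
)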

Supported vectorizer engines

Radient builds atop work from the broader ML community. Most vectorizers are thin wrappers around other open-source libraries, such as sentence-transformers for text embeddings.

On-the-fly model acceleration is done via ONNX.

A massive thank you to all the creators and maintainers of these libraries.

Coming soon™

A couple of features slated for the near-term (hopefully):

  1. Sparse vector, binary vector, and multi-vector support
  2. Support for all relevant embedding models on Hugging Face

LLM connectors will not be a feature that Radient provides. Building context-aware systems around LLMs is a complex task, and not one that Radient intends to solve. Haystack and LlamaIndex are two of the many great options to consider if you're looking to extract maximum RAG performance.

A full write-up on Radient will come later, along with more sample applications, so stay tuned.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

radient-2024.7.31.tar.gz (26.7 kB)

Uploaded Source

Built Distribution

radient-2024.7.31-py3-none-any.whl (41.0 kB)

Uploaded Python 3

File details

Details for the file radient-2024.7.31.tar.gz.

File metadata

  • Download URL: radient-2024.7.31.tar.gz
  • Upload date:
  • Size: 26.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.6

File hashes

Hashes for radient-2024.7.31.tar.gz

  • SHA256: ea4318b49ad8839771c7600e82a0359ad7e94c9b67d12cf9f421332f478edcdd
  • MD5: b0aaf0e4428ae60575abbe587fc721f5
  • BLAKE2b-256: e6d09b7bfe1159d0f31d43062c9367aa761b1d82051001b8ab20672718343772

See more details on using hashes here.

File details

Details for the file radient-2024.7.31-py3-none-any.whl.

File metadata

  • Download URL: radient-2024.7.31-py3-none-any.whl
  • Upload date:
  • Size: 41.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.6

File hashes

Hashes for radient-2024.7.31-py3-none-any.whl

  • SHA256: 119d428d59a058cc8b6feb086450ac06bd404dcf507caf303547ae6ea96cf0f7
  • MD5: 643747e1454054b1af68a05cd5070ce0
  • BLAKE2b-256: d40534b1005a2f3c60e46e73bf7a2dc36ab47ca33dcf296db994af52f2e72e7d

See more details on using hashes here.
