
Project description

embetter

"Just a bunch of useful embeddings for scikit-learn pipelines, to get started quickly."


Embetter implements scikit-learn compatible embeddings for computer vision and text. It should make it very easy to quickly build proofs of concept using scikit-learn pipelines and, in particular, should help with bulk labelling. It's also meant to play nice with bulk and scikit-partial, but it can also be used together with your favorite ANN solution, like lancedb.

Install

You can install via pip.

python -m pip install embetter

Many of the embeddings are optional depending on your use-case, so if you only want to download the tools that you need, you can pick from these extras:

python -m pip install "embetter[text]"
python -m pip install "embetter[spacy]"
python -m pip install "embetter[sense2vec]"
python -m pip install "embetter[vision]"
python -m pip install "embetter[all]"

API Design

This is what's currently implemented:

# Helpers to grab text or image from pandas column.
from embetter.grab import ColumnGrabber

# Representations/Helpers for computer vision
from embetter.vision import ImageLoader, TimmEncoder, ColorHistogramEncoder

# Representations for text
from embetter.text import SentenceEncoder, MatryoshkaEncoder, Sense2VecEncoder, spaCyEncoder, TextEncoder

# Representations from multi-modal models
from embetter.multi import ClipEncoder

# Finetuning components 
from embetter.finetune import FeedForwardTuner, ContrastiveTuner, ContrastiveLearner, SbertLearner

# External embedding providers, which typically need an API key
from embetter.external import CohereEncoder, OpenAIEncoder

All of these components are scikit-learn compatible, which means that you can apply them as you would normally in a scikit-learn pipeline. Just be aware that these components are stateless: they won't require training, as these are all pretrained tools.
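
For instance, here's a minimal sketch of using one of the encoders directly, outside of a pipeline. It assumes the text extras are installed so that the `all-MiniLM-L6-v2` model can be downloaded:

from embetter.text import SentenceEncoder

# Stateless component: fit() is a no-op and transform() returns a numpy
# array with one embedding vector per input text.
encoder = SentenceEncoder('all-MiniLM-L6-v2')
embeddings = encoder.fit_transform(["hello there", "general kenobi"])
print(embeddings.shape)  # e.g. (2, 384) for this particular model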

Text Example

To run this example, make sure that you pip install 'embetter[text]'.

import pandas as pd
from sklearn.pipeline import make_pipeline 
from sklearn.linear_model import LogisticRegression

from embetter.grab import ColumnGrabber
from embetter.text import SentenceEncoder

# This pipeline grabs the `text` column from a dataframe
# which then gets fed into Sentence-Transformers' all-MiniLM-L6-v2.
text_emb_pipeline = make_pipeline(
  ColumnGrabber("text"),
  SentenceEncoder('all-MiniLM-L6-v2')
)

# This pipeline can also be trained to make predictions, using
# the embedded features. 
text_clf_pipeline = make_pipeline(
  text_emb_pipeline,
  LogisticRegression()
)

dataf = pd.DataFrame({
  "text": ["positive sentiment", "super negative"],
  "label_col": ["pos", "neg"]
})
X = text_emb_pipeline.fit_transform(dataf, dataf['label_col'])
text_clf_pipeline.fit(dataf, dataf['label_col']).predict(dataf)

Image Example

The goal of the API is to allow pipelines like this:

import pandas as pd
from sklearn.pipeline import make_pipeline 
from sklearn.linear_model import LogisticRegression

from embetter.grab import ColumnGrabber
from embetter.vision import ImageLoader
from embetter.multi import ClipEncoder

# This pipeline grabs the `img_path` column from a dataframe,
# turns the image paths into `PIL.Image` objects
# and then feeds them into CLIP, which can also handle images.
image_emb_pipeline = make_pipeline(
  ColumnGrabber("img_path"),
  ImageLoader(convert="RGB"),
  ClipEncoder()
)

dataf = pd.DataFrame({
  "img_path": ["tests/data/thiscatdoesnotexist.jpeg"]
})
image_emb_pipeline.fit_transform(dataf)

Batched Learning

All of the encoding tools you've seen here are also compatible with the partial_fit mechanic in scikit-learn. That means you can leverage scikit-partial to build pipelines that can handle out-of-core datasets.
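
As a rough sketch of that pattern in plain scikit-learn, you can pair a stateless encoder with any estimator that supports partial_fit, such as SGDClassifier. Note that `batches` below is just a placeholder for however you stream chunks of texts and labels from disk:

import numpy as np
from sklearn.linear_model import SGDClassifier

from embetter.text import SentenceEncoder

encoder = SentenceEncoder('all-MiniLM-L6-v2')
clf = SGDClassifier(loss="log_loss")
classes = np.array(["neg", "pos"])

# `batches` is a placeholder: any iterable of (texts, labels) chunks,
# read lazily so the full dataset never has to fit in memory.
for texts, labels in batches:
    X = encoder.transform(texts)  # stateless, so no fitting required
    clf.partial_fit(X, labels, classes=classes)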

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

embetter-0.8.0.tar.gz (24.0 kB)

Uploaded Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

embetter-0.8.0-py3-none-any.whl (36.2 kB)

Uploaded Python 3

embetter-0.8.0-py2.py3-none-any.whl (39.9 kB)

Uploaded Python 2, Python 3

File details

Details for the file embetter-0.8.0.tar.gz.

File metadata

  • Download URL: embetter-0.8.0.tar.gz
  • Upload date:
  • Size: 24.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.7.13

File hashes

Hashes for embetter-0.8.0.tar.gz

  • SHA256: 9b316f6c162975d570ae9465969f302f0e735973069473971c857053bf32ce1e
  • MD5: 6d74e7eeefdd66153e716a284251bd2d
  • BLAKE2b-256: 291c9ce2ddee2a824c24ce84f5d54fc6ebce7adb9515505dbe1267c8f8373309


File details

Details for the file embetter-0.8.0-py3-none-any.whl.

File metadata

  • Download URL: embetter-0.8.0-py3-none-any.whl
  • Upload date:
  • Size: 36.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.7.13

File hashes

Hashes for embetter-0.8.0-py3-none-any.whl

  • SHA256: 8661030fbe6951d07f7f3c4d95b74c79acfa678d7ffddc3306936981ed99da5c
  • MD5: 12a20990a3b53c8810b38ab354d239f5
  • BLAKE2b-256: 4878fbf4ceb155b8d1786957080681c31b492bec8653e5d261cc6634a2dc3b6e


File details

Details for the file embetter-0.8.0-py2.py3-none-any.whl.

File metadata

  • Download URL: embetter-0.8.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 39.9 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.7.13

File hashes

Hashes for embetter-0.8.0-py2.py3-none-any.whl

  • SHA256: d90ce16d4113e4d769f1d2169be55dd07653ebbc4c7448fa2f52e89ab10f8f13
  • MD5: b1f1d47f93bcdbe196808bf244e69e04
  • BLAKE2b-256: 5d1022fddff34ddac70b5a090108d97163a60d79acbe8e1a6cbfcb84857f3778

