Fast and Lightweight Text Embedding
LightEmbed
`light-embed-awslambda` is a lightweight, fast, and efficient tool designed to generate sentence embeddings in AWS Lambda functions. Unlike tools that rely on heavy dependencies such as PyTorch or Transformers, it is optimized for environments with limited resources, making it ideal for serverless applications.
Benefits
1. Lightweight
- Minimal Dependencies: LightEmbed does not depend on PyTorch or Transformers.
- Low Resource Requirements: Operates smoothly with minimal specs: 1GB RAM, 1 CPU, and no GPU required.
2. Fast (as light)
- ONNX Runtime: Utilizes ONNX Runtime for inference, which is significantly faster than PyTorch-based Sentence Transformers.
3. Consistent with Sentence Transformers
- Consistency: Incorporates all modules from a Sentence Transformer model, including normalization and pooling.
- Accuracy: Produces embedding vectors identical to those from Sentence Transformers.
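The pooling and normalization modules mentioned above can be sketched in plain NumPy. This is a minimal illustration with made-up numbers, using mean pooling as in `all-MiniLM-L6-v2`; the library itself reads these settings from the model's config files:

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token vectors, ignoring padding positions."""
    mask = attention_mask[:, None].astype(float)    # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)  # sum over real tokens only
    count = np.clip(mask.sum(), 1e-9, None)         # avoid divide-by-zero
    return summed / count

def l2_normalize(vec):
    """Scale a vector to unit length."""
    return vec / np.clip(np.linalg.norm(vec), 1e-12, None)

# Toy input: three token vectors, the last one is padding
tokens = np.array([[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]])
mask = np.array([1, 1, 0])

pooled = mean_pool(tokens, mask)     # -> [2.0, 3.0] (padding excluded)
sentence_vec = l2_normalize(pooled)  # unit-length sentence embedding
print(pooled, np.linalg.norm(sentence_vec))
```

Because both steps are simple deterministic array operations, running them with the same ONNX token outputs yields the same vectors Sentence Transformers produces.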
4. Supports models not managed by LightEmbed
LightEmbed can work with any Hugging Face repository that ships ONNX files, even repositories not mirrored under the Hugging Face ONNX models organization.
5. Local Model Support
LightEmbed can load models from the local file system, enabling faster loading times and functionality in environments without internet access, such as AWS Lambda or EC2 instances in private subnets.
Installation
```shell
pip install -U light-embed-awslambda
```
Usage
You can specify the original Sentence Transformers model name like this:

```python
from light_embed import TextEmbedding

sentences = ["This is an example sentence", "Each sentence is converted"]

model = TextEmbedding(model_name_or_path='sentence-transformers/all-MiniLM-L6-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
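`encode` returns one embedding vector per input sentence as a NumPy array. A common follow-up is cosine similarity between rows; sketched here with small placeholder vectors standing in for two real embeddings:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder vectors standing in for two rows of `embeddings`
e1 = np.array([0.1, 0.3, 0.5])
e2 = np.array([0.2, 0.1, 0.4])
print(round(cosine_sim(e1, e2), 3))
```

If the model normalizes its outputs (as `all-MiniLM-L6-v2` does), the denominator is 1 and a plain dot product gives the same score.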
Alternatively, you can specify the ONNX model name like this:

```python
from light_embed import TextEmbedding

sentences = ["This is an example sentence", "Each sentence is converted"]

model = TextEmbedding(model_name_or_path='onnx-models/all-MiniLM-L6-v2-onnx')
embeddings = model.encode(sentences)
print(embeddings)
```
Using a Non-Managed Model: To use a model from its original repository without relying on Hugging Face ONNX models, specify the model name and provide the `model_config`, assuming the original repository includes ONNX files.

```python
from light_embed import TextEmbedding

sentences = ["This is an example sentence", "Each sentence is converted"]

model_config = {
    "model_file": "onnx/model.onnx",
    "pooling_config_path": "1_Pooling",
    "normalize": False
}

model = TextEmbedding(
    model_name_or_path='sentence-transformers/all-MiniLM-L6-v2',
    model_config=model_config
)
embeddings = model.encode(sentences)
print(embeddings)
```
Using a Local Model: To use a local model, specify the path to the model's folder and provide the `model_config`.

```python
from light_embed import TextEmbedding

sentences = ["This is an example sentence", "Each sentence is converted"]

model_config = {
    "model_file": "onnx/model.onnx",
    "pooling_config_path": "1_Pooling",
    "normalize": False
}

model = TextEmbedding(
    model_name_or_path='/path/to/the/local/model/all-MiniLM-L6-v2-onnx',
    model_config=model_config
)
embeddings = model.encode(sentences)
print(embeddings)
```
Citing & Authors
Binh Nguyen / binhcode25@gmail.com
File details
Details for the file `light_embed_awslambda-1.0.0.tar.gz`.
File metadata
- Download URL: light_embed_awslambda-1.0.0.tar.gz
- Upload date:
- Size: 13.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 26755f34a13374d5a985c82a8f545d6f8814d204e254f35443c580d43eae691f
MD5 | 26fe9bdf9c89501d55bd57928085bfc1
BLAKE2b-256 | b7802c2691acf37d0f5c416ce8eb72b41d8beb6830232099a456d068440af3fb
File details
Details for the file `light_embed_awslambda-1.0.0-py3-none-any.whl`.
File metadata
- Download URL: light_embed_awslambda-1.0.0-py3-none-any.whl
- Upload date:
- Size: 15.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9b61d67e0483833a80fef0dad4062ef40cdbe15028f778d8dfa47fada69cd576
MD5 | 985b6e148fd9ed3a48d2ebe8f3ae8a9b
BLAKE2b-256 | 9cd4944372fbf8d5278d8bfce0904e3b6e8b06159e2eb9aed87e13fb67aef481