# m3serve

High-throughput BGE-M3 inference engine with dense + sparse embeddings.

Lightweight async inference engine for BAAI/bge-m3 that returns dense and sparse embeddings in a single call, enabling hybrid retrieval without the overhead of a full LLM framework.
## Install

```shell
pip install m3serve
```
## Usage

```python
import asyncio

from m3serve import Engine


async def main():
    engine = Engine(model_name="BAAI/bge-m3", use_fp16=True)
    await engine.start()
    result = await engine.embed(["hello world"], return_sparse=True)
    # result.dense          -> list[list[float]] (1024-dim)
    # result.sparse_indices -> list[list[int]]   (token ids with non-zero weight)
    # result.sparse_weights -> list[list[float]] (corresponding weights)
    await engine.stop()


asyncio.run(main())
```
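The dense and sparse outputs can be fused for hybrid retrieval. A minimal sketch, assuming the result fields shown above; the `hybrid_score` helper, the dict layout, and the 0.7/0.3 fusion weights are illustrative, not part of m3serve:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two dense embeddings.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def sparse_dot(idx_a, w_a, idx_b, w_b) -> float:
    # Sparse vectors are (token-id, weight) pairs; the score is the
    # dot product over token ids present in both texts.
    wa = dict(zip(idx_a, w_a))
    return sum(w * wa[i] for i, w in zip(idx_b, w_b) if i in wa)

def hybrid_score(q: dict, d: dict, alpha: float = 0.7) -> float:
    # Weighted fusion of dense (semantic) and sparse (lexical) scores.
    dense = cosine(q["dense"], d["dense"])
    sparse = sparse_dot(q["idx"], q["w"], d["idx"], d["w"])
    return alpha * dense + (1 - alpha) * sparse
```

In practice the fusion weight `alpha` is tuned on a validation set; 0.7 toward the dense score is only a common starting point.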
## How it works

Three background threads run in a pipeline so the GPU is never idle waiting for tokenisation or post-processing:

```
Thread 1  encode_pre   tokenise on CPU          ──►
Thread 2  encode_core  GPU forward pass         ──►
Thread 3  encode_post  convert to Python lists  ──►  resolved Future
```
Incoming requests are queued and batched by token length (shorter sequences first) to minimise padding waste. Each embed() call is a coroutine that returns as soon as its batch is processed — no polling, no callbacks.
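The queueing and coalescing behaviour described above can be sketched with asyncio primitives. This is an illustrative model, not m3serve's actual internals; the `Batcher` class and all names in it are hypothetical:

```python
import asyncio

class Batcher:
    def __init__(self, max_batch_size: int = 256, batch_delay: float = 0.005):
        self.max_batch_size = max_batch_size
        self.batch_delay = batch_delay
        self.queue: asyncio.Queue = asyncio.Queue()

    async def submit(self, text: str) -> list[float]:
        # Each request carries a Future that the batch loop resolves,
        # so callers simply await their result: no polling, no callbacks.
        fut = asyncio.get_running_loop().create_future()
        await self.queue.put((text, fut))
        return await fut

    async def run(self, embed_batch):
        while True:
            first = await self.queue.get()          # block for the first item
            await asyncio.sleep(self.batch_delay)   # coalescing window
            batch = [first]
            while not self.queue.empty() and len(batch) < self.max_batch_size:
                batch.append(self.queue.get_nowait())
            # Sort by length so similar-sized sequences share a batch
            # and padding waste is minimal.
            batch.sort(key=lambda item: len(item[0]))
            results = embed_batch([text for text, _ in batch])
            for (_, fut), vec in zip(batch, results):
                fut.set_result(vec)
```

The coalescing sleep is the `batch_delay` option from the table below: it trades a few milliseconds of latency for much larger (and therefore more efficient) GPU batches under concurrent load.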
## Options

| Parameter | Default | Description |
|---|---|---|
| `model_name` | `"BAAI/bge-m3"` | Any bge-m3 compatible model |
| `device` | auto-detected | `"cuda:0"`, `"mps"`, or `"cpu"` |
| `use_fp16` | `True` | Half-precision inference (ignored on CPU) |
| `torch_compile` | `False` | `torch.compile` the backbone (CUDA only, adds warmup) |
| `max_batch_size` | `256` | Maximum sequences per GPU batch |
| `batch_delay` | `0.005` | Coalescing window in seconds: sleep after the first item arrives to let concurrent requests accumulate. Set to roughly half the GPU inference time for your batch size. |
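Putting the table together, a fully explicit construction might look like the following. Only the parameter names come from the table; the non-default values are illustrative, not recommendations:

```python
# Illustrative configuration; values other than the defaults are examples.
engine_kwargs = dict(
    model_name="BAAI/bge-m3",
    device="cuda:0",        # or "mps" / "cpu" (auto-detected if omitted)
    use_fp16=True,          # ignored on CPU
    torch_compile=False,    # CUDA only; adds warmup time
    max_batch_size=128,
    batch_delay=0.004,      # ~half the measured GPU forward time per batch
)
# engine = Engine(**engine_kwargs)
```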
## File details: m3serve-0.1.3.tar.gz

- Filename: m3serve-0.1.3.tar.gz
- Size: 6.2 kB
- Tags: Source
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 on CPython/3.13.12
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `9367b625f09c2540e912635b505cadd62943daf2b25d21eda12cd85b40670797` |
| MD5 | `7f479ca4eb1d784ec843cd79474a7c8c` |
| BLAKE2b-256 | `b70697a8e66003952fb624338511b9ddd6c42aa06608c48250a03cc7ca949461` |
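A downloaded artifact can be checked against the published SHA256 digest before installation. A minimal sketch using only the standard library; the file path is illustrative:

```python
import hashlib

def sha256_of(path: str) -> str:
    # Stream the file in 64 KiB chunks so large archives don't load into memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

EXPECTED = "9367b625f09c2540e912635b505cadd62943daf2b25d21eda12cd85b40670797"
# assert sha256_of("m3serve-0.1.3.tar.gz") == EXPECTED
```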
### Provenance

The following attestation bundles were made for m3serve-0.1.3.tar.gz:

- Publisher: publish.yml on MauroCE/m3serve
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: m3serve-0.1.3.tar.gz
- Subject digest: 9367b625f09c2540e912635b505cadd62943daf2b25d21eda12cd85b40670797
- Sigstore transparency entry: 1358358098
- Permalink: MauroCE/m3serve@3a55ea952b4325ea3aa5e8b5beb41a6502a8910f
- Branch / Tag: refs/tags/v0.1.3
- Owner: https://github.com/MauroCE
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@3a55ea952b4325ea3aa5e8b5beb41a6502a8910f
- Trigger Event: release
## File details: m3serve-0.1.3-py3-none-any.whl

- Filename: m3serve-0.1.3-py3-none-any.whl
- Size: 8.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 on CPython/3.13.12
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `5b7396c3891d65c6fbb6f90efa3e27cf438888cd2e6205b2fe6b09e9007dfe31` |
| MD5 | `7b67363018650b67e7b1dc6b7ed23099` |
| BLAKE2b-256 | `5788c4a1f61ab79034038da3b76ed4ab7f47cb069bf098387ef6ce0537958fba` |
### Provenance

The following attestation bundles were made for m3serve-0.1.3-py3-none-any.whl:

- Publisher: publish.yml on MauroCE/m3serve
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: m3serve-0.1.3-py3-none-any.whl
- Subject digest: 5b7396c3891d65c6fbb6f90efa3e27cf438888cd2e6205b2fe6b09e9007dfe31
- Sigstore transparency entry: 1358358227
- Permalink: MauroCE/m3serve@3a55ea952b4325ea3aa5e8b5beb41a6502a8910f
- Branch / Tag: refs/tags/v0.1.3
- Owner: https://github.com/MauroCE
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@3a55ea952b4325ea3aa5e8b5beb41a6502a8910f
- Trigger Event: release