
High-throughput BGE-M3 inference engine with dense + sparse embeddings


m3serve

License: MIT

Lightweight async inference engine for BAAI/bge-m3 that returns dense and sparse embeddings in a single call — enabling hybrid retrieval without the overhead of a full LLM framework.

Install

pip install m3serve

Usage

import asyncio

from m3serve import Engine

async def main():
    engine = Engine(model_name="BAAI/bge-m3", use_fp16=True)
    await engine.start()

    result = await engine.embed(["hello world"], return_sparse=True)
    # result.dense            -> list[list[float]]  (1024-dim)
    # result.sparse_indices   -> list[list[int]]    (token ids with non-zero weight)
    # result.sparse_weights   -> list[list[float]]  (corresponding weights)

    await engine.stop()

asyncio.run(main())
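The dense and sparse outputs above can be combined for hybrid retrieval. A minimal scoring sketch, assuming the result fields shown in the usage example; the helper names and the `alpha` weighting are illustrative, not part of m3serve:

```python
import math

def dense_score(q: list[float], d: list[float]) -> float:
    """Cosine similarity between two dense embeddings."""
    dot = sum(a * b for a, b in zip(q, d))
    nq = math.sqrt(sum(a * a for a in q))
    nd = math.sqrt(sum(a * a for a in d))
    return dot / (nq * nd)

def sparse_score(q_idx: list[int], q_w: list[float],
                 d_idx: list[int], d_w: list[float]) -> float:
    """Dot product over the token ids shared by two sparse embeddings."""
    d_map = dict(zip(d_idx, d_w))
    return sum(w * d_map[i] for i, w in zip(q_idx, q_w) if i in d_map)

def hybrid_score(dq, dd, q_idx, q_w, d_idx, d_w, alpha=0.7) -> float:
    """Weighted sum of dense and sparse scores; alpha is a tuning choice."""
    return alpha * dense_score(dq, dd) + (1 - alpha) * sparse_score(q_idx, q_w, d_idx, d_w)
```

In practice the sparse weights act as learned lexical (BM25-like) term weights, so the two scores complement each other on rare-term queries.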

How it works

Three background threads run in a pipeline so the GPU is never idle waiting for tokenisation or post-processing:

Thread 1  encode_pre   tokenise on CPU          ──►  queue
Thread 2  encode_core  GPU forward pass         ──►  queue
Thread 3  encode_post  convert to Python lists  ──►  resolved Future
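In outline, the three stages are worker threads linked by queues. A simplified, self-contained sketch of that shape (the tokenisation and forward pass are stand-ins, not the engine's real internals):

```python
import queue
import threading

def run_pipeline(texts: list[str]) -> list[list[float]]:
    pre_q: queue.Queue = queue.Queue()
    core_q: queue.Queue = queue.Queue()
    post_q: queue.Queue = queue.Queue()
    results: list[list[float]] = []

    def encode_pre():   # stage 1: CPU tokenisation (stand-in: whitespace split)
        while (text := pre_q.get()) is not None:
            core_q.put(text.split())
        core_q.put(None)  # propagate shutdown downstream

    def encode_core():  # stage 2: model forward pass (stand-in: word lengths)
        while (tokens := core_q.get()) is not None:
            post_q.put([float(len(w)) for w in tokens])
        post_q.put(None)

    def encode_post():  # stage 3: convert outputs / resolve futures
        while (vec := post_q.get()) is not None:
            results.append(vec)

    workers = [threading.Thread(target=f)
               for f in (encode_pre, encode_core, encode_post)]
    for w in workers:
        w.start()
    for t in texts:
        pre_q.put(t)
    pre_q.put(None)     # sentinel: no more input
    for w in workers:
        w.join()
    return results
```

Because each stage blocks only on its own input queue, stage 2 (the GPU in the real engine) keeps working while stages 1 and 3 handle the next and previous batches.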

Incoming requests are queued and batched by token length (shorter sequences first) to minimise padding waste. Each embed() call is a coroutine that returns as soon as its batch is processed — no polling, no callbacks.
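Batching by token length can be sketched as follows (an illustration of the idea, with a hypothetical `make_batches` helper and whitespace splitting as a proxy for real token counts):

```python
def make_batches(requests: list[tuple[int, str]],
                 max_batch_size: int = 256) -> list[list[tuple[int, str]]]:
    """Group (request_id, text) pairs into batches of similar length.

    Sorting shortest-first means each batch is padded only to its own
    longest member, not to the longest sequence in the whole queue.
    """
    by_length = sorted(requests, key=lambda r: len(r[1].split()))
    return [by_length[i:i + max_batch_size]
            for i in range(0, len(by_length), max_batch_size)]
```

The request ids travel with the texts so that, once a batch is processed, each result can resolve the Future of the `embed()` call that submitted it.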

Options

Parameter          Default         Description
model_name         "BAAI/bge-m3"   Any bge-m3-compatible model
device             auto-detected   "cuda:0", "mps", or "cpu"
use_fp16           True            Half-precision inference (ignored on CPU)
torch_compile      False           torch.compile the backbone (CUDA only; adds warmup time)
max_batch_size     256             Maximum sequences per GPU batch
batch_delay        0.005           Coalescing window in seconds: sleep after the first item arrives so concurrent requests can accumulate. Set to roughly half the GPU inference time for your batch size.
tokenizer_threads  4               Threads dedicated to tokenisation (token_lengths). Each thread holds its own tokenizer copy; all are pre-warmed at start(), so no cold deepcopy happens during serving.
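The batch_delay coalescing window amounts to a short sleep after the first request arrives, then draining whatever else has queued up. An illustrative asyncio sketch (the `collect_batch` name is hypothetical, not the engine's API):

```python
import asyncio

async def collect_batch(q: asyncio.Queue, batch_delay: float = 0.005,
                        max_batch_size: int = 256) -> list:
    """Block until the first request arrives, then wait batch_delay so
    concurrent requests can join the same batch, then drain the queue."""
    batch = [await q.get()]           # first item: block as long as needed
    await asyncio.sleep(batch_delay)  # coalescing window
    while not q.empty() and len(batch) < max_batch_size:
        batch.append(q.get_nowait())
    return batch
```

This is why the suggested value is about half the GPU inference time: long enough to fill batches under concurrent load, short enough that a lone request is not delayed noticeably.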


