
Dense embeddings and small-model inference for the Kelvin Agentic OS — Rust-native ONNX backend, model2vec static lookup, optional GPU

Project description

kaos-nlp-transformers

Part of Kelvin Agentic OS (KAOS) — open agentic infrastructure for legal work, built by 273 Ventures. See the full KAOS package map for the rest of the stack.


kaos-nlp-transformers is the dense-embedding and small-model inference layer for KAOS — a typed Python API over an in-tree Rust cdylib that calls ort (libonnxruntime via Rust) to turn text into float32 vectors and back. It ships a license-vetted model registry, an optional cross-encoder reranker, and a semantic-dedup level that plugs into kaos-content's deduplication framework.

The base install is dependency-light: it pulls in only numpy, huggingface_hub, and the core KAOS runtime (kaos-core, kaos-content, kaos-nlp-core). No PyTorch, no Python fastembed, no Python onnxruntime — the inference path is a Rust cdylib (kaos_nlp_transformers._rust) shipped inside the wheel; libonnxruntime is statically linked. Both embedding (EmbeddingModel) and cross-encoder reranking (CrossEncoderReranker) run through the same backend on CPU out of the box. Optional extras layer in adjacencies — [gpu] for the GPU companion wheel (ort/cuda EP, NVIDIA), [openvino] for Intel OpenVINO acceleration, [model2vec] for the static-numpy lookup backend (~500x CPU speedup), [clustering] for SciPy-backed semantic dedup, and [mcp] for the MCP tool surface. Free-threaded Python (3.13t / 3.14t) is supported.
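The dependency-light claim is easy to spot-check in a fresh environment. A stdlib sketch (the module names below are the ones this paragraph says should, and should not, be present in a base install):

```python
import importlib.util

def installed(module: str) -> bool:
    """True if the module can be found on the current sys.path."""
    return importlib.util.find_spec(module) is not None

# In a base install, the heavy ML stacks should be absent...
absent = {m: installed(m) for m in ("torch", "fastembed", "onnxruntime")}
# ...while the declared dependencies should resolve.
present = {m: installed(m) for m in ("numpy", "huggingface_hub")}
print(absent, present)
```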

Install

uv add kaos-nlp-transformers
# or
pip install kaos-nlp-transformers

kaos-nlp-transformers requires Python 3.13 or newer (free-threaded 3.13t / 3.14t supported). The default install is CPU-only via the Rust ort backend. Add the extras you need:

uv add "kaos-nlp-transformers[gpu]"          # NVIDIA CUDA companion wheel (0.2.0a2)
uv add "kaos-nlp-transformers[openvino]"     # Intel CPU / GPU acceleration (0.2.0a2)
uv add "kaos-nlp-transformers[model2vec]"    # Static-numpy backend (~500x CPU)
uv add "kaos-nlp-transformers[clustering]"   # SemanticDedupLevel (scipy)
uv add "kaos-nlp-transformers[mcp]"          # MCP tool surface

0.2.0 migration note (KNT-601). Audit KNT-601 retired the Python fastembed wrapper. Inference now goes through a Rust cdylib (ort + libonnxruntime, statically linked). Same models, same outputs (per-row cosine ≥ 0.9999 vs the prior backend). The EmbeddingModel.load / EmbeddingModel.embed / CrossEncoderReranker public API is unchanged. The [gpu] / [openvino] extras are no-op stubs in 0.2.0a1; the GPU companion wheel ships in 0.2.0a2. The [torch] no-op alias from KNT-501 is still preserved for one more cycle; removed in 0.3.0. The EmbeddingRetriever text-only retriever is deprecated in favor of kaos_content.indexing.SearchableDocument and kaos_content.indexing.SearchableCorpus; removal scheduled for 0.3.0.
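The per-row cosine ≥ 0.9999 claim can be verified against outputs saved from both versions. A plain-numpy sketch, where old_vecs / new_vecs are hypothetical stand-ins for matrices you saved yourself from 0.1.x and 0.2.0:

```python
import numpy as np

def min_row_cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Minimum per-row cosine similarity between two (N, dim) matrices."""
    a_n = a / np.linalg.norm(a, axis=1, keepdims=True)
    b_n = b / np.linalg.norm(b, axis=1, keepdims=True)
    return float(np.min(np.sum(a_n * b_n, axis=1)))

# Hypothetical stand-ins for saved embedding outputs.
old_vecs = np.array([[1.0, 0.0], [0.6, 0.8]], dtype=np.float32)
new_vecs = old_vecs + 1e-6  # near-identical, as the audit claims

assert min_row_cosine(old_vecs, new_vecs) >= 0.9999
```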

Platform coverage: per-platform cp313-abi3 wheels for Linux x86_64 + aarch64 (manylinux + musllinux), macOS aarch64, Windows x86_64 + aarch64. Free-threaded Python (3.13t / 3.14t) loads cleanly — no _check_gil_enabled guard, no py_rust_stemmers SIGSEGV path.
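Whether your interpreter is a free-threaded build is a one-line runtime check. A stdlib sketch, no kaos imports involved:

```python
import sys
import sysconfig

# Py_GIL_DISABLED is 1 on free-threaded (3.13t / 3.14t) builds, 0/None otherwise.
free_threaded_build = bool(sysconfig.get_config_var("Py_GIL_DISABLED"))

# On 3.13+, sys._is_gil_enabled() reports whether the GIL is active right now
# (a free-threaded build can still re-enable it at runtime).
gil_active = sys._is_gil_enabled() if hasattr(sys, "_is_gil_enabled") else True

print(f"free-threaded build: {free_threaded_build}, GIL active: {gil_active}")
```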

Quick start

import numpy as np
from kaos_nlp_transformers import EmbeddingModel

# Load the v0 default model (BAAI/bge-small-en-v1.5, 33M params, MIT).
# First call downloads and caches; subsequent calls are O(1).
model = EmbeddingModel.load("BAAI/bge-small-en-v1.5")

# Embed a small batch. Returns a float32 numpy array of shape (N, dim).
texts = [
    "Force majeure clauses excuse performance.",
    "Indemnity caps the liability of the seller.",
]
vecs = model.embed(texts)
assert vecs.shape == (2, 384) and vecs.dtype == np.float32

# Cosine similarity over the L2-normalized rows.
def cosine(a, b):
    return float(np.dot(a / np.linalg.norm(a), b / np.linalg.norm(b)))

print(f"sim: {cosine(vecs[0], vecs[1]):.3f}")
# sim: 0.637   (similar legal-contract topic, distinct concepts)

For retrieval over a corpus, build an EmbeddingRetriever (deprecated as of 0.2.0 in favor of kaos_content.indexing; removal scheduled for 0.3.0):

import asyncio

from kaos_nlp_transformers import EmbeddingRetriever

retriever = EmbeddingRetriever.from_texts(
    texts=[
        "The buyer agrees to mediation in Delaware.",
        "All disputes shall be resolved by arbitration in New York.",
        "Force majeure clauses excuse performance.",
    ],
    doc_ids=[0, 1, 2],
)
hits = asyncio.run(retriever.retrieve("where do contract disputes go?", top_k=2))
for h in hits:
    print(f"{h.score:.3f}  {h.text}")
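Under the hood this retriever is brute-force cosine similarity over a numpy matrix (see Concepts below). The scoring step can be sketched in plain numpy; this is an illustrative sketch, not the package's actual code:

```python
import numpy as np

def top_k_cosine(query_vec: np.ndarray, doc_matrix: np.ndarray, k: int = 2):
    """Brute-force cosine top-k: normalize rows, dot product, argsort."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_matrix / np.linalg.norm(doc_matrix, axis=1, keepdims=True)
    scores = d @ q                    # cosine similarity per document
    order = np.argsort(-scores)[:k]   # indices of the k best scores
    return [(int(i), float(scores[i])) for i in order]

rng = np.random.default_rng(0)
docs = rng.normal(size=(5, 8)).astype(np.float32)
query = docs[3] + 0.001 * rng.normal(size=8)  # near-duplicate of doc 3
hits = top_k_cosine(query, docs, k=2)
```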

Concepts

The package is built around a small set of typed primitives.

Concept What it is
EmbeddingModel The single entry point for inference. EmbeddingModel.load(model_id, *, device=None, backend=None, settings=None) resolves the registry entry, picks a backend (the Rust ONNX backend for transformer models on CPU/GPU, model2vec for static lookup models), and returns an instance with an .embed(texts, *, batch_size=32) -> np.ndarray method. Backends are process-cached by (model_id, revision, device, cache_dir) so repeated load() calls are O(1).
RegisteredModel / REGISTRY / EXCLUDED Curated, license-vetted model catalog. Each entry pins a HuggingFace Hub commit SHA (audit-01 KNT-003: revisions thread through the loader cache key). The EXCLUDED map names models intentionally rejected, with their licensing reason — jina-v3 (CC-BY-NC), NV-Embed (CC-BY-NC), Qwen3-Embedding (MS MARCO ambiguity). v0 ships BAAI/bge-small-en-v1.5 (33M, MIT, ONNX) plus three model2vec entries (potion-base-8M, potion-base-32M, potion-retrieval-32M). potion-base-8M is vendored inside the wheel (~28 MB), so it loads offline with no network.
EmbeddingRetriever Brute-force cosine similarity search over a numpy matrix, with from_texts(...) and from_corpus(...) factories. For corpora up to ~50K documents this beats the overhead of FAISS. Implements the kaos_nlp_core.search.SearchHit protocol. Deprecated in favor of kaos_content.indexing; removal scheduled for 0.3.0.
CrossEncoderReranker Optional second-pass reranker running through the same Rust ONNX backend (default BAAI/bge-reranker-base, MIT). No extra required for CPU; [gpu] accelerates on CUDA. Use it to refine EmbeddingRetriever top-50 → top-10. Sigmoid-normalized scores in [0, 1].
SemanticDedupLevel Plug-in for kaos-content's deduplication framework. Embeds documents, computes pairwise cosine distance with scipy.spatial.distance.pdist, and clusters with scipy.cluster.hierarchy.fcluster. Requires the [clustering] extra.
KaosNLPTransformersSettings Typed settings (env prefix KAOS_NLP_TRANSFORMERS_): default_model, default_reranker_model, cache_dir, offline, allow_unregistered, device, backend, profile. Honors legacy HF_HUB_OFFLINE and HF_HOME. When offline=True, the load path sets HF_HUB_OFFLINE=1 and TRANSFORMERS_OFFLINE=1 (audit-01 KNT-005).
Device detection detect_devices() returns a SystemDevices snapshot (reachable accelerators + ONNX execution providers + latent GPUs the OS sees but the install can't drive). EmbeddingModel.load(device="auto") picks the best available; explicit "cpu" / "cuda" / "cuda:0" / "openvino" are honored. Audit-06 KNT-501 retired mps and xla alongside the torch backend.
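The SemanticDedupLevel pipeline (embed, pairwise cosine distance via pdist, flat clusters via fcluster) can be sketched end-to-end with toy vectors. An illustrative sketch of the clustering step only, with an arbitrary 0.1 distance threshold; not the package's code:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Hypothetical document embeddings: rows 0 and 1 are near-duplicates.
vecs = np.array([
    [1.00, 0.00, 0.00],
    [0.99, 0.01, 0.00],
    [0.00, 1.00, 0.00],
], dtype=np.float64)

dists = pdist(vecs, metric="cosine")           # condensed pairwise distances
tree = linkage(dists, method="average")        # hierarchical clustering
labels = fcluster(tree, t=0.1, criterion="distance")  # cut the tree at 0.1
# rows 0 and 1 share a cluster label; row 2 stands alone
```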

CLI

kaos-nlp-transformers ships an administrative CLI under the same name (currently an info subcommand) plus a kaos-nlp-transformers-serve MCP server launcher that requires the [mcp] extra:

kaos-nlp-transformers info --json    # version + registry + device snapshot
kaos-nlp-transformers-serve          # stdio MCP server (requires [mcp])

Compatibility & status

Aspect Details
Python 3.13, 3.14. Free-threaded builds (3.13t / 3.14t / Py_GIL_DISABLED) are supported: the KNT-601 move to the Rust cdylib backend removed the _check_gil_enabled guard and the py_rust_stemmers segfault path. Ships per-platform cp313-abi3 wheels.
OS Linux x86_64 + aarch64 (manylinux + musllinux), macOS aarch64, Windows x86_64 + aarch64: any platform with a cp313-abi3 wheel and ONNX Runtime support.
Maturity Alpha. The public API is documented in kaos_nlp_transformers.__all__.
Stability policy Pre-1.0: minor bumps may change behaviour. Every change is documented in CHANGELOG.md.
Test coverage 138 unit tests + 24 integration tests (162 total, 77% line coverage). The integration suite hits real embedding + cross-encoder reranker model downloads — no mocks. GPU tests are gated on the gpu marker; reranker live tests on live.
Type checker Validated with ty, Astral's Python type checker.

Companion packages

kaos-nlp-transformers is one of the packages in the Kelvin Agentic OS. The broader stack:

Package Layer What it does
kaos-core Core Foundational runtime, MCP-native types, registries, execution engine, VFS
kaos-content Core Typed document AST: Block/Inline, provenance, views
kaos-mcp Bridge FastMCP server, kaos management CLI, MCP resource templates
kaos-pdf Extraction PDF → AST with provenance
kaos-web Extraction Web extraction, browser automation, search, domain intelligence
kaos-office Extraction DOCX / PPTX / XLSX readers + writers to AST
kaos-tabular Extraction DuckDB-powered SQL analytics
kaos-source Data Government + financial data connectors (Federal Register, eCFR, EDGAR, GovInfo, PACER, GLEIF)
kaos-llm-client LLM Multi-provider LLM transport
kaos-llm-core LLM Typed LLM programming (Signatures, Programs, Optimizers)
kaos-nlp-core Primitives (Rust) High-performance NLP primitives
kaos-nlp-transformers ML Dense embeddings + retrieval
kaos-graph Primitives (Rust) Graph algorithms + RDF/SPARQL
kaos-ml-core Primitives (Rust) Classical ML on the document AST
kaos-citations Legal Legal citation extraction, resolution, verification
kaos-agents Agentic Agent runtime, memory, recipes
kaos-reference Sample Reference module for module authors

Packages depend on kaos-core; everything else is opt-in. Mix and match the ones you need.

Development

git clone https://github.com/273v/kaos-nlp-transformers
cd kaos-nlp-transformers
uv sync --group dev --extra clustering

Install pre-commit hooks (recommended — they run the same checks as CI on every commit, scoped to staged files):

uvx pre-commit install
uvx pre-commit run --all-files     # one-time full sweep

Manual QA commands (the same set CI runs):

uv run ruff format --check kaos_nlp_transformers tests
uv run ruff check kaos_nlp_transformers tests
uv run ty check kaos_nlp_transformers tests
uv run pytest tests/unit -q

Build from source

uv build
uv pip install dist/*.whl

Contributing

Issues and pull requests are welcome. See CONTRIBUTING.md for setup, quality gates, pull request expectations, and engineering standards. By contributing you agree to follow the project conduct expectations and certify the Developer Certificate of Origin v1.1 — sign every commit with git commit -s. Please open an issue before starting on a non-trivial change so we can align on scope.

Security

For security issues, please do not file a public issue. Report privately via GitHub Private Vulnerability Reporting or email security@273ventures.com. See SECURITY.md for the full disclosure policy.

License

Apache License 2.0 — see LICENSE and NOTICE.

Copyright 2026 273 Ventures LLC. Built for kelvin.legal.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

kaos_nlp_transformers-0.2.0a3.tar.gz (28.8 MB; Source)

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

kaos_nlp_transformers-0.2.0a3-cp313-abi3-win_arm64.whl (37.2 MB; CPython 3.13+, Windows ARM64)

kaos_nlp_transformers-0.2.0a3-cp313-abi3-win_amd64.whl (37.7 MB; CPython 3.13+, Windows x86-64)

kaos_nlp_transformers-0.2.0a3-cp313-abi3-manylinux_2_28_x86_64.whl (38.5 MB; CPython 3.13+, manylinux glibc 2.28+ x86-64)

kaos_nlp_transformers-0.2.0a3-cp313-abi3-manylinux_2_28_aarch64.whl (39.4 MB; CPython 3.13+, manylinux glibc 2.28+ ARM64)

kaos_nlp_transformers-0.2.0a3-cp313-abi3-macosx_11_0_arm64.whl (37.2 MB; CPython 3.13+, macOS 11.0+ ARM64)

File details

Details for the file kaos_nlp_transformers-0.2.0a3.tar.gz.

File metadata

  • Download URL: kaos_nlp_transformers-0.2.0a3.tar.gz
  • Upload date:
  • Size: 28.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for kaos_nlp_transformers-0.2.0a3.tar.gz
Algorithm Hash digest
SHA256 1647f0332fcd257f56cb34b316688bf16531cf2dd0a1dc8ad82a0034671737fd
MD5 b990fe9ab0a89eb0f28fd3db1c7284dd
BLAKE2b-256 2f44ee16c7d00dc296484e036f135ca81dae2c0ce95d3d2c6143e9409642b1c1

See more details on using hashes here.

Provenance

The following attestation bundles were made for kaos_nlp_transformers-0.2.0a3.tar.gz:

Publisher: release.yml on 273v/kaos-nlp-transformers

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file kaos_nlp_transformers-0.2.0a3-cp313-abi3-win_arm64.whl.

File metadata

File hashes

Hashes for kaos_nlp_transformers-0.2.0a3-cp313-abi3-win_arm64.whl
Algorithm Hash digest
SHA256 cdc70714af682f92fa3ce669414c7a3ac2282c9becfde83efc0b49b34cef5e08
MD5 753338871d4caaa96830ca1beb660740
BLAKE2b-256 7fe748b4975826ce20f0fb4eaa4c85501239de9fca5f9f35fd09790291494bc2


Provenance

The following attestation bundles were made for kaos_nlp_transformers-0.2.0a3-cp313-abi3-win_arm64.whl:

Publisher: release.yml on 273v/kaos-nlp-transformers

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file kaos_nlp_transformers-0.2.0a3-cp313-abi3-win_amd64.whl.

File metadata

File hashes

Hashes for kaos_nlp_transformers-0.2.0a3-cp313-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 0b8cd98618895dee200d10084b0971156264b0bb728834a88c1d04dfb5cb426f
MD5 f5d347b957c17991a3c4c3c114c0ad98
BLAKE2b-256 7920488609482955bdb297c0c97aa4c2179d6bb1cf0bfa0f76a7d8fee2e72642


Provenance

The following attestation bundles were made for kaos_nlp_transformers-0.2.0a3-cp313-abi3-win_amd64.whl:

Publisher: release.yml on 273v/kaos-nlp-transformers

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file kaos_nlp_transformers-0.2.0a3-cp313-abi3-manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for kaos_nlp_transformers-0.2.0a3-cp313-abi3-manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 85908e09c1bbdacc04ee942d840f100377c5890bfd3422cca0ccd35be57022c9
MD5 fd15b5d283784cd0c32fdfe9a123f33a
BLAKE2b-256 3d373ce91fb6d8c791b6efaa41a545b328881866b35292dbfe68162e126b7e85


Provenance

The following attestation bundles were made for kaos_nlp_transformers-0.2.0a3-cp313-abi3-manylinux_2_28_x86_64.whl:

Publisher: release.yml on 273v/kaos-nlp-transformers

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file kaos_nlp_transformers-0.2.0a3-cp313-abi3-manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for kaos_nlp_transformers-0.2.0a3-cp313-abi3-manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 74c4a9a29324d1bca26f7d93a418cd75fa8075ccc54c1b2ab1c9c372f3e9f3e8
MD5 f13f7fb6586cb5b193860f6f21fbf0cc
BLAKE2b-256 e41a187bb9b39e0cecfe5749371c0dc5efeabb265d3b79e3312c36fd92041fa8


Provenance

The following attestation bundles were made for kaos_nlp_transformers-0.2.0a3-cp313-abi3-manylinux_2_28_aarch64.whl:

Publisher: release.yml on 273v/kaos-nlp-transformers

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file kaos_nlp_transformers-0.2.0a3-cp313-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for kaos_nlp_transformers-0.2.0a3-cp313-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 afe4ee9b7d49112cea1b1cb58e538c41b15112fa2d26c341e2dbf9368d91a1b9
MD5 549edf5c7571db6a26128f3570f62a74
BLAKE2b-256 043cd484e4cd40ee54f5ac7a08944991d2f81ed079cb556fcee25f1db82d4df4


Provenance

The following attestation bundles were made for kaos_nlp_transformers-0.2.0a3-cp313-abi3-macosx_11_0_arm64.whl:

Publisher: release.yml on 273v/kaos-nlp-transformers

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
