TurboQuantDB

An embedded vector database written in Rust with Python bindings, implementing the TurboQuant algorithm (arXiv:2504.19874) — zero training time, 2–4 bit compression, and provably unbiased inner product estimation.

Goal: make massive embedding datasets practical on lightweight hardware. A 50k-vector, 1536-dim collection that would occupy 293 MB as raw float32 fits in 70 MB on disk and 488 MB of RAM with TQDB b=4 — enabling laptop-scale RAG over millions of documents without a dedicated server.
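
The arithmetic behind those figures can be checked directly (binary megabytes throughout; the ~1,466-byte-per-vector figure for b=4 comes from the Key Properties section below):

```python
# Back-of-envelope check of the storage figures quoted above.
N, D = 50_000, 1536

raw_bytes = N * D * 4                # float32: 4 bytes per coordinate
print(raw_bytes / 2**20)             # ≈ 292.97 — the "293 MB" figure

tqdb_bytes = N * 1_466               # b=4: ~1,466 bytes per stored vector
print(tqdb_bytes / 2**20)            # ≈ 69.9 — the "70 MB" on-disk figure
print(raw_bytes / tqdb_bytes)        # ≈ 4.2× compression
```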

Two deployment modes:

  • Embedded — tqdb Python package (pip install tqdb), runs in-process (no daemon)
  • Server — Axum HTTP service in server/, with multi-tenancy, RBAC, quotas, and async jobs

Key Properties

  • Zero training — No train() step. Vectors are quantized and stored immediately on insert.
  • 4.2× compression — Reduces float32 embeddings to 2–4 bits per coordinate; b=4 stores each vector in 1,466 bytes vs 6,144 bytes for float32.
  • Unbiased scoring — QJL transform guarantees unbiased inner product estimation.
  • Optional ANN index — Build an HNSW graph after loading data for fast approximate search.
  • Metadata filtering — MongoDB-style filter operators on any metadata field.
  • Crash recovery — Write-ahead log (WAL) ensures durability without explicit flushing.
  • Python native — Built with PyO3 and Maturin; no server or sidecar required.
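
The crash-recovery property rests on the standard WAL discipline: append and fsync a log record before acknowledging a write, then replay the log on reopen. A minimal sketch of the idea (illustrative only — this is not TQDB's wal.log on-disk format):

```python
# Minimal write-ahead-log sketch: fsync before acknowledging, replay on open.
import json
import os
import tempfile

def wal_append(path, record):
    # Append the record and fsync before returning, so a crash
    # after this point cannot lose an acknowledged write.
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
        f.flush()
        os.fsync(f.fileno())

def wal_replay(path):
    # On reopen, rebuild in-memory state from every durable record.
    state = {}
    with open(path) as f:
        for line in f:
            rec = json.loads(line)
            if rec["op"] == "insert":
                state[rec["id"]] = rec["vec"]
            elif rec["op"] == "delete":
                state.pop(rec["id"], None)
    return state

path = os.path.join(tempfile.mkdtemp(), "wal.log")
wal_append(path, {"op": "insert", "id": "doc-1", "vec": [0.1, 0.2]})
wal_append(path, {"op": "insert", "id": "doc-2", "vec": [0.3, 0.4]})
wal_append(path, {"op": "delete", "id": "doc-1"})
print(wal_replay(path))              # {'doc-2': [0.3, 0.4]}
```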

Installation

Prerequisites

  • Rust stable toolchain
  • Python 3.10+
  • C++ compiler: Visual Studio Build Tools (Windows) · xcode-select --install (macOS) · build-essential (Linux)

Build from source

python -m venv venv
source venv/bin/activate        # Windows: .\venv\Scripts\activate
pip install maturin
maturin develop --release

Install pre-built wheel

pip install tqdb

Recommended Setup

Three presets covering the main use cases — pick one and you're ready:

from turboquantdb import Database

# High Quality — best recall (~97% at 50k×1536)
db = Database.open(path, dimension=DIM, bits=8, rerank=True)
db.create_index(max_degree=32, ef_construction=200, n_refinements=8)
results = db.search(query, top_k=10, ann_search_list_size=200)

# Balanced — recommended default (~89% recall, 5.7× less disk than a float32 HNSW index)
db = Database.open(path, dimension=DIM, bits=4, rerank=True)
db.create_index(max_degree=32, ef_construction=200, n_refinements=5)
results = db.search(query, top_k=10, ann_search_list_size=128)

# Fast Build — fastest ingest (~83% recall, same disk as Balanced)
db = Database.open(path, dimension=DIM, bits=4, fast_mode=True, rerank=False)
db.create_index(max_degree=32, ef_construction=200, n_refinements=5)
results = db.search(query, top_k=10, ann_search_list_size=128)

Full parameter reference: docs/PYTHON_API.md
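
Expressed as data, the three presets differ only in a handful of kwargs. A hypothetical helper (the `PRESETS` dict and `preset()` function below are not part of the tqdb API, just the documented kwargs in one place):

```python
# The three documented presets as plain kwargs dicts (not part of tqdb itself).
PRESETS = {
    "high_quality": dict(
        open_kw=dict(bits=8, rerank=True),
        index_kw=dict(max_degree=32, ef_construction=200, n_refinements=8),
        search_kw=dict(ann_search_list_size=200),
    ),
    "balanced": dict(
        open_kw=dict(bits=4, rerank=True),
        index_kw=dict(max_degree=32, ef_construction=200, n_refinements=5),
        search_kw=dict(ann_search_list_size=128),
    ),
    "fast_build": dict(
        open_kw=dict(bits=4, fast_mode=True, rerank=False),
        index_kw=dict(max_degree=32, ef_construction=200, n_refinements=5),
        search_kw=dict(ann_search_list_size=128),
    ),
}

def preset(name):
    return PRESETS[name]

print(preset("balanced")["open_kw"])   # {'bits': 4, 'rerank': True}
```

Usage would then be `Database.open(path, dimension=DIM, **preset("balanced")["open_kw"])` and so on.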


Quick Start

import numpy as np
from turboquantdb import Database

db = Database.open("./my_db", dimension=1536, bits=4, metric="ip", rerank=True)

db.insert("doc-1", np.random.randn(1536).astype("f4"), metadata={"topic": "ml"}, document="Machine learning intro")
db.insert("doc-2", np.random.randn(1536).astype("f4"), metadata={"topic": "systems"}, document="Rust memory model")

results = db.search(np.random.randn(1536).astype("f4"), top_k=5)
for r in results:
    print(r["id"], r["score"], r["document"])

Python API

Full reference: docs/PYTHON_API.md

# Open / create
db = Database.open(path, dimension, bits=4, seed=42, metric="ip",
                   rerank=True, fast_mode=False, rerank_precision=None,
                   collection=None)   # collection= → opens path/collection/

# Write
db.insert(id, vector, metadata=None, document=None)
db.insert_batch(ids, vectors, metadatas=None, documents=None, mode="insert")  # "insert"|"upsert"|"update"
db.upsert(id, vector, metadata=None, document=None)
db.update(id, vector, metadata=None, document=None)        # RuntimeError if not found
db.update_metadata(id, metadata=None, document=None)       # RuntimeError if not found

# Delete & retrieve
db.delete(id)                        # → bool
db.delete_batch(ids)                 # → int (count deleted)
db.get(id)                           # → {id, metadata, document} | None
db.get_many(ids)                     # → list[dict | None]
db.list_all()                        # → list[str]
db.list_ids(where_filter=None, limit=None, offset=0)       # paginated
db.count(filter=None)                # → int
db.stats()                           # → dict
len(db) / "id" in db                 # container protocol

# Search
results = db.search(query, top_k=10, filter=None, _use_ann=True,
                    ann_search_list_size=None, include=None)
# include: list of "id"|"score"|"metadata"|"document" (default all)

all_results = db.query(query_embeddings, n_results=10, where_filter=None)
# query_embeddings: np.ndarray (N, D) — returns list[list[dict]]

# Index
db.create_index(max_degree=32, ef_construction=200, n_refinements=5,
                search_list_size=128, alpha=1.2)

# Metadata filter operators
# $eq $ne $gt $gte $lt $lte $in $nin $exists $contains $and $or
db.search(query, top_k=5, filter={"year": {"$gte": 2023}})
db.search(query, top_k=5, filter={"$and": [{"topic": "ml"}, {"year": {"$gte": 2023}}]})
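The matching semantics of these operators can be illustrated with a pure-Python evaluator. TQDB evaluates filters in Rust; this sketch only shows how the operators are interpreted:

```python
# Illustrative evaluator for the MongoDB-style filter operators above.
# Not TQDB's implementation — a sketch of the matching semantics only.
def _op(op, val, arg, present):
    if op == "$eq":       return val == arg
    if op == "$ne":       return val != arg
    if op == "$gt":       return val is not None and val > arg
    if op == "$gte":      return val is not None and val >= arg
    if op == "$lt":       return val is not None and val < arg
    if op == "$lte":      return val is not None and val <= arg
    if op == "$in":       return val in arg
    if op == "$nin":      return val not in arg
    if op == "$exists":   return present == arg
    if op == "$contains": return arg in (val or "")
    raise ValueError(f"unknown operator {op}")

def matches(meta, flt):
    for key, cond in flt.items():
        if key == "$and":
            if not all(matches(meta, c) for c in cond):
                return False
        elif key == "$or":
            if not any(matches(meta, c) for c in cond):
                return False
        elif isinstance(cond, dict):       # {"year": {"$gte": 2023}}
            if not all(_op(op, meta.get(key), arg, key in meta)
                       for op, arg in cond.items()):
                return False
        elif meta.get(key) != cond:        # bare value → implicit $eq
            return False
    return True

doc = {"topic": "ml", "year": 2024}
print(matches(doc, {"year": {"$gte": 2023}}))                              # True
print(matches(doc, {"$and": [{"topic": "ml"}, {"year": {"$lt": 2024}}]}))  # False
```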

Benchmarks

Full results: BENCHMARKS.md

Highlights (50k × 1536, top_k=10, DBpedia OpenAI embeddings):

Engine              Ingest   Disk     RAM      p50 latency   Recall@10
Engine-A (HNSW)     30.6 s   398 MB   865 MB    1.73 ms      99.75%
Engine-B (IVF-PQ)   77.6 s   318 MB   526 MB    9.17 ms      79.50%
TQDB b=8 HQ         59.2 s   119 MB   537 MB   12.42 ms      97.25%
TQDB b=4 Balanced   37.8 s    70 MB   488 MB    9.98 ms      89.15%
TQDB b=4 FastBuild  28.2 s    70 MB   487 MB    5.30 ms      83.35%

Competing engines are identified by index algorithm rather than by product name. Reproduction scripts live in benchmarks/.

TQDB b=4 uses 5.7× less disk and 44% less RAM than Engine-A, while staying within 11 percentage points of its recall.


RAG Integration

from turboquantdb.rag import TurboQuantRetriever

retriever = TurboQuantRetriever(db_path="./rag_db", dimension=1536, bits=4)
retriever.add_texts(texts=texts, embeddings=embeddings, metadatas=metadatas)

results = retriever.similarity_search(query_embedding=query_vec, k=5)
for r in results:
    print(r["score"], r["text"])

Architecture

TurboQuantDB is an embedded database — it runs in-process with no daemon.

./my_db/
├── manifest.json        — DB config (dimension, bits, seed, metric)
├── quantizer.bin        — Serialized quantizer state
├── live_codes.bin       — Memory-mapped quantized vectors (hot path)
├── live_vectors.bin     — Raw vectors for exact reranking (only if rerank_precision="f16" or "f32")
├── wal.log              — Write-ahead log
├── metadata.bin         — Per-vector metadata and documents
├── live_ids.bin         — ID → slot index
├── graph.bin            — HNSW adjacency list (if index built)
└── seg-XXXXXXXX.bin     — Immutable flushed segment files

Write path: insert() → quantize (QR rotation → MSE → Gaussian QJL) → WAL → live_codes.bin → flush to segment

Search (brute-force): query → precompute lookup tables → score all live vectors → top-k

Search (ANN): query → HNSW beam search → rerank → top-k
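
For the brute-force path, the LUT-based asymmetric scorer is the key mechanism: database vectors stay as small integer codes, and at query time each code's per-coordinate contribution is precomputed into a lookup table, so scoring is pure table lookups with no float reconstruction. A simplified NumPy sketch using a uniform scalar quantizer (TQDB uses Lloyd-Max codebooks after a QR rotation, so this only illustrates the scoring mechanics, not the actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, B = 1000, 64, 4                  # vectors, dims, bits per coordinate
levels = 2**B

# Database side: quantize each coordinate to a 4-bit uniform code.
X = rng.standard_normal((N, D)).astype(np.float32)
X /= np.linalg.norm(X, axis=1, keepdims=True)
lo, hi = X.min(), X.max()
step = (hi - lo) / (levels - 1)
codes = np.clip(np.round((X - lo) / step), 0, levels - 1).astype(np.uint8)

# Query side: precompute T[d, c] = q[d] * dequant(c). Each vector's
# score is then a sum of D table lookups over its stored codes.
q = rng.standard_normal(D).astype(np.float32)
dequant = lo + np.arange(levels, dtype=np.float32) * step   # (levels,)
T = q[:, None] * dequant[None, :]                           # (D, levels)
scores = T[np.arange(D)[None, :], codes].sum(axis=1)        # (N,)

exact = X @ q
print(float(np.max(np.abs(scores - exact))))   # quantization error only
```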

Quantization: Two-stage pipeline:

  1. MSE — QR rotation + Lloyd-Max scalar quantization at b bits per coordinate
  2. QJL — Dense Gaussian projection, 1-bit quantized, bit-packed

The combination gives unbiased inner product estimates with near-optimal distortion, requiring no training data.
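
The unbiasedness of the asymmetric 1-bit estimator can be checked numerically. For a Gaussian projection row g, E[sign(g·x)(g·q)] = √(2/π)·⟨x,q⟩/‖x‖, so rescaling the empirical mean by √(π/2)·‖x‖ recovers the inner product. The sketch below is a check of that principle only — TQDB bit-packs the signs and combines this with the MSE stage:

```python
import numpy as np

rng = np.random.default_rng(1)
D, m = 64, 100_000                     # dimension, number of 1-bit projections

x = rng.standard_normal(D)
q = rng.standard_normal(D)
G = rng.standard_normal((m, D)).astype(np.float32)   # dense Gaussian projection

# Database side keeps only the sign bits of Gx (1 bit per projection)
# plus ||x||. Query side uses full-precision Gq.
bits = np.sign(G @ x)                  # what would be bit-packed and stored
est = np.sqrt(np.pi / 2) * np.linalg.norm(x) * np.mean(bits * (G @ q))

print(est, x @ q)                      # estimate ≈ true inner product
```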

What comes from the paper vs. what is added here

The TurboQuant paper contributes the quantization algorithm — how to compress vectors and estimate inner products accurately. Its experiments use flat (exhaustive) search: all database vectors are scored against every query using the LUT-based asymmetric scorer. The paper's "indexing time virtually zero" claim refers to the quantizer requiring no training data, not to graph construction.

From the paper: two-stage MSE + QJL quantization, QR rotation, Lloyd-Max codebook, asymmetric LUT scoring, unbiased inner product estimation.

Added by TurboQuantDB (not in the paper): WAL persistence, memory-mapped storage, metadata/documents, HNSW graph index, reranking, Python bindings, and the HTTP server.

The brute-force search path (_use_ann=False) is the paper-conformant mode — it scores all vectors using TurboQuant's LUT scorer, matching the paper's experimental setup exactly. The HNSW index is a practical engineering addition that reduces the candidate set before scoring, enabling sub-linear search at the cost of approximate recall.

Module Map

Path                        Responsibility
src/python/mod.rs           Database class — Python-facing API
src/storage/engine.rs       TurboQuantEngine — insert/search/delete orchestration
src/storage/wal.rs          Write-ahead log
src/storage/segment.rs      Immutable append-only segments
src/storage/live_codes.rs   Memory-mapped hot vector cache
src/storage/graph.rs        HNSW graph index
src/quantizer/prod.rs       ProdQuantizer — MSE + QJL orchestrator
src/quantizer/mse.rs        MseQuantizer — QR rotation + Lloyd-Max codebook
src/quantizer/qjl.rs        QjlQuantizer — 1-bit Gaussian projection, bit-packed
python/turboquantdb/rag.py  TurboQuantRetriever — LangChain-style wrapper
server/                     Optional Axum HTTP service (separate Cargo workspace)

Server Mode

Status: experimental. The server crate compiles and the core endpoints work, but it has not been hardened for production use. The embedded library (the tqdb Python package: from turboquantdb import Database) is the primary supported interface.

An optional Axum-based HTTP server is available in server/ for multi-tenant deployments. It adds API key authentication, quota enforcement, and async job management (compaction, index building, snapshots).

cd server && cargo build --release
TQ_SERVER_ADDR=0.0.0.0:8080 TQ_LOCAL_ROOT=./data ./target/release/turboquantdb-server

See server/README.md for the full endpoint reference. Key env vars:

Variable         Default          Description
TQ_SERVER_ADDR   127.0.0.1:8080   Bind address
TQ_LOCAL_ROOT    ./data           Storage root
TQ_JOB_WORKERS   2                Async job thread count

Performance Roadmap

The current implementation already uses AVX2 SIMD for the fast Walsh–Hadamard transform (FWHT), the MSE centroid scan, and the QJL bit-unpack inner product.

GPU acceleration — batch ingest would benefit from cuBLAS GEMM (~3–5× for large batches on high-end cards). The ANN search path is memory-bound, not compute-bound, so GPU benefit there is minimal; the bottleneck is random cache misses during HNSW graph traversal rather than floating-point throughput.

AVX-512 codebook scan — on modern Intel CPUs the MSE centroid lookup can be vectorised 2× wider with AVX-512, potentially halving scoring latency per batch.

Persistent HNSW — incremental graph updates (no full rebuild after each ingest batch) would allow streaming use cases without periodic create_index() calls.


Research Basis

This is an independent implementation of ideas from the TurboQuant paper. The algorithm itself was authored by the original researchers.

Zandieh, A., Daliri, M., Hadian, M., & Mirrokni, V. (2025). TurboQuant: Online Vector Quantization with Near-optimal Distortion Rate. arXiv:2504.19874

@article{zandieh2025turboquant,
  title={TurboQuant: Online Vector Quantization with Near-optimal Distortion Rate},
  author={Zandieh, Amir and Daliri, Majid and Hadian, Majid and Mirrokni, Vahab},
  journal={arXiv preprint arXiv:2504.19874},
  year={2025}
}

License

Apache License 2.0 — see LICENSE.
