A brain for AI agents that actually learns — confidence scoring, negative results, gap detection. pip install attestdb && attest brain install
AttestDB
The truth layer for AI agents. Every fact has a source, a confidence score, and gets flagged when the source is wrong.
pip install attestdb
The Problem
Every AI system has a trust problem. Agents hallucinate. Data sources contradict each other. Nobody can trace where a "fact" came from or when it went stale.
Vector databases find similar text. Graph databases store edges. Neither tracks whether what they're storing is still true.
AttestDB stores claims — facts with sources, confidence scores, and timestamps. When a source is wrong, one call retracts it and everything downstream is flagged automatically. When two sources disagree, both claims coexist until evidence resolves them. When data goes stale, the system knows.
See It Work
```python
from attestdb import AttestDB

db = AttestDB("demo.attest", embedding_dim=None)

# Ingest claims from different sources
db.ingest(
    subject=("api-gateway", "service"),
    predicate=("depends_on", "relates_to"),
    object=("redis", "service"),
    provenance={"source_id": "k8s-manifest-v2.3", "source_type": "infrastructure"},
    confidence=1.0,
)
db.ingest(
    subject=("api-gateway", "service"),
    predicate=("handles", "relates_to"),
    object=("10k-requests-per-second", "metric"),
    provenance={"source_id": "load-test-march-2024", "source_type": "benchmark"},
    confidence=0.95,
)

# Query the knowledge graph
frame = db.query("api-gateway", depth=2)
for rel in frame.direct_relationships:
    print(f"  {rel.predicate} → {rel.target.name} (conf: {rel.confidence})")

# The load test had a bug. Retract everything from that source.
result = db.retract("load-test-march-2024", reason="Bug in test harness")
print(f"Retracted {result.retracted_count} claims")
# Claims from k8s-manifest-v2.3 are untouched. Only the bad source is gone.

# Time travel — see what the graph looked like before the retraction
import time

snapshot = db.at(int(time.time() * 1e9) - int(60 * 1e9))  # 1 minute ago
old_frame = snapshot.query("api-gateway", depth=1)
```
Why Not a Vector DB?
| | AttestDB | Pinecone / Weaviate | Neo4j | PostgreSQL |
|---|---|---|---|---|
| Atomic unit | Sourced claim | Vector embedding | Edge | Row |
| Provenance | Required on every write | Optional metadata | Optional property | Not built-in |
| Retraction cascade | Automatic | — | Manual | Manual |
| Contradiction handling | Evidence-weighted | Last write wins | Last write wins | Last write wins |
| Confidence scoring | Built-in (0–1) | Similarity score | — | — |
| Query latency | ~12µs | ~10ms | ~5ms | ~1ms |
| MCP tools | 106 | — | — | — |
Features
Retraction Cascades
One bad source? One call fixes everything downstream.
```python
result = db.retract("buggy-sensor-feed", reason="Calibration error discovered")
# Every claim from that source is retracted.
# Claims corroborated by independent sources survive.
```
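The cascade semantics can be pictured in a few lines of plain Python. This is an illustrative model, not AttestDB's internals: a fact stays live as long as at least one non-retracted source still asserts it.

```python
# Illustrative model of retraction-cascade semantics (not AttestDB internals):
# retracting a source drops its claims, but content that another active
# source also asserts survives.
claims = [
    {"content": "api-gateway depends_on redis", "source": "k8s-manifest-v2.3"},
    {"content": "api-gateway handles 10k-rps", "source": "load-test-march-2024"},
    {"content": "api-gateway depends_on redis", "source": "load-test-march-2024"},
]

def retract(claims, source_id):
    """Drop every claim from source_id; corroborated content survives elsewhere."""
    return [c for c in claims if c["source"] != source_id]

remaining = retract(claims, "load-test-march-2024")
live_contents = {c["content"] for c in remaining}
print(live_contents)  # the corroborated dependency fact is still live
```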
Corroboration
The same fact from three independent sources is stronger than one source at high confidence.
```python
# Same fact, different sources — confidence compounds
db.ingest(
    subject=("acme-corp", "company"),
    predicate=("headquartered_in", "relates_to"),
    object=("san-francisco", "city"),
    provenance={"source_id": "crunchbase", "source_type": "database"},
    confidence=0.9,
)
db.ingest(
    subject=("acme-corp", "company"),
    predicate=("headquartered_in", "relates_to"),
    object=("san-francisco", "city"),
    provenance={"source_id": "linkedin", "source_type": "database"},
    confidence=0.85,
)
report = db.corroboration_report(min_sources=2)
```
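One common way confidence compounds across independent sources is noisy-OR: the fact is wrong only if every source is wrong. This is a hedged sketch of that idea; AttestDB's actual combination rule may differ.

```python
# Hedged sketch: noisy-OR combination of independent source confidences.
# (Illustrative; not necessarily AttestDB's exact formula.)
def combine(confidences):
    p_all_wrong = 1.0
    for c in confidences:
        p_all_wrong *= (1.0 - c)
    return 1.0 - p_all_wrong

print(combine([0.9, 0.85]))  # 0.985 — two sources beat either alone
```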
Time Travel
Query any point in the past. No backup restores.
```python
import time

# What did we know last Tuesday? db.at() takes a nanosecond timestamp;
# 7 days ago is used here as a stand-in for "last Tuesday".
last_tuesday_ns = int((time.time() - 7 * 86400) * 1e9)
snapshot = db.at(last_tuesday_ns)
frame = snapshot.query("customer-123", depth=2)
```
Graph Traversal
BFS path finding, profiling, and explanation.
```python
# Find connections between entities
paths = db.find_paths("drug-a", "disease-b", max_depth=3, top_k=5)
for path in paths:
    print(f"Confidence: {path.total_confidence}")

# Profile a query
frame, profile = db.explain("BRCA1", depth=2)
print(f"Query: {profile.elapsed_ms:.1f}ms")
```
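The path search above can be pictured as bounded BFS where a path's score is the product of its edge confidences. This is a sketch under that assumption; the toy graph, entity names, and scoring rule are illustrative, not AttestDB's implementation.

```python
from collections import deque

# Toy adjacency of (target, confidence) edges. Illustrative data only.
graph = {
    "drug-a": [("protein-x", 0.9)],
    "protein-x": [("pathway-y", 0.8), ("disease-b", 0.6)],
    "pathway-y": [("disease-b", 0.95)],
}

def find_paths(start, goal, max_depth=3, top_k=5):
    """Enumerate simple paths with bounded BFS; score = product of confidences."""
    paths, queue = [], deque([([start], 1.0)])
    while queue:
        path, conf = queue.popleft()
        node = path[-1]
        if node == goal:
            paths.append((path, conf))
            continue
        if len(path) > max_depth:  # path of N nodes has N-1 edges
            continue
        for nxt, c in graph.get(node, []):
            if nxt not in path:  # avoid cycles
                queue.append((path + [nxt], conf * c))
    paths.sort(key=lambda p: p[1], reverse=True)
    return paths[:top_k]

for path, conf in find_paths("drug-a", "disease-b"):
    print(" → ".join(path), f"(confidence {conf:.3f})")
```

The longer route through pathway-y scores higher here (0.9 × 0.8 × 0.95 = 0.684) than the direct low-confidence edge (0.9 × 0.6 = 0.54), which is why path scoring matters.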
Batch Ingestion
Load millions of claims from any source.
```python
from attestdb.core.types import ClaimInput

claims = [
    ClaimInput(
        subject=("entity-a", "type"),
        predicate=("rel", "type"),
        object=("entity-b", "type"),
        provenance={"source_id": "dataset-v3", "source_type": "bulk"},
        confidence=0.88,
    )
    for _ in range(100_000)
]
result = db.ingest_batch(claims)
print(f"Ingested: {result.ingested}, Duplicates: {result.duplicates}")
```
Tamper-Evident Audit
Merkle hash chain on every write. Tamper-evident by construction.
```python
schema = db.schema()
print(f"{schema.total_claims} claims, {schema.total_entities} entities")
print(f"Entity types: {schema.entity_types}")
```
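The hash-chain idea behind the tamper evidence can be sketched in stdlib Python. This is an illustrative model, not the engine's on-disk format: each write's hash covers the previous hash, so editing any historical entry breaks every hash after it.

```python
import hashlib

# Illustrative hash chain (not AttestDB's exact scheme): h[i] depends on h[i-1],
# so any edit to history invalidates all later hashes.
def chain(entries):
    h, hashes = b"\x00" * 32, []
    for entry in entries:
        h = hashlib.sha256(h + entry.encode()).digest()
        hashes.append(h)
    return hashes

original = chain(["claim-1", "claim-2", "claim-3"])
tampered = chain(["claim-1", "claim-2-edited", "claim-3"])
print(original[0] == tampered[0])  # True: history before the edit matches
print(original[2] == tampered[2])  # False: every hash after the edit differs
```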
Give Your AI Agent a Brain
Three commands. Your agent remembers bugs, patterns, and dead ends across sessions.
```shell
pip install attestdb
attestdb mcp-config
# Restart Claude Code — your agent now has persistent memory
```
Works with Claude Code, Cursor, Windsurf, Codex, and any MCP-compatible agent. 106 tools available out of the box — search, ingest, retract, verify, predict, and more.
Under the Hood
- Ingestion: 1.3M claims/sec (Rust engine, single-threaded)
- Query latency: ~12µs indexed lookups (in-memory), ~122µs LMDB
- Storage: single file — no server, no config, like SQLite
- Connectors: 30 (Slack, GitHub, Gmail, Jira, Salesforce, Postgres, etc.)
- MCP tools: 106
- Production: 85M+ claims, 13M+ entities on the reference database
Rust engine. The storage layer is written in Rust (LMDB via heed), exposed to Python through PyO3 bindings. Atomic writes, file locking, CRC32 crash recovery. The Rust crate (attest-py) ships as pre-built wheels for Linux, macOS, and Windows.
Dual ID system. Every claim gets two hashes: claim_id (unique per assertion — includes source and timestamp) and content_id (groups the same fact across sources — enables corroboration).
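The dual-ID split can be sketched with two hash functions. The field choices and JSON encoding here are illustrative assumptions, not AttestDB's actual hashing scheme; the point is that content_id ignores source and time while claim_id includes them.

```python
import hashlib
import json

# Hedged sketch of the dual-ID idea (field selection is illustrative).
def content_id(subject, predicate, obj):
    """Same fact → same content_id, regardless of who asserted it."""
    return hashlib.sha256(json.dumps([subject, predicate, obj]).encode()).hexdigest()

def claim_id(subject, predicate, obj, source_id, timestamp_ns):
    """Unique per assertion: the hash also covers source and time."""
    return hashlib.sha256(
        json.dumps([subject, predicate, obj, source_id, timestamp_ns]).encode()
    ).hexdigest()

fact = ("acme-corp", "headquartered_in", "san-francisco")
a = claim_id(*fact, "crunchbase", 1)
b = claim_id(*fact, "linkedin", 2)
print(a != b)                                  # distinct assertions
print(content_id(*fact) == content_id(*fact))  # same fact groups for corroboration
```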
13 validation rules on every write. Provenance is structural — the engine rejects writes without a valid source chain.
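Structural validation of this kind might look like the sketch below. The three rules shown are illustrative stand-ins (the engine enforces 13, and in Rust); the shape of the check is the point: a write with no valid source chain never reaches storage.

```python
# Illustrative structural validation: reject writes without provenance.
# (Three sample rules; not AttestDB's actual rule set.)
def validate(claim: dict) -> None:
    prov = claim.get("provenance") or {}
    if not prov.get("source_id"):
        raise ValueError("rejected: provenance.source_id is required")
    if not prov.get("source_type"):
        raise ValueError("rejected: provenance.source_type is required")
    conf = claim.get("confidence")
    if conf is None or not (0.0 <= conf <= 1.0):
        raise ValueError("rejected: confidence must be in [0, 1]")

validate({"provenance": {"source_id": "crunchbase", "source_type": "database"},
          "confidence": 0.9})  # passes silently
try:
    validate({"confidence": 0.9})  # no provenance
except ValueError as e:
    print(e)
```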
Architecture
```
attestdb/
  core/            — Types, normalization (locked), hashing (locked), confidence, vocabulary
  infrastructure/  — AttestDB, ingestion, query engine, embedding index, migration
  rust/
    attest-core/   — Locked invariants in Rust (normalization, hashing, types)
    attest-store/  — LMDB storage engine (append-only claim log, entity store, indexes)
    attest-py/     — PyO3 bindings via maturin
  tests/
    unit/          — Unit tests
    integration/   — Integration tests
    cross_lang/    — Golden vectors (Python ↔ Rust verification)
```
Entity normalization is locked and identical across Python and Rust: NFKD unicode → lowercase → collapse whitespace → Greek letters spelled out. 118 golden test vectors verify cross-language consistency.
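The four-step pipeline above can be sketched in stdlib Python. The Greek-letter table here is abbreviated and illustrative; the real locked implementation lives in attest-core and is verified by the golden vectors.

```python
import re
import unicodedata

# Abbreviated, illustrative Greek mapping (the real table is larger).
GREEK = {"α": "alpha", "β": "beta", "γ": "gamma", "δ": "delta"}

def normalize(name: str) -> str:
    s = unicodedata.normalize("NFKD", name)        # 1. NFKD unicode
    s = s.lower()                                  # 2. lowercase
    s = re.sub(r"\s+", " ", s).strip()             # 3. collapse whitespace
    return "".join(GREEK.get(ch, ch) for ch in s)  # 4. spell out Greek letters

print(normalize("TGF-β   Signaling"))  # tgf-beta signaling
```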
Pricing
| | Open Source | Cloud ($49/mo) | Team ($249/mo) | Enterprise (from $2,500/mo) |
|---|---|---|---|---|
| Engine | Full Rust engine | Full Rust engine | Full Rust engine | Full Rust engine |
| Claims | Unlimited | 500K | 10M | Unlimited |
| Queries | Unlimited | 500K | 10M | Unlimited |
| Storage | Unlimited | 5 GB | 100 GB | Unlimited |
| Connectors | All 30 | All 30 | All 30 | All 30 |
| MCP tools | 106 | 106 | 106 | 106 |
| Living Database | — | — | Freshness, composites, auto-discovery | Everything in Team |
| RBAC / SSO | — | — | — | SSO/SAML, claim-level ACL |
| Support | Community | | Slack (priority) | Dedicated engineer |
The open-source install is the full product — not a demo, not time-limited. Same engine that runs in production.
Links
- Documentation — API reference, connectors, MCP tools
- Quick Start — Up and running in 60 seconds
- Live Demo — 85M+ claims, query anything
- Enterprise — Auto-discovery, entity resolution, ACL
- Manifesto — Why we built this
Contributing
We welcome contributions! See CONTRIBUTING.md for guidelines.
The intelligence layer (LLM-powered features, connectors) is available separately as attestdb-enterprise. The open-source repo contains the full engine, core types, ingestion pipeline, query engine, and all Rust code.
License
Apache License, Version 2.0. Use it, modify it, ship it in your own products — commercial or otherwise. See LICENSE for the full text. Prior releases (≤ 0.1.42) were distributed under the Business Source License 1.1 with Apache-2.0 as the scheduled change license.
Project details
Download files
Source Distribution
Built Distribution
File details

Details for the file attestdb-0.2.0.tar.gz.

- Download URL: attestdb-0.2.0.tar.gz
- Size: 791.9 kB
- Tags: Source
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | 7d952e880c995593faf69173e919204607c38fad71529a586d8adbbf3f5c42dc |
| MD5 | 1346dcdbb627b1336624849b9c1d553c |
| BLAKE2b-256 | b26d39f5ece52c79b262fb432f2e9d0604a891e3996134d909818d9a2c4ac411 |
Provenance

The following attestation bundles were made for attestdb-0.2.0.tar.gz:

Publisher: publish.yml on omic/attest

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: attestdb-0.2.0.tar.gz
- Subject digest: 7d952e880c995593faf69173e919204607c38fad71529a586d8adbbf3f5c42dc
- Sigstore transparency entry: 1360730148
- Permalink: omic/attest@ae3b78d57418dfd87b9dfc38f6c63f330f84f6d4
- Branch / Tag: refs/tags/v0.2.0
- Owner: https://github.com/omic
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@ae3b78d57418dfd87b9dfc38f6c63f330f84f6d4
- Trigger Event: release
File details

Details for the file attestdb-0.2.0-py3-none-any.whl.

- Download URL: attestdb-0.2.0-py3-none-any.whl
- Size: 644.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing: Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | 6193653a5647d39a1327fd2b8ce8de3a92fdf62c993769da2df52875bd9e55b4 |
| MD5 | 15c121a806c2881198025abbb4141562 |
| BLAKE2b-256 | d70d8faef28a838a3e382d342eece7aa6be9a2cb91817e410a16bc35c9162d3f |
Provenance

The following attestation bundles were made for attestdb-0.2.0-py3-none-any.whl:

Publisher: publish.yml on omic/attest

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: attestdb-0.2.0-py3-none-any.whl
- Subject digest: 6193653a5647d39a1327fd2b8ce8de3a92fdf62c993769da2df52875bd9e55b4
- Sigstore transparency entry: 1360730489
- Permalink: omic/attest@ae3b78d57418dfd87b9dfc38f6c63f330f84f6d4
- Branch / Tag: refs/tags/v0.2.0
- Owner: https://github.com/omic
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@ae3b78d57418dfd87b9dfc38f6c63f330f84f6d4
- Trigger Event: release