
Antaris Memory

File-based persistent memory for AI agents. Zero dependencies.

Store, search, decay, and consolidate agent memories using only the Python standard library. No vector databases, no infrastructure, no API keys.

Requires Python 3.9+.

What It Does

  • Stores structured facts, decisions, and context as JSON on the local filesystem
  • Retrieval weighted by recency × importance × access frequency (Ebbinghaus-inspired decay)
  • Classifies incoming information by priority (P0–P3) and drops ephemeral content at intake
  • Detects contradictions between stored memories using deterministic rule-based comparison
  • Runs fully offline — zero network calls, zero tokens, zero API keys

What It Doesn't Do

  • Not a vector database — no embeddings (optional embedding support planned)
  • Not a knowledge graph — flat memory store with metadata indexing
  • Not semantic — contradiction detection compares normalized statements using explicit conflict rules, not inference. It will not catch contradictions phrased differently.
  • Not LLM-dependent — all operations are deterministic. No model calls, no prompt engineering.
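To make the "rule-based, not semantic" distinction concrete, here is a minimal sketch of what deterministic contradiction detection can look like. The rule below is deliberately narrow (shared tokens differing only by negation words) and is NOT the package's actual rule set:

```python
import re

# Negation words whose presence/absence flips a statement's meaning.
NEGATIONS = {"not", "no", "never"}

def tokens(statement: str) -> set:
    """Normalize: lowercase and split into a set of word tokens."""
    return set(re.findall(r"[a-z0-9]+", statement.lower()))

def contradicts(a: str, b: str) -> bool:
    """Two statements conflict when they share most of their tokens
    and differ only in negation words."""
    ta, tb = tokens(a), tokens(b)
    diff = (ta - tb) | (tb - ta)
    return len(ta & tb) >= 3 and bool(diff) and diff <= NEGATIONS

contradicts("The deploy is not blocked", "The deploy is blocked")  # True
contradicts("We chose PostgreSQL", "We chose MySQL")               # False
```

A rule set like this is fast and deterministic, but, as noted above, it cannot catch the same conflict phrased in different words.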

Design Goals

Goal                  Rationale
Deterministic         Same input → same output. No model variance.
Offline               No network, no API keys, no phoning home.
Minimal surface area  One class (MemorySystem), obvious method names.
No hidden processes   Consolidation and synthesis run only when called.
Transparent storage   Plain JSON files. Inspect with any text editor.

Install

pip install antaris-memory

Quick Start

from antaris_memory import MemorySystem

mem = MemorySystem("./workspace", half_life=7.0)
mem.load()  # Load existing state (no-op if first run)

# Store memories
mem.ingest("Decided to use PostgreSQL for the database.",
           source="meeting-notes", category="strategic")
mem.ingest("The API costs $500/month — too expensive.",
           source="review", category="operational")

# Search (results ranked by relevance × decay score)
for r in mem.search("database decision"):
    print(f"[{r.confidence:.1f}] {r.content}")

# Temporal queries
mem.on_date("2026-02-14")
mem.narrative(topic="database migration")

# Selective deletion
mem.forget(entity="John Doe")       # GDPR-ready, with audit trail
mem.forget(before_date="2025-01-01")

# Background consolidation
report = mem.consolidate()
# → duplicates found, topic clusters, contradictions, archive suggestions

mem.save()

Input Gating (P0–P3)

Classify content at intake. Low-value data never enters storage.

mem.ingest_with_gating("CRITICAL: API key compromised", source="alerts")
# → P0 (critical) → stored in strategic tier

mem.ingest_with_gating("Decided to switch to PostgreSQL", source="meeting")
# → P1 (operational) → stored in operational tier

mem.ingest_with_gating("thanks for the update!", source="chat")
# → P3 (ephemeral) → dropped, not stored

Level  Category     Stored  Examples
P0     Strategic    ✅      Security alerts, errors, deadlines, financial commitments
P1     Operational  ✅      Decisions, assignments, technical choices
P2     Tactical     ✅      Background info, research, general discussion
P3     Ephemeral    ❌      Greetings, acknowledgments, filler
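A keyword-based gate along these lines is enough to reproduce the behavior shown above. This is a hypothetical approximation; the real classifier's rules live inside antaris-memory and may differ:

```python
# Each level is matched by a set of trigger keywords, checked in
# priority order; anything unmatched falls through to P2 (tactical).
RULES = [
    ("P0", ("critical", "compromised", "security", "deadline", "outage")),
    ("P1", ("decided", "decision", "assigned", "chose")),
    ("P3", ("thanks", "hello", "got it")),
]

def classify(text: str) -> str:
    t = text.lower()
    for level, keywords in RULES:
        if any(k in t for k in keywords):
            return level
    return "P2"  # default: tactical background info

classify("CRITICAL: API key compromised")    # 'P0'
classify("Decided to switch to PostgreSQL")  # 'P1'
classify("thanks for the update!")           # 'P3'
classify("Reading about vector databases")   # 'P2'
```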

Knowledge Synthesis

Identify gaps in stored knowledge and integrate new research.

# What does the agent not know enough about?
suggestions = mem.research_suggestions(limit=5)
# → [{"topic": "token optimization", "reason": "mentioned 3x, no details", "priority": "P1"}, ...]

# Integrate external findings
report = mem.synthesize(research_results={
    "token optimization": "Context window management techniques..."
})

Memory Decay

Memories fade over time unless reinforced by access:

score = importance × 2^(-age / half_life) + reinforcement
  • Fresh memories score high
  • Unused memories decay toward zero
  • Accessed memories are automatically reinforced
  • Below-threshold memories are candidates for compression
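The formula above in executable form, with age and half-life measured in days (parameter names are illustrative; the package's actual signature may differ):

```python
from datetime import datetime, timedelta
from typing import Optional

def decay_score(importance: float, created: datetime,
                half_life: float = 7.0, reinforcement: float = 0.0,
                now: Optional[datetime] = None) -> float:
    """score = importance * 2^(-age / half_life) + reinforcement"""
    now = now or datetime.now()
    age = (now - created).total_seconds() / 86400  # age in days
    return importance * 2 ** (-age / half_life) + reinforcement

now = datetime(2026, 2, 15)
decay_score(1.0, now - timedelta(days=7), now=now)   # 0.5  (one half-life old)
decay_score(1.0, now - timedelta(days=14), now=now)  # 0.25 (two half-lives old)
decay_score(1.0, now - timedelta(days=14), reinforcement=0.3, now=now)  # ≈ 0.55
```

Each access adds reinforcement, so a frequently retrieved memory can outscore a fresher but untouched one.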

Consolidation

Run periodically to maintain memory health:

report = mem.consolidate()
  • Finds and merges near-duplicate memories
  • Discovers topic clusters
  • Flags contradictions (deterministic, rule-based)
  • Suggests memories for archival
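The near-duplicate step can be sketched with nothing but the standard library; difflib's similarity ratio here stands in for whatever measure the package actually uses:

```python
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicates(contents, threshold=0.9):
    """Return pairs of memory contents whose normalized text is nearly
    identical. Illustrative only; not the package's merge logic."""
    pairs = []
    for a, b in combinations(contents, 2):
        ratio = SequenceMatcher(None, a.lower().strip(),
                                b.lower().strip()).ratio()
        if ratio >= threshold:
            pairs.append((a, b))
    return pairs

near_duplicates([
    "Decided to use PostgreSQL for the database.",
    "Decided to use PostgreSQL for the database",
    "The API costs $500/month.",
])
# → one pair: the two PostgreSQL statements
```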

Storage Format

All state is stored in a single memory_metadata.json file:

{
  "version": "0.2.0",
  "saved_at": "2026-02-15T14:30:00",
  "count": 10938,
  "memories": [
    {
      "hash": "a1b2c3d4e5f6",
      "content": "Decided to use PostgreSQL",
      "source": "meeting-notes",
      "category": "strategic",
      "created": "2026-02-15T10:00:00",
      "importance": 1.0,
      "confidence": 0.8,
      "sentiment": {"strategic": 0.6},
      "tags": ["postgresql", "deployment"]
    }
  ]
}

Deletions are logged to memory_audit.json for compliance.
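Because the state is plain JSON, it can be audited without the package at all. A sketch assuming the layout shown above:

```python
import json
from pathlib import Path

path = Path("./workspace/memory_metadata.json")
if path.exists():
    state = json.loads(path.read_text())
    # Count memories in the strategic tier.
    strategic = [m for m in state["memories"]
                 if m.get("category") == "strategic"]
    print(f"{state['count']} memories, {len(strategic)} strategic")
```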

Storage format may evolve between versions. Breaking changes will increment MAJOR version. See CHANGELOG.

Architecture

MemorySystem
├── InputGate          — P0-P3 classification at intake
├── DecayEngine        — Ebbinghaus forgetting curves
├── SentimentTagger    — Rule-based keyword tone tagging
├── TemporalEngine     — Date queries and narrative building
├── ConfidenceEngine   — Reliability scoring
├── CompressionEngine  — Old file summarization
├── ForgettingEngine   — Selective deletion with audit
├── ConsolidationEngine — Dedup, clustering, contradiction detection
└── KnowledgeSynthesizer — Gap identification and research integration

Data flow: ingest → classify (P0-P3) → normalize → persist → search → decay-weight → return
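The data flow above can be sketched end to end; the function names, the toy gate, and the record layout are all hypothetical stand-ins for the package's internals:

```python
from datetime import datetime

def classify(text: str) -> str:
    """Toy P0-P3 gate: greetings are ephemeral, everything else P1."""
    return "P3" if text.lower().startswith(("thanks", "hi", "ok")) else "P1"

def normalize(text: str, source: str, level: str) -> dict:
    return {"content": text.strip(), "source": source,
            "level": level, "created": datetime.now().isoformat()}

STORE = []  # stands in for the persisted JSON file

def ingest(text: str, source: str):
    level = classify(text)    # 1. classify at intake
    if level == "P3":         # 2. ephemeral content never enters storage
        return None
    record = normalize(text, source, level)
    STORE.append(record)      # 3. persist
    return record

ingest("thanks!", "chat")                       # dropped
ingest("Decided to use PostgreSQL", "meeting")  # stored
len(STORE)                                      # 1
```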

Zero Dependencies

The core package uses only the Python standard library. Optional integrations (LLMs, embeddings) are deliberately excluded to preserve deterministic behavior and eliminate runtime requirements.

Comparison

Feature                  Antaris Memory          LangChain Memory  Mem0                 Zep
Input gating             ✅ P0–P3                —                 —                    —
Knowledge synthesis      ✅                      —                 —                    —
No database required     ✅                      —                 —                    —
Memory decay             ✅ Ebbinghaus           —                 —                    ⚠️ Temporal graphs
Tone tagging             ✅ Rule-based keywords  —                 —                    ✅ NLP
Temporal queries         ✅                      —                 —                    —
Contradiction detection  ✅ Rule-based           —                 ⚠️ Fact evolution    —
Selective forgetting     ✅ With audit           —                 ⚠️ Invalidation      ⚠️ Invalidation
Infrastructure needed    None                    Redis/PG          Vector + KV + Graph  PostgreSQL + Vector

License

Licensed under the Apache License 2.0. See LICENSE for details.

Download files


Source Distribution

antaris_memory-0.3.0.tar.gz (36.3 kB)

Built Distribution

antaris_memory-0.3.0-py3-none-any.whl (32.3 kB)

File details

Details for the file antaris_memory-0.3.0.tar.gz.

File metadata

  • Download URL: antaris_memory-0.3.0.tar.gz
  • Size: 36.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for antaris_memory-0.3.0.tar.gz
Algorithm Hash digest
SHA256 4b179c8eaac26d35b75bebb3098329c7346e0d2ab6371455956892be972717ec
MD5 d0956a1137227eb4a7bb99ac46685d2b
BLAKE2b-256 29ec54aea0bf45269b6d4a4b0898a890987b7e3236e72f8df303fd2886325f20


File details

Details for the file antaris_memory-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: antaris_memory-0.3.0-py3-none-any.whl
  • Size: 32.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for antaris_memory-0.3.0-py3-none-any.whl
Algorithm Hash digest
SHA256 cad55307eceea80c960c1e7d56d3c5a69309f98c2e6bedf89cde993c256c3767
MD5 0f61136909d68f01184e15811f611a03
BLAKE2b-256 208000096753dbc2c2e5b8c6b3d169a58c8931ba7feb6693544792d64c91de49

