
Local-first AI memory system for robotics, drones, and edge AI - 100% offline capable


Shodh-Memory

Persistent memory for AI agents. Single package. Local-first. Runs offline.



Give your AI agents memory that persists across sessions, learns from experience, and runs entirely on your hardware.

Installation

pip install shodh-memory

That's it. No additional setup required. Models and runtime are bundled.

Quick Start

from shodh_memory import Memory

# Create memory (data stored locally)
memory = Memory(storage_path="./my_agent_data")

# Store memories
memory.remember("User prefers dark mode", memory_type="Decision")
memory.remember("JWT tokens expire after 24h", memory_type="Learning")
memory.remember("Deployment failed due to missing env var", memory_type="Error")

# Search semantically
results = memory.recall("user preferences", limit=5)
for r in results:
    print(f"{r['content']} (importance: {r['importance']:.2f})")

# Get context summary for LLM bootstrap
summary = memory.context_summary()
print(summary["decisions"])  # Recent decisions
print(summary["learnings"])  # Recent learnings

Features

  • Zero setup — Everything bundled. No API keys, no cloud, no Docker
  • Semantic search — MiniLM embeddings for meaning-based retrieval
  • Hebbian learning — Connections strengthen when memories are used together
  • Activation decay — Unused memories fade naturally
  • Idempotent — Content-hash dedup prevents duplicate memories
  • Entity extraction — TinyBERT NER extracts people, orgs, locations
  • 100% offline — Works on air-gapped systems
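
The idempotency guarantee rests on content hashing: storing the same content twice yields one memory. A minimal sketch of the idea in plain Python (illustrative only, not the library's internals; `DedupStore` is a made-up name):

```python
import hashlib

class DedupStore:
    """Toy content-hash dedup: identical content maps to a single entry."""
    def __init__(self):
        self._by_hash = {}

    def remember(self, content: str) -> str:
        # Hash the content; re-storing identical text returns the same key.
        key = hashlib.sha256(content.encode("utf-8")).hexdigest()
        self._by_hash.setdefault(key, content)
        return key

store = DedupStore()
a = store.remember("User prefers dark mode")
b = store.remember("User prefers dark mode")
assert a == b and len(store._by_hash) == 1  # second call is a no-op
```

The same property is what makes `memory.remember()` safe to call from retry loops.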

Memory Types

Different types get different importance weights:

Type          Weight   Use for
Decision      +0.30    Choices, preferences, conclusions
Learning      +0.25    New knowledge acquired
Error         +0.25    Mistakes to avoid
Discovery     +0.20    Findings, insights
Pattern       +0.20    Recurring behaviors
Task          +0.15    Work items
Context       +0.10    General information
Conversation  +0.10    Chat history
Observation   +0.05    Low-priority notes
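
The page does not spell out how a type weight becomes a final importance score; one plausible model (a base score plus the per-type weight, capped at 1.0; the base and the formula are assumed, only the weights come from the table) looks like:

```python
# Per-type weights from the table above.
TYPE_WEIGHTS = {
    "Decision": 0.30, "Learning": 0.25, "Error": 0.25,
    "Discovery": 0.20, "Pattern": 0.20, "Task": 0.15,
    "Context": 0.10, "Conversation": 0.10, "Observation": 0.05,
}

def importance(memory_type: str, base: float = 0.5) -> float:
    # Hypothetical: a neutral base score boosted by the type weight.
    return min(1.0, base + TYPE_WEIGHTS.get(memory_type, 0.0))

print(round(importance("Decision"), 2))     # highest-weighted type
print(round(importance("Observation"), 2))  # lowest-weighted type
```

Under this model a Decision outranks an Observation by 0.25, which matches the ordering you see in `recall()` results.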

API Reference

Core Memory

# Store a memory
memory.remember(
    content="...",           # Required: the memory content
    memory_type="Learning",  # Optional: Decision, Learning, Error, etc.
    tags=["tag1", "tag2"],   # Optional: for filtering
    metadata={"key": "val"}  # Optional: custom metadata dict
)

# Semantic search
results = memory.recall(
    query="...",             # Required: search query
    limit=10,                # Optional: max results (default: 10)
    mode="hybrid"            # Optional: semantic, associative, hybrid
)

# Search by tags (no embedding needed, fast)
results = memory.recall_by_tags(tags=["preferences", "ui"], limit=20)

# Search by date range
results = memory.recall_by_date(
    start="2025-12-01T00:00:00Z",
    end="2025-12-20T23:59:59Z",
    limit=20
)

# List all memories
memories = memory.list_memories(limit=100, memory_type="Decision")

# Get single memory by ID
mem = memory.get_memory("uuid-here")

# Get statistics
stats = memory.get_stats()
print(f"Total: {stats['total_memories']}")
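
The hybrid recall mode presumably blends semantic similarity with associative (graph) strength. A toy scoring function under that assumption; the linear blend and the `alpha` parameter are invented here, not taken from the library:

```python
import math

def cosine(a, b):
    # Standard cosine similarity between two dense vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def hybrid_score(query_vec, mem_vec, assoc_strength, alpha=0.7):
    # Hypothetical blend: weighted sum of embedding similarity and
    # association strength from the knowledge graph, both in [0, 1].
    return alpha * cosine(query_vec, mem_vec) + (1 - alpha) * assoc_strength

# A weakly similar but strongly associated memory can outrank
# a moderately similar, unconnected one.
print(hybrid_score([1.0, 0.0], [0.6, 0.8], assoc_strength=0.9))
print(hybrid_score([1.0, 0.0], [0.8, 0.6], assoc_strength=0.0))
```

This is the intuition behind offering `semantic`, `associative`, and `hybrid` as separate modes: each corresponds to a different weighting of the two signals.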

Proactive Context (for Agent Loops)

# Surface relevant memories for current context
# Use in every agent loop to maintain context awareness
result = memory.proactive_context(
    context="User asking about authentication",  # Current conversation/task
    semantic_threshold=0.65,                      # Min similarity (0.0-1.0)
    max_results=5,                                # Max memories to return
    auto_ingest=True,                             # Store context as Conversation memory
    recency_weight=0.2                            # Boost recent memories
)

# Returns surfaced memories with relevance scores
for mem in result["memories"]:
    print(f"{mem['content'][:50]} (score: {mem['relevance_score']:.2f})")
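
How `recency_weight` enters the relevance score is not documented; a plausible sketch mixes an exponential recency bonus with semantic similarity (the half-life and the mixing formula are assumptions):

```python
import math

def relevance(similarity, age_hours, recency_weight=0.2, half_life_hours=24.0):
    # Hypothetical: recency bonus halves every `half_life_hours`,
    # then is linearly blended with the semantic similarity.
    recency = math.exp(-math.log(2) * age_hours / half_life_hours)
    return (1 - recency_weight) * similarity + recency_weight * recency

fresh = relevance(0.8, age_hours=0.0)    # just stored
stale = relevance(0.8, age_hours=48.0)   # two days old
print(round(fresh, 2), round(stale, 2))
```

With `recency_weight=0` the score reduces to pure similarity; raising it lets recently stored memories edge out older, equally similar ones.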

Forget Operations

# Delete by ID
memory.forget("memory-uuid")

# Delete old memories
memory.forget_by_age(days=30)

# Delete low-importance memories
memory.forget_by_importance(threshold=0.3)

# Delete by pattern (regex)
memory.forget_by_pattern(r"test.*")

# Delete by tags
memory.forget_by_tags(["temporary", "draft"])

# Delete by date range (ISO 8601 format)
memory.forget_by_date(start="2025-11-01T00:00:00Z", end="2025-11-30T23:59:59Z")

# GDPR: Delete everything
memory.forget_all()

Context & Introspection

# Context summary for LLM bootstrap
summary = memory.context_summary(max_items=5)
# Returns: {"decisions": [...], "learnings": [...], "context": [...], "patterns": [...]}

# 3-tier memory visualization
state = memory.brain_state(longterm_limit=100)
# Returns: {"working_memory": [...], "session_memory": [...], "longterm_memory": [...], "stats": {...}}

# Memory learning activity report
report = memory.consolidation_report(since="2025-12-19T00:00:00Z")
# Returns: strengthening events, decay events, edge formations, pruned associations

# Raw consolidation events
events = memory.consolidation_events(since="2025-12-19T00:00:00Z")

# Knowledge graph statistics
graph = memory.graph_stats()
print(f"Nodes: {graph['node_count']}, Edges: {graph['edge_count']}")

# Flush to disk
memory.flush()

Index Health & Maintenance

# Verify vector index integrity
report = memory.verify_index()
print(f"Healthy: {report['is_healthy']}, Orphaned: {report['orphaned_count']}")

# Repair orphaned memories (re-index missing entries)
result = memory.repair_index()
print(f"Repaired: {result['repaired']}, Failed: {result['failed']}")

# Get detailed index health metrics
health = memory.index_health()
print(f"Vectors: {health['total_vectors']}, Needs rebuild: {health['needs_rebuild']}")

LLM Framework Integration

LangChain

from shodh_memory.integrations.langchain import ShodhMemory

# Use as LangChain memory
memory = ShodhMemory(storage_path="./langchain_data")

LlamaIndex

from shodh_memory.integrations.llamaindex import ShodhLlamaMemory

# Use as LlamaIndex memory
memory = ShodhLlamaMemory(storage_path="./llamaindex_data")

Performance

Measured on an Intel i7-1355U (10 cores, 1.7 GHz):

Operation            Latency
remember()           55-60 ms
recall() (semantic)  34-58 ms
recall_by_tags()     ~1 ms
list_memories()      ~1 ms

Architecture

Experiences flow through three tiers based on Cowan's working memory model:

Working Memory ──overflow──> Session Memory ──importance──> Long-Term Memory
   (100 items)                  (500 MB)                      (RocksDB)
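
A toy model of the overflow path, assuming oldest-first eviction from working memory (the library's actual eviction policy and the session tier's size-based bound may differ):

```python
from collections import deque

WORKING_CAPACITY = 100  # tier 1 bound from the diagram above

working = deque()  # tier 1: small, bounded by item count
session = []       # tier 2: bounded by size (500 MB) in the real system

def add_experience(item):
    working.append(item)
    if len(working) > WORKING_CAPACITY:
        # Oldest working-memory item overflows into session memory.
        session.append(working.popleft())

for i in range(105):
    add_experience(f"event-{i}")

print(len(working), len(session))  # 100 5
```

Promotion from session to long-term storage is gated on importance rather than overflow, per the diagram.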

Cognitive processing:

  • Spreading activation retrieval
  • Activation decay (exponential)
  • Hebbian strengthening (co-retrieval strengthens connections)
  • Long-term potentiation (frequently-used connections become permanent)
  • Memory replay during maintenance
  • Interference detection
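
The decay and strengthening rules above can be sketched like this; the decay constant, the strengthening increment, and the LTP threshold are all assumed values, not the library's:

```python
import math

class Edge:
    """Toy association between two memories."""
    def __init__(self):
        self.strength = 0.1
        self.permanent = False

    def co_retrieve(self, delta=0.05, ltp_threshold=0.8):
        # Hebbian strengthening: co-retrieval bumps the connection.
        self.strength = min(1.0, self.strength + delta)
        if self.strength >= ltp_threshold:
            # Long-term potentiation: heavily used edges stop decaying.
            self.permanent = True

    def decay(self, dt_hours, lam=0.01):
        # Exponential decay of unused, non-permanent connections.
        if not self.permanent:
            self.strength *= math.exp(-lam * dt_hours)

e = Edge()
for _ in range(20):
    e.co_retrieve()
print(e.permanent)  # True
```

An edge that is never co-retrieved only ever decays, so rarely used associations fade toward zero while habitual ones become permanent.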

Platform Support

Platform                     Status
Windows x86_64               Supported
Linux x86_64                 Supported
macOS ARM64 (Apple Silicon)  Supported
macOS x86_64 (Intel)         Supported


License

Apache 2.0

Download files

Download the file for your platform.

Source Distribution

shodh_memory-0.2.0.tar.gz (1.1 MB)

Uploaded: Source

Built Distributions


shodh_memory-0.2.0-cp38-abi3-win_amd64.whl (41.0 MB)

Uploaded: CPython 3.8+, Windows x86-64

shodh_memory-0.2.0-cp38-abi3-manylinux_2_28_x86_64.whl (45.0 MB)

Uploaded: CPython 3.8+, manylinux (glibc 2.28+), x86-64

shodh_memory-0.2.0-cp38-abi3-macosx_11_0_arm64.whl (44.7 MB)

Uploaded: CPython 3.8+, macOS 11.0+, ARM64

shodh_memory-0.2.0-cp38-abi3-macosx_10_13_x86_64.whl (47.1 MB)

Uploaded: CPython 3.8+, macOS 10.13+, x86-64

File details

Details for the file shodh_memory-0.2.0.tar.gz.

File metadata

  • Download URL: shodh_memory-0.2.0.tar.gz
  • Upload date:
  • Size: 1.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for shodh_memory-0.2.0.tar.gz
Algorithm Hash digest
SHA256 5c777f3129d06de9873a9f1848cede6826c599f35fab11ea9c97b927666b5e45
MD5 159cc91f0288070635b9d0a49f2b87e0
BLAKE2b-256 3168b514f7096508fce8dee7d4e1aa0f0ee1c4b402bf2c042510f072a2f62c99


File details

Details for the file shodh_memory-0.2.0-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: shodh_memory-0.2.0-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 41.0 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for shodh_memory-0.2.0-cp38-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 28988ebcbaca26119163ff4798bee0da73bdc58ee768e481b47e16bfa7194e94
MD5 83fbc0f90cbb1da964b83cd00266819b
BLAKE2b-256 f063150e945c972ae4cbb592ab47edc45a4581020e5063bdb5080076cda957c2


File details

Details for the file shodh_memory-0.2.0-cp38-abi3-manylinux_2_28_x86_64.whl.

File hashes

Hashes for shodh_memory-0.2.0-cp38-abi3-manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 76640f23271b72a3181d2f83799fed75ab8dbd62cbb9f2a43164990a2a333612
MD5 7b0089ac4e2f01e2647a4c6f78802194
BLAKE2b-256 082a1a99845338609aa03ca88eeb1ee56d7664950c5ee77ead4b4e56bdfa55f8


File details

Details for the file shodh_memory-0.2.0-cp38-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for shodh_memory-0.2.0-cp38-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 1c6a6dd9a457cec4c48ca46e337f2bd25b6d8b96c93873e59b4ece5bf64e225a
MD5 b0e0c9efb0a41edbc6d87368a8af2911
BLAKE2b-256 cbafd69aebc0001cd38724ec30c6894e1ff698580c35f54514cd641c32668bff


File details

Details for the file shodh_memory-0.2.0-cp38-abi3-macosx_10_13_x86_64.whl.

File hashes

Hashes for shodh_memory-0.2.0-cp38-abi3-macosx_10_13_x86_64.whl
Algorithm Hash digest
SHA256 d7722b69692d3fbd7ebf23d1323759fe73b12dbf58be1749b901ecd726de7532
MD5 0426d781f063a4dd3cbe5829130f1e1e
BLAKE2b-256 55bcac7a8fc864406d4ee85b4e5b373ca4620f2fec619d7215dedd0f7b2db4ad

