

OpenMemory Python SDK

VS Code Extension · Report Bug · Request Feature · Discord

Local-first long-term memory engine for AI apps and agents. Self-hosted. Explainable. Scalable.

Quick Start

pip install openmemory-py

from openmemory import OpenMemory

mem = OpenMemory(
    path='./data/memory.sqlite',
    tier='fast',
    embeddings={
        'provider': 'synthetic'  # or 'openai', 'gemini', 'ollama'
    }
)

mem.add("I'm building a Django app with OpenMemory")
results = mem.query("What am I building?")
print(results)

That's it. You're now running a fully local cognitive memory engine 🎉


Features

Local-first - Runs entirely on your machine; no external services required
Multi-sector memory - Episodic, Semantic, Procedural, Emotional, Reflective
Temporal knowledge graph - Time-aware facts with validity periods
Memory decay - Adaptive forgetting with sector-specific rates
Waypoint graph - Associative recall paths for better retrieval
Explainable traces - See exactly why memories were recalled
Zero config - Works out of the box with sensible defaults


Configuration

Required Configuration

All three parameters are required for local mode:

mem = OpenMemory(
    path='./data/memory.sqlite',      # Where to store the database
    tier='fast',                       # Performance tier
    embeddings={
        'provider': 'synthetic'         # Embedding provider
    }
)

Embedding Providers

Synthetic (Testing/Development)

embeddings={'provider': 'synthetic'}

OpenAI (Recommended for Production)

import os

embeddings={
    'provider': 'openai',
    'apiKey': os.getenv('OPENAI_API_KEY'),
    'model': 'text-embedding-3-small'  # optional
}

Gemini

embeddings={
    'provider': 'gemini',
    'apiKey': os.getenv('GEMINI_API_KEY')
}

Ollama (Fully Local)

embeddings={
    'provider': 'ollama',
    'model': 'llama3',
    'ollama': {
        'url': 'http://localhost:11434'  # optional
    }
}

AWS Bedrock

embeddings={
    'provider': 'aws',
    'aws': {
        'accessKeyId': os.getenv('AWS_ACCESS_KEY_ID'),
        'secretAccessKey': os.getenv('AWS_SECRET_ACCESS_KEY'),
        'region': 'us-east-1'
    }
}

Performance Tiers

  • fast - Optimized for speed, lower precision
  • smart - Balanced performance and accuracy
  • deep - Maximum accuracy, slower
  • hybrid - Adaptive based on query complexity
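Which tier fits depends on how you trade latency against accuracy. As a rule of thumb (this helper is not part of the SDK, just a sketch of the decision):

```python
def pick_tier(latency_sensitive: bool, accuracy_critical: bool) -> str:
    """Hypothetical rule of thumb mapping workload needs onto the four tiers."""
    if latency_sensitive and accuracy_critical:
        return "hybrid"   # let the engine adapt per query
    if accuracy_critical:
        return "deep"
    if latency_sensitive:
        return "fast"
    return "smart"

tier = pick_tier(latency_sensitive=True, accuracy_critical=False)  # "fast"
```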

Advanced Configuration

mem = OpenMemory(
    path='./data/memory.sqlite',
    tier='smart',
    embeddings={
        'provider': 'openai',
        'apiKey': os.getenv('OPENAI_API_KEY')
    },
    decay={
        'intervalMinutes': 60,
        'reinforceOnQuery': True,
        'coldThreshold': 0.1
    },
    compression={
        'enabled': True,
        'algorithm': 'semantic',
        'minLength': 100
    },
    reflection={
        'enabled': True,
        'intervalMinutes': 10,
        'minMemories': 5
    }
)
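To build intuition for the decay settings, assume an exponential forgetting curve (the exact formula OpenMemory uses internally is not documented here; this sketch only illustrates how a per-minute lambda and a `coldThreshold` of 0.1 would interact):

```python
import math

def decayed_strength(initial: float, decay_lambda: float, minutes: float) -> float:
    """Exponential forgetting curve: strength falls off with elapsed time."""
    return initial * math.exp(-decay_lambda * minutes)

# With a per-minute lambda of 0.01, a memory drops below a 0.1 cold
# threshold after a few hours unless a query reinforces it:
for m in (0, 60, 240):
    s = decayed_strength(1.0, 0.01, m)
    print(f"after {m:>3} min: {s:.3f}  cold={s < 0.1}")
```

With `reinforceOnQuery` enabled, each recall would reset or boost the strength, keeping frequently used memories above the cold threshold.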

API Reference

add(content, **options)

Store a new memory.

result = mem.add(
    "User prefers dark mode",
    tags=["preference", "ui"],
    metadata={"category": "settings"},
    decayLambda=0.01  # Custom decay rate
)

query(query, **options)

Search for relevant memories.

results = mem.query("user preferences", limit=10, minScore=0.7)

getAll(**options)

Retrieve all memories.

all_memories = mem.getAll(limit=100, offset=0)

getBySector(sector, **options)

Get memories from a specific cognitive sector.

episodic = mem.getBySector('episodic', limit=20)
semantic = mem.getBySector('semantic')

Available sectors: episodic, semantic, procedural, emotional, reflective

delete(id)

Remove a memory by ID.

mem.delete(memory_id)

close()

Close the database connection (important for cleanup).

mem.close()
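Because any object with a `close()` method works with `contextlib.closing`, you can guarantee cleanup even when an exception is raised mid-use. A sketch (using a stand-in object here so it runs without the SDK; with the real SDK you would wrap the `OpenMemory` instance):

```python
from contextlib import closing

class FakeMemory:
    """Stand-in for an OpenMemory instance; only the close() contract matters."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

mem = FakeMemory()
with closing(mem):
    pass  # add/query calls would go here
assert mem.closed  # close() ran without an explicit call
```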

Cognitive Sectors

OpenMemory automatically classifies content into 5 cognitive sectors:

| Sector | Description | Examples | Decay Rate |
|------------|--------------------------------|--------------------------------------|-----------|
| Episodic | Time-bound events & experiences | "Yesterday I attended a conference" | Medium |
| Semantic | Timeless facts & knowledge | "Paris is the capital of France" | Very Low |
| Procedural | Skills, procedures, how-tos | "To deploy: build, test, push" | Low |
| Emotional | Feelings, sentiment, mood | "I'm excited about this project!" | High |
| Reflective | Meta-cognition, insights | "I learn best through practice" | Very Low |
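To make the idea of sector classification concrete, here is a deliberately naive keyword-based classifier in the same spirit. OpenMemory's real classifier is embedding-based and far more robust; this sketch is only an illustration of the sector taxonomy, and the hint lists are invented:

```python
SECTOR_HINTS = {
    "episodic": ("yesterday", "today", "last week", "attended"),
    "procedural": ("to deploy", "step", "how to"),
    "emotional": ("excited", "frustrated", "love", "afraid"),
    "reflective": ("i learn", "i tend to", "i realize"),
}

def guess_sector(text: str) -> str:
    """Crude keyword match; anything unmatched defaults to timeless 'semantic'."""
    lowered = text.lower()
    for sector, hints in SECTOR_HINTS.items():
        if any(h in lowered for h in hints):
            return sector
    return "semantic"

print(guess_sector("Yesterday I attended a conference"))  # episodic
print(guess_sector("Paris is the capital of France"))     # semantic
```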

Examples

Check out the examples/py-sdk/ directory for comprehensive examples:

  • basic_usage.py - CRUD operations
  • advanced_features.py - Decay, compression, reflection
  • brain_sectors.py - Multi-sector demonstration
  • performance_benchmark.py - Performance testing

Remote Mode

For production deployments with a centralized OpenMemory server:

mem = OpenMemory(
    mode='remote',
    url='https://your-backend.com',
    apiKey='your-api-key'
)
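A common pattern is to switch between local and remote mode via the environment. The variable names below are this sketch's own convention, not something the SDK defines:

```python
import os

def memory_config() -> dict:
    """Pick remote mode when OPENMEMORY_URL is set, else fall back to local."""
    url = os.getenv("OPENMEMORY_URL")
    if url:
        return {
            "mode": "remote",
            "url": url,
            "apiKey": os.getenv("OPENMEMORY_API_KEY"),
        }
    return {
        "path": "./data/memory.sqlite",
        "tier": "fast",
        "embeddings": {"provider": "synthetic"},
    }

cfg = memory_config()
# mem = OpenMemory(**cfg)
```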

Performance

  • 115ms average recall @ 100k memories
  • 338 QPS throughput with 8 workers
  • 95% recall accuracy @ k=5
  • 7.9ms/item scoring at 10k+ scale
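These figures will vary with hardware, corpus size, and embedding provider, so it is worth measuring on your own workload. A minimal harness (shown with a stand-in callable so it runs without the SDK; with the real SDK you would pass e.g. `lambda: mem.query("user preferences")`):

```python
import time
import statistics

def bench(fn, n: int = 100) -> float:
    """Return mean wall-clock latency of fn() in milliseconds over n runs."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.mean(samples)

mean_ms = bench(lambda: sum(range(1000)))
print(f"mean latency: {mean_ms:.3f} ms")
```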

Type Hints

Full type hint support included:

from typing import List, Dict, Any
from openmemory import OpenMemory

mem: OpenMemory = OpenMemory(
    path='./data/memory.sqlite',
    tier='fast',
    embeddings={'provider': 'synthetic'}
)

results: List[Dict[str, Any]] = mem.query("test")

License

Apache 2.0

