# 🧠 LLM Long-Term Memory
Give your AI agents persistent, searchable long-term memory with pluggable storage backends.
A memory storage and retrieval system that gives LLMs persistent, searchable long-term memory. It extracts, stores, updates, and retrieves memories from conversations, enabling AI agents to maintain context across multiple sessions.
## ✨ Features
- 🧠 Intelligent Memory Extraction - Automatically extracts factual information from conversations using OpenAI GPT
- 🔍 Semantic Search - Vector-based similarity search using OpenAI embeddings and FAISS
- 💾 Pluggable Storage Backends - SQLite, PostgreSQL, MongoDB, and Redis support
- 🔄 Memory Management - Add, update, and delete memories with conflict resolution
- 📊 Category Organization - Automatic categorization of memories
- ⚡ Importance Scoring - Weighted importance system for memory prioritization
- 🔗 LangChain Integration - Built with LangChain for robust LLM interactions
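The semantic search feature works by comparing embedding vectors with cosine similarity. A toy sketch of the idea, using 3-dimensional vectors instead of the 1536-dimensional embeddings `text-embedding-3-small` produces, and a linear scan instead of a FAISS index:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings": the query vector points in roughly the same
# direction as the first memory, so that memory scores highest.
query = [0.9, 0.1, 0.0]
memories = {
    "User prefers dark mode": [0.8, 0.2, 0.1],
    "User lives in Berlin": [0.1, 0.1, 0.9],
}

best = max(memories, key=lambda k: cosine_similarity(query, memories[k]))
print(best)  # User prefers dark mode
```

The same principle scales to real embeddings; FAISS just makes the nearest-neighbor lookup fast for large memory stores.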
## 📦 Installation

```bash
# Basic installation (SQLite backend)
pip install llm-long-term-memory

# With PostgreSQL support
pip install llm-long-term-memory[postgresql]

# With MongoDB support
pip install llm-long-term-memory[mongodb]

# With Redis support
pip install llm-long-term-memory[redis]

# With all backends
pip install llm-long-term-memory[all]

# With Streamlit UI
pip install llm-long-term-memory[streamlit]
```
## 🚀 Quick Start

```python
from llm_memory import LongTermMemorySystem

# Initialize with SQLite (default)
memory = LongTermMemorySystem(openai_api_key="your-api-key")

# Process a message and extract memories
result = memory.process_message(
    "I use VS Code for Python development and prefer dark mode",
    user_id="user123"
)
print(f"Extracted {len(result['new_memories'])} memories")

# Query memories
answer = memory.answer_with_memory("What IDE do I use?")
print(answer)  # "You use VS Code for Python development"

# Get all memories
memories = memory.get_all_memories()
for mem in memories:
    print(f"- {mem.content} (importance: {mem.importance})")
```
## 💾 Storage Backends

### SQLite (Default)

```python
from llm_memory import LongTermMemorySystem

memory = LongTermMemorySystem(
    openai_api_key="...",
    storage_backend="sqlite",
    storage_config={"db_path": "my_memories.db"}
)
```
### PostgreSQL

```python
from llm_memory import LongTermMemorySystem

memory = LongTermMemorySystem(
    openai_api_key="...",
    storage_backend="postgresql",
    storage_config={
        "connection_string": "postgresql://user:password@localhost:5432/memory_db"
    }
)
```
### MongoDB

```python
from llm_memory import LongTermMemorySystem

memory = LongTermMemorySystem(
    openai_api_key="...",
    storage_backend="mongodb",
    storage_config={
        "connection_string": "mongodb://localhost:27017",
        "database": "memory_db",
        "collection": "memories"
    }
)
```
### Redis

```python
from llm_memory import LongTermMemorySystem

memory = LongTermMemorySystem(
    openai_api_key="...",
    storage_backend="redis",
    storage_config={
        "host": "localhost",
        "port": 6379,
        "password": "optional_password"
    }
)
```
### Custom Backend

```python
from llm_memory import StorageBackend, Memory, LongTermMemorySystem

class MyCustomBackend(StorageBackend):
    def init_storage(self) -> None:
        # Initialize your storage
        pass

    def save_memory(self, memory: Memory) -> None:
        # Save a memory
        pass

    def get_memory(self, memory_id: str) -> Memory:
        # Get a memory by ID
        pass

    def get_all_memories(self) -> list:
        # Get all memories
        pass

    def delete_memory(self, memory_id: str) -> bool:
        # Delete a memory
        pass

    def search_memories(self, query: str, category: str = None) -> list:
        # Search memories
        pass

    def close(self) -> None:
        # Close connections
        pass

# Use your custom backend
memory = LongTermMemorySystem(
    openai_api_key="...",
    storage_backend=MyCustomBackend()
)
```
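To make the skeleton above concrete, here is a self-contained, dict-backed sketch of a working backend. It mirrors the `StorageBackend` interface but does not subclass it, and uses a stand-in `Memory` dataclass, so it runs without the package installed; the naive substring search stands in for the library's embedding-based search.

```python
from dataclasses import dataclass

# Stand-in for llm_memory.Memory so this sketch runs without the package.
@dataclass
class Memory:
    id: str
    content: str
    category: str = "general"

class InMemoryBackend:
    """Dict-backed backend mirroring the StorageBackend interface."""

    def __init__(self):
        self._memories = {}

    def init_storage(self):
        self._memories.clear()

    def save_memory(self, memory):
        self._memories[memory.id] = memory

    def get_memory(self, memory_id):
        return self._memories.get(memory_id)

    def get_all_memories(self):
        return list(self._memories.values())

    def delete_memory(self, memory_id):
        return self._memories.pop(memory_id, None) is not None

    def search_memories(self, query, category=None):
        # Naive substring match; a real backend would compare embeddings.
        return [
            m for m in self._memories.values()
            if query.lower() in m.content.lower()
            and (category is None or m.category == category)
        ]

    def close(self):
        pass

backend = InMemoryBackend()
backend.save_memory(Memory(id="m1", content="User prefers dark mode", category="preferences"))
backend.save_memory(Memory(id="m2", content="User codes in Python"))
print(len(backend.search_memories("dark")))  # 1
```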
## 📊 Memory Structure

```python
from llm_memory import Memory

# Each memory contains:
memory = Memory(
    id="unique_id",
    content="User prefers dark mode",
    category="preferences",
    importance=0.8,
    timestamp="2024-01-15T10:30:00",
    embedding=[...],  # Vector embedding
    metadata={"user_id": "user123", "source": "chat"}
)
```
## 🔧 API Reference

### LongTermMemorySystem

```python
# Initialize
memory = LongTermMemorySystem(
    openai_api_key: str,              # Required: OpenAI API key
    storage_backend: str = "sqlite",  # Backend type or instance
    storage_config: dict = None,      # Backend-specific config
    embedding_model: str = "text-embedding-3-small",
    llm_model: str = "gpt-3.5-turbo",
)

# Methods
memory.process_message(message, user_id, context)  # Extract memories from a message
memory.query_memories(query, k=5)                  # Semantic search for memories
memory.answer_with_memory(question, max_memories)  # Answer using memory context
memory.get_all_memories()                          # Get all stored memories
memory.delete_memory(memory_id)                    # Delete a specific memory
memory.get_memory_stats()                          # Get statistics
memory.close()                                     # Close connections
```
### Context Manager Support

```python
from llm_memory import LongTermMemorySystem

with LongTermMemorySystem(openai_api_key="...") as memory:
    memory.process_message("Hello!", user_id="user123")
# Connections are closed automatically when the block exits
```
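The pattern behind this is standard Python: `__exit__` calls `close()`. A minimal sketch of that protocol (illustrative only, not the library's actual implementation):

```python
class ClosingResource:
    """Minimal context-manager pattern: close() is called on exit."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

    def __enter__(self):
        # The object bound by `with ... as x` is the return value here.
        return self

    def __exit__(self, exc_type, exc, tb):
        self.close()
        return False  # do not suppress exceptions

with ClosingResource() as r:
    pass
print(r.closed)  # True
```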
## 🎯 Use Cases
- Personal AI Assistants - Remember user preferences, habits, and information
- Customer Service Bots - Maintain customer history and preferences
- Educational AI - Track learning progress and personalized content
- Productivity Tools - Remember user workflows and tool preferences
- Healthcare AI - Maintain patient information (with proper security)
## 🌐 Web Interface

If you installed with the `[streamlit]` extra:

```bash
# Clone the repo for app.py
git clone https://github.com/Devparihar5/llm-long-term-memory.git
cd llm-long-term-memory

# Run the Streamlit app
streamlit run app.py
```
## 🔒 Security Considerations
- Store API keys securely (use environment variables)
- Use secure connection strings for databases
- Consider encryption for sensitive memories
- Implement user authentication for multi-user scenarios
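For the first point, a minimal sketch of reading the key from the environment instead of hard-coding it (`OPENAI_API_KEY` is the conventional variable name, not one this library mandates):

```python
import os

# Read the API key from the environment rather than embedding it in code.
api_key = os.environ.get("OPENAI_API_KEY", "")
if not api_key:
    print("Warning: OPENAI_API_KEY is not set")

# The key can then be passed along, e.g.:
# memory = LongTermMemorySystem(openai_api_key=api_key)
```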
## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🙏 Acknowledgments
Made with ❤️ by Devendra Parihar
Contributions by Divya ✨