
PersonaLab

🧠 AI Memory and Conversation Management Framework - Simple as mem0, Powerful as PersonaLab


🎉 PersonaLab v0.1.0 is now available on PyPI! - The first official release, with a stable PostgreSQL-based memory system and multi-LLM support.

PersonaLab is a comprehensive AI memory and conversation management system that provides intelligent profile management, conversation recording, and advanced semantic search capabilities for AI agents. It combines persistent memory storage, conversation analysis, psychological modeling, and vector-based retrieval for building sophisticated AI applications.

📦 Installation

From PyPI (Recommended)

# Basic installation
pip install personalab

# With AI features (includes OpenAI support)
pip install personalab[ai]

# Full installation (all LLM providers and features)
pip install personalab[all]

From Source (Development)

git clone https://github.com/NevaMind-AI/PersonaLab.git
cd PersonaLab
pip install -e .

# For development
pip install -r requirements-dev.txt
pre-commit install

⚡ Quick Start

💡 Important: All PersonaLab chat interactions require a user_id parameter to identify different users and maintain separate memory spaces for each user.

Simple 3-Line Setup

from personalab import Persona

# Create an AI persona with memory
persona = Persona(agent_id="my_assistant")

# Chat with persistent memory across sessions
response = persona.chat("Hi, I'm learning Python", user_id="student_123")
print(response)

# Memory is automatically managed!

Complete Example with Memory & LLM Configuration

from personalab import Persona
from personalab.llm import OpenAIClient

# Configure your LLM client
openai_client = OpenAIClient(api_key="your-key", model="gpt-4")

# Create persona with full features
persona = Persona(
    agent_id="programming_tutor",
    llm_client=openai_client,
    personality="You are a helpful and friendly programming tutor.",
    use_memory=True,   # 🧠 Long-term memory (facts, preferences, events)
    use_memo=True      # 💬 Conversation history & semantic search
)

# Chat with memory
user_id = "student_123"
response1 = persona.chat("I'm learning machine learning", user_id=user_id)
response2 = persona.chat("What did I mention I was learning?", user_id=user_id)

# End session to update memories
persona.endsession(user_id)

# Get stored memories
memory_info = persona.get_memory(user_id)
print(f"Profile: {memory_info['profile']}")
print(f"Events: {len(memory_info['events'])} stored")

Environment Setup

# 1. Copy environment template (if using from source)
cp .env.example .env

# 2. Add your API keys to .env
echo "OPENAI_API_KEY=your_openai_key_here" >> .env
echo "ANTHROPIC_API_KEY=your_anthropic_key_here" >> .env

# 3. Test configuration
python -c "from personalab import Persona; print('✅ PersonaLab ready!')"

🌟 Key Features

💾 Intelligent Memory System

  • 🧠 Agent Memory: Persistent profile and event storage for AI agents
  • 👤 User Memory: Individual memory spaces for different users
  • 📝 Profile Management: Automatic profile updates based on conversations
  • 📚 Event Tracking: Comprehensive conversation and interaction history
  • 🧠 Theory of Mind: Psychological analysis and behavioral insights
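How per-user memory spaces work can be sketched in a few lines. This is an illustrative model only, not PersonaLab's implementation: each (agent_id, user_id) pair gets its own isolated store, so facts learned about one user never leak into another's context.

```python
# Illustrative sketch of per-user memory isolation (not PersonaLab's code):
# events are keyed by the (agent_id, user_id) pair.
from collections import defaultdict

class SimpleMemoryStore:
    def __init__(self):
        # one list of remembered events per (agent_id, user_id) pair
        self._events = defaultdict(list)

    def remember(self, agent_id: str, user_id: str, event: str) -> None:
        self._events[(agent_id, user_id)].append(event)

    def recall(self, agent_id: str, user_id: str) -> list:
        # unknown users simply have an empty memory space
        return list(self._events[(agent_id, user_id)])

store = SimpleMemoryStore()
store.remember("tutor", "student_123", "is learning Python")
store.remember("tutor", "student_456", "prefers visual examples")
print(store.recall("tutor", "student_123"))  # only student_123's events
```

This is why every `persona.chat(...)` call in the examples below passes a `user_id`: it selects which memory space to read from and write to.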

💬 Advanced Conversation Management

  • 📝 Conversation Storage: Structured recording with metadata (user_id, agent_id, timestamps)
  • 🔍 Vector Embeddings: High-quality semantic embeddings for intelligent search
  • 🎯 Semantic Search: Retrieve relevant conversations based on meaning, not just keywords
  • 🔄 Session Management: Organized conversation tracking and session handling
  • ⚡ Multiple Providers: OpenAI, SentenceTransformers, and more embedding options

🤖 Multi-LLM Integration

  • 🌐 Multiple Providers: OpenAI, Anthropic, Google Gemini, Azure OpenAI, Cohere, AWS Bedrock, Together AI, Replicate
  • 🔍 Intelligent Search: LLM-powered decision making and content analysis
  • 📊 Profile Updates: AI-driven profile enhancement from conversation content
  • 🔧 Flexible Configuration: Easy switching between LLM providers and models

🔍 Advanced Search & Analysis

  • 🧠 LLM-Enhanced Search: Semantic understanding and relevance scoring
  • ⚡ Vector Similarity: Fast and accurate conversation retrieval
  • 🎯 Intent Analysis: Intelligent extraction of search requirements
  • 📊 Context-Aware Results: Ranked results based on conversation context
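The "vector similarity" idea above reduces to a simple computation: stored conversations are ranked by the cosine similarity between their embedding vectors and the query's embedding. The sketch below shows that core idea with toy 3-dimensional vectors; PersonaLab's actual scoring and ranking pipeline may differ.

```python
# Toy illustration of vector-similarity search: rank stored items by
# cosine similarity to a query vector. Real embeddings have hundreds
# or thousands of dimensions, but the math is identical.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

query = [0.9, 0.1, 0.0]  # pretend embedding of "billing problem"
stored = {
    "billing refund request": [0.8, 0.2, 0.1],
    "password reset help":    [0.1, 0.9, 0.3],
}
# rank stored conversations by similarity to the query, best first
ranked = sorted(stored, key=lambda k: cosine_similarity(query, stored[k]), reverse=True)
print(ranked[0])  # → "billing refund request"
```

A `similarity_threshold` parameter (as used with ConversationManager later in this page) simply discards results whose score falls below the cutoff.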

📋 Requirements

  • Python: 3.8 or higher
  • Database: PostgreSQL (required for memory storage)
  • LLM API Keys: OpenAI, Anthropic, or other supported providers

Database Setup

PersonaLab requires PostgreSQL for memory storage. Quick setup:

# Using Docker (recommended)
docker run --name personalab-postgres -e POSTGRES_PASSWORD=your_password -p 5432:5432 -d postgres:14

# Or install PostgreSQL locally
# macOS: brew install postgresql
# Ubuntu: sudo apt-get install postgresql
# Windows: Download from https://www.postgresql.org/download/
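To point PersonaLab at the container started above, add the connection settings to your .env file (the same file used in Environment Setup). Note that the variable name below, DATABASE_URL, is a common convention and an assumption here; check the project's .env.example for the exact key PersonaLab reads.

```shell
# Connection string for the Docker container above.
# NOTE: DATABASE_URL is an assumed variable name -- consult the
# project's .env.example for the key PersonaLab actually expects.
echo "DATABASE_URL=postgresql://postgres:your_password@localhost:5432/postgres" >> .env
```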

💡 Advanced Usage Examples

Memory & Conversation Integration

from personalab import Persona
from personalab.llm import OpenAIClient, AnthropicClient

# Example 1: Educational Tutor with Memory
tutor = Persona(
    agent_id="math_tutor",
    personality="You are a patient math tutor who tracks student progress.",
    use_memory=True,  # Remember student profiles and learning history
    use_memo=True,    # Search previous conversations for context
    show_retrieval=True  # Show when retrieving relevant past conversations
)

student_id = "student_123"

# First lesson
response1 = tutor.chat("I'm struggling with algebra", user_id=student_id)
response2 = tutor.chat("Can you explain linear equations?", user_id=student_id)

# Later lesson - automatically retrieves relevant context
response3 = tutor.chat("I forgot what we learned about equations", user_id=student_id)

# Update student profile with progress
tutor.endsession(student_id)

# Example 2: Customer Support with Different LLM
support = Persona(
    agent_id="support_agent",
    llm_client=AnthropicClient(api_key="your-key"),  # Using Claude
    personality="You are a helpful customer support specialist.",
    use_memory=True,
    use_memo=True
)

customer_id = "customer_456"
support_response = support.chat("My account is locked", user_id=customer_id)

# Search for similar support tickets
similar_tickets = support.search("account locked", user_id=customer_id, top_k=3)

Direct ConversationManager Usage

from personalab.memo import ConversationManager

# Advanced conversation search and analysis
manager = ConversationManager(
    enable_embeddings=True,
    embedding_provider="openai"  # Use OpenAI embeddings for better quality
)

# Search across all conversations for an agent
results = manager.search_similar_conversations(
    agent_id="support_agent",
    query="billing issues and refunds",
    limit=10,
    similarity_threshold=0.75  # Higher threshold for more relevant results
)

for result in results:
    print(f"Score: {result['similarity_score']:.3f}")
    print(f"User: {result['user_id']}")
    print(f"Summary: {result['summary']}")
    print("---")

๐Ÿ—๏ธ Architecture

Project Structure

PersonaLab/
├── personalab/
│   ├── __init__.py          # Main exports
│   ├── config.py            # Configuration management
│   ├── llm.py               # LLM integration
│   ├── memory/              # Core memory management module
│   │   ├── __init__.py      # Memory module exports
│   │   ├── base.py          # Core Memory, ProfileMemory, EventMemory, MindMemory
│   │   ├── manager.py       # MemoryManager and conversation processing
│   │   ├── pipeline.py      # MemoryUpdatePipeline and pipeline stages
│   │   ├── storage.py       # MemoryDB and database operations
│   │   ├── events.py        # Event-related utilities
│   │   └── profile.py       # Profile-related utilities
│   └── memo/                # Conversation recording and retrieval module
│       ├── __init__.py      # Memo module exports
│       ├── models.py        # Conversation and Message data models
│       ├── storage.py       # ConversationDB and vector storage
│       ├── manager.py       # ConversationManager and search functionality
│       └── embeddings.py    # Embedding providers and management
├── examples/                # Example scripts and usage demos
├── docs/                    # Documentation
└── tests/                   # Test suite

Core Components

Memory Module (personalab.memory)

  • Memory: Unified memory class with ProfileMemory, EventMemory, and MindMemory
  • MemoryManager: Complete memory lifecycle management
  • MemoryUpdatePipeline: Three-stage LLM-driven update process
  • MemoryDB: PostgreSQL-based persistent storage

Memo Module (personalab.memo)

  • ConversationManager: High-level conversation recording and search
  • ConversationDB: Database operations for conversations and vectors
  • Conversation/ConversationMessage: Data models with required fields
  • EmbeddingProviders: OpenAI, SentenceTransformers, auto-selection

Required Fields for Conversations

All conversations must include these mandatory fields:

  • agent_id: Unique identifier for the AI agent (required, non-empty)
  • user_id: Unique identifier for the user (required, non-empty)
  • created_at: Timestamp (automatically set when conversation is created)
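The validation implied by these rules can be sketched as follows. This is a hedged illustration of the contract, not the actual Conversation model (which lives in personalab.memo.models and may enforce it differently).

```python
# Sketch of the "required fields" contract for conversations:
# agent_id and user_id must be non-empty; created_at is set automatically.
from datetime import datetime, timezone

def new_conversation(agent_id: str, user_id: str) -> dict:
    # both identifiers are required and must be non-empty
    for name, value in (("agent_id", agent_id), ("user_id", user_id)):
        if not value:
            raise ValueError(f"{name} is required and must be non-empty")
    return {
        "agent_id": agent_id,
        "user_id": user_id,
        # timestamp assigned at creation time, as the rules above state
        "created_at": datetime.now(timezone.utc),
    }

conv = new_conversation("support_agent", "customer_456")
print(conv["agent_id"], conv["user_id"])
```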

Embedding Providers

PersonaLab supports multiple embedding providers with automatic fallback:

  1. OpenAI (Premium): text-embedding-ada-002 (1536 dimensions)
  2. SentenceTransformers (Free): Local models like all-MiniLM-L6-v2 (384 dimensions)
  3. Auto: Automatically selects the best available provider
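The "auto" fallback described above can be pictured as a simple key-availability check: prefer the premium provider when its API key is configured, otherwise fall back to the free local one. The sketch below is an assumption about how such selection works; the actual logic lives in personalab.memo.embeddings and may differ.

```python
# Hypothetical sketch of "auto" embedding-provider selection.
# Provider names and dimensions follow the list above.
import os

PROVIDERS = {
    "openai": {"model": "text-embedding-ada-002", "dimensions": 1536},
    "sentence-transformers": {"model": "all-MiniLM-L6-v2", "dimensions": 384},
}

def pick_provider(requested: str = "auto") -> str:
    if requested != "auto":
        return requested  # explicit choice wins
    # premium provider only when its API key is available
    if os.environ.get("OPENAI_API_KEY"):
        return "openai"
    return "sentence-transformers"  # free local fallback
```

Note the dimension mismatch (1536 vs. 384): vectors from different providers are not comparable, so stored embeddings need to come from a single provider per collection.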

🔧 Configuration

Environment Variables

# OpenAI (for enhanced embeddings)
export OPENAI_API_KEY="your-openai-api-key"

# Other LLM providers
export ANTHROPIC_API_KEY="your-anthropic-key"
export GOOGLE_AI_API_KEY="your-google-key"

Embedding Provider Configuration

# Use specific embedding provider
manager = ConversationManager(
    embedding_provider="openai"  # or "sentence-transformers", "auto"
)

# Disable embeddings entirely
manager = ConversationManager(enable_embeddings=False)

Memory Configuration

# Custom persona setup with specific LLM configuration
from personalab.llm import OpenAIClient

custom_llm = OpenAIClient(
    api_key="your-key",
    model="gpt-4",
    temperature=0.3,
    max_tokens=2000
)

persona = Persona(
    agent_id="custom_assistant",
    llm_client=custom_llm,
    use_memory=True,
    use_memo=True,
    show_retrieval=False
)

Search Parameters

# Configure semantic search using Persona
persona = Persona(agent_id="assistant")
user_id = "user_123"

# Search with parameters
results = persona.search(
    query="machine learning help",
    user_id=user_id,
    top_k=10                     # Maximum results
)

# Or using ConversationManager directly for more control
manager = ConversationManager()
results = manager.search_similar_conversations(
    agent_id="assistant",
    query="machine learning help",
    limit=10,                    # Maximum results
    similarity_threshold=0.7     # Minimum similarity score (0.0-1.0)
)

📚 Examples

The examples/ directory contains comprehensive usage examples:

🚀 Try the Examples

# Clone the repository to access examples
git clone https://github.com/NevaMind-AI/PersonaLab.git
cd PersonaLab

# Set up environment
cp .env.example .env  # Add your API keys
pip install -e .

# Run examples
python examples/quick_start.py
python examples/memo_simple_example.py

๐Ÿ” Use Cases

Customer Support

# Create support persona
support_persona = Persona(
    agent_id="support_bot",
    personality="You are a helpful customer support agent.",
    use_memory=True,
    use_memo=True
)

customer_id = "customer_456"

# Handle customer inquiry (automatically records and retrieves context)
response = support_persona.chat("I'm having login problems", user_id=customer_id)

# Find similar past issues
similar_issues = support_persona.search("login problems", user_id=customer_id, top_k=5)

# End session to update customer profile
support_persona.endsession(customer_id)

Educational Assistants

# Create tutor persona
tutor_persona = Persona(
    agent_id="tutor_bot",
    personality="You are a patient and encouraging math tutor.",
    use_memory=True,
    use_memo=True
)

student_id = "student_789"

# Tutoring session (automatically tracks learning progress)
response1 = tutor_persona.chat("I need help with algebra word problems", user_id=student_id)
response2 = tutor_persona.chat("Can you give me another example?", user_id=student_id)

# Retrieve related learning materials from past sessions
related_topics = tutor_persona.search("algebra word problems", user_id=student_id, top_k=5)

# End session to update learning profile
result = tutor_persona.endsession(student_id)
print(f"Learning progress updated: {result}")

Personal AI Assistants

# Create personal assistant
personal_assistant = Persona(
    agent_id="personal_ai",
    personality="You are a thoughtful personal assistant who remembers important details.",
    use_memory=True,
    use_memo=True
)

user_id = "user_personal"

# Daily conversation with memory
with personal_assistant.session(user_id):
    response1 = personal_assistant.chat("I'm planning a vacation to Japan", user_id=user_id)
    response2 = personal_assistant.chat("What should I pack?", user_id=user_id)
    # Session automatically ends and updates memory

# Later conversation - retrieves context automatically
response3 = personal_assistant.chat("What were those vacation plans I mentioned?", user_id=user_id)

# Manual context retrieval if needed
context = personal_assistant.search("vacation plans", user_id=user_id, top_k=3)

🧪 Testing

# Run all tests
python -m pytest tests/

# Run specific test files
python -m pytest tests/test_memory.py
python -m pytest tests/test_memo.py

# Run with coverage
python -m pytest --cov=personalab tests/

📖 Documentation

For detailed documentation, see the docs/ directory.

๐Ÿค Contributing

We welcome contributions! Please see our Contributing Guidelines for details.

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

  • OpenAI for providing excellent embedding models
  • SentenceTransformers team for open-source embedding solutions
  • Contributors and the AI/ML community for inspiration and feedback

📞 Support


📋 What's New in v0.1.0

🎉 First Official Release! PersonaLab v0.1.0 brings stable, production-ready AI memory management:

✨ Key Features

  • 🗄️ PostgreSQL-Only Architecture: Removed all SQLite dependencies for production reliability
  • 🧠 Enhanced Memory System: Improved profile updates and event tracking
  • 💬 Advanced Conversation Search: Semantic search with multiple embedding providers
  • 🤖 Multi-LLM Support: OpenAI, Anthropic, Google Gemini, and 8+ other providers
  • 📦 PyPI Package: Easy installation with pip install personalab
  • 🔍 Better Documentation: Comprehensive examples and usage guides
  • ⚡ Performance Optimizations: Faster memory updates and conversation retrieval

๐Ÿ› ๏ธ Technical Improvements

  • Python 3.8+ Compatibility: Tested across Python 3.8-3.12
  • Automated CI/CD: GitHub Actions for testing and PyPI publishing
  • Code Quality: Black, isort, flake8, mypy formatting standards
  • Comprehensive Testing: Full test suite with PostgreSQL integration

🚀 Migration from Pre-release

If you're upgrading from development versions:

# Remove old development installation
pip uninstall personalab

# Install official release
pip install personalab[all]

📅 Release History

  • v0.1.0 (Current) - First official PyPI release with PostgreSQL-only architecture
  • Pre-release - Development versions with SQLite support (deprecated)

🔗 Links


PersonaLab - Building the memory foundation for next-generation AI agents 🧠✨
