Intuit ECMS Server - Multi-Agent Personal Assistant with Advanced Memory System

Project description

MIRIX - Multi-Agent Personal Assistant with an Advanced Memory System

Your personal AI that builds memory through screen observation and natural conversation

| 🌐 Website | 📚 Documentation | 📄 Paper | 💬 Discord


Key Features 🔥

  • Multi-Agent Memory System: Six specialized memory components (Core, Episodic, Semantic, Procedural, Resource, Knowledge Vault) managed by dedicated agents
  • Screen Activity Tracking: Continuous visual data capture and intelligent consolidation into structured memories
  • Privacy-First Design: All long-term data stored locally with user-controlled privacy settings
  • Advanced Search: PostgreSQL-native BM25 full-text search with vector similarity support
  • Multi-Modal Input: Text, images, voice, and screen captures processed seamlessly
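To give a rough feel for how the six memory components divide responsibilities, here is a toy sketch. The `MemoryType` enum and the keyword-based `route` function are purely hypothetical illustrations; in MIRIX the routing is performed by dedicated LLM agents, not keyword matching.

```python
from enum import Enum

# Hypothetical labels for the six MIRIX memory components.
class MemoryType(Enum):
    CORE = "core"                        # stable facts about the user
    EPISODIC = "episodic"                # time-stamped events
    SEMANTIC = "semantic"                # general world knowledge
    PROCEDURAL = "procedural"            # how-to steps and workflows
    RESOURCE = "resource"                # files, documents, screenshots
    KNOWLEDGE_VAULT = "knowledge_vault"  # credentials and sensitive data

def route(text: str) -> MemoryType:
    """Toy classifier: assign an observation to one memory component."""
    lowered = text.lower()
    if any(w in lowered for w in ("password", "api key", "credential")):
        return MemoryType.KNOWLEDGE_VAULT
    if any(w in lowered for w in ("yesterday", "today", "met")):
        return MemoryType.EPISODIC
    if any(w in lowered for w in ("how to", "step", "first", "then")):
        return MemoryType.PROCEDURAL
    if any(w in lowered for w in ("file", "screenshot", "document")):
        return MemoryType.RESOURCE
    return MemoryType.SEMANTIC  # fallback: general knowledge

print(route("How to deploy: first build, then push"))  # MemoryType.PROCEDURAL
```

The real system makes these decisions with dedicated agents per component; this sketch only shows why separating the six stores is useful.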

Quick Start

End-Users: If you want to build your own memory with MIRIX, please check out the quick installation guide here.

Developers: To use our memory system as a backend, please check out our Backend Usage. In short, you just need to run:

git clone git@github.com:Mirix-AI/MIRIX.git
cd MIRIX

# Create and activate virtual environment
python -m venv mirix_env
source mirix_env/bin/activate  # On Windows: mirix_env\Scripts\activate

pip install -r requirements.txt

Then you can run the following Python code:

from mirix.agent import AgentWrapper

# Initialize agent with configuration
agent = AgentWrapper("./mirix/configs/mirix.yaml")

# Send basic text information
agent.send_message(
    message="The moon now has a president.",
    memorizing=True,
    force_absorb_content=True
)

For more details, please refer to Backend Usage.

Python SDK (NEW!) 🎉

We've created a simple Python SDK that makes it easy to integrate Mirix's memory capabilities into your applications:

Installation (PyPI: https://pypi.org/project/mirix/0.1.5/)

pip install mirix

Quick Start with SDK

from mirix import Mirix

# Initialize memory agent (defaults to Google Gemini 2.0 Flash)
memory_agent = Mirix(api_key="your-google-api-key")

# Add memories
memory_agent.add("The moon now has a president")
memory_agent.add("John loves Italian food and is allergic to peanuts")

# Chat with memory context
response = memory_agent.chat("Does the moon have a president?")
print(response)  # "Yes, according to my memory, the moon has a president."

response = memory_agent.chat("What does John like to eat?") 
print(response)  # "John loves Italian food. However, he's allergic to peanuts."
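The add/chat pattern above can be sketched offline with a minimal stand-in. `ToyMemory` below is a hypothetical illustration of the interface only, not the Mirix SDK: the real `chat` calls an LLM and uses BM25/vector retrieval, while this toy version just returns the stored fact with the most word overlap.

```python
class ToyMemory:
    """Hypothetical offline stand-in for the add/chat pattern shown above."""

    def __init__(self):
        self.memories: list[str] = []

    def add(self, fact: str) -> None:
        self.memories.append(fact)

    def chat(self, question: str) -> str:
        # Naive retrieval: return the stored fact sharing the most words
        # with the question (the real SDK uses an LLM plus hybrid search).
        q = set(question.lower().rstrip("?").split())
        return max(
            self.memories,
            key=lambda m: len(q & set(m.lower().split())),
            default="",
        )

memory = ToyMemory()
memory.add("The moon now has a president")
memory.add("John loves Italian food and is allergic to peanuts")
print(memory.chat("Does the moon have a president?"))
```

The point is the interface shape: `add` accumulates facts, and `chat` answers against them, so the caller never manages retrieval explicitly.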

Integration with Claude Agent SDK 🤝

Mirix can be integrated with Anthropic's Claude Agent SDK to give Claude persistent memory across conversations. This allows Claude to remember context, user preferences, and past interactions.

Basic Setup

Here's a simple example of integrating Mirix with the Claude Agent SDK:

#!/usr/bin/env python3
import os
import asyncio
from claude_agent_sdk import query, ClaudeAgentOptions, AssistantMessage
from mirix import Mirix
from collections import deque
from dotenv import load_dotenv
load_dotenv()

# Configuration
MEMORY_UPDATE_INTERVAL = 3  # Flush the buffer into Mirix memory every N turns
REINIT_INTERVAL = 1  # Rebuild system prompt (retrieve from Mirix) every N turns; 1 = every loop iteration
KEEP_LAST_N_TURNS = 50  # Keep only the last N turns in the rolling buffer

def build_system_prompt(mirix_agent=None, user_id=None, conversation_buffer=""):
    """Build system prompt with optional Mirix memory context"""
    system_prompt = """You are a helpful assistant."""
    
    # Add Mirix memory context if available
    if mirix_agent and user_id and conversation_buffer:
        memory_context = mirix_agent.extract_memory_for_system_prompt(
            conversation_buffer, user_id
        )
        if memory_context:
            system_prompt += "\n\nRelevant Memory Context:\n" + memory_context
    
    return system_prompt

async def run_agent():
    """Run Claude Agent SDK with Mirix memory integration"""
    
    # Initialize Mirix memory agent
    mirix_agent = Mirix(
        model_name="gemini-2.0-flash",
        api_key=os.getenv("GEMINI_API_KEY"),
    )
    user = mirix_agent.create_user(user_name="Alice")
    
    # Track conversation for memory updates
    conversation_history = deque(maxlen=KEEP_LAST_N_TURNS)
    turn_count = 0
    turns_since_reinit = 0
    session_id = None
    
    while True:
        # Build system prompt with memory context
        options = ClaudeAgentOptions(
            resume=session_id,
            allowed_tools=["Task", "Bash", "Read", "Edit", "Write", "WebSearch"],
            system_prompt=build_system_prompt(
                mirix_agent, user.id, 
                "\n".join([f"User: {u}\nAssistant: {a}" for u, a in conversation_history])
            ),
            model="claude-sonnet-4-5",
            max_turns=50
        )
        
        user_input = input("User: ").strip()
        if user_input.lower() in ['exit', 'quit', 'bye']:
            break
        
        # Get Claude's response
        assistant_response = ""
        async for message in query(prompt=user_input, options=options):
            if hasattr(message, 'subtype') and message.subtype == 'init':
                session_id = message.data.get('session_id')
            
            if isinstance(message, AssistantMessage):
                for block in message.content:
                    if hasattr(block, 'text'):
                        assistant_response += block.text
                        print(block.text, flush=True)
        
        # Update conversation history
        conversation_history.append((user_input, assistant_response))
        turn_count += 1
        turns_since_reinit += 1
        
        # Periodically update Mirix memory
        if turn_count % MEMORY_UPDATE_INTERVAL == 0:
            combined = "\n".join([
                f"[User] {u}\n[Assistant] {a}" 
                for u, a in conversation_history
            ])
            await asyncio.to_thread(mirix_agent.add, combined, user_id=user.id)
            print("✅ Memory updated!")

if __name__ == "__main__":
    asyncio.run(run_agent())
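The buffering and update cadence in the loop above can be exercised in isolation. This self-contained sketch reuses the same deque and interval logic, with a stub `flush_to_memory` standing in for the `mirix_agent.add` call (which needs API keys to run):

```python
from collections import deque

MEMORY_UPDATE_INTERVAL = 3   # flush to long-term memory every 3 turns
KEEP_LAST_N_TURNS = 50       # bounded buffer, as in the agent loop above

updates = []                 # stub: records what would be sent to Mirix

def flush_to_memory(text: str) -> None:
    updates.append(text)     # real code: mirix_agent.add(text, user_id=...)

conversation_history = deque(maxlen=KEEP_LAST_N_TURNS)
turns = [("hi", "hello"), ("what's up", "not much"), ("bye", "later")]
for turn_count, (u, a) in enumerate(turns, start=1):
    conversation_history.append((u, a))
    if turn_count % MEMORY_UPDATE_INTERVAL == 0:
        flush_to_memory("\n".join(
            f"[User] {u}\n[Assistant] {a}" for u, a in conversation_history
        ))

print(len(updates))  # 1 — one flush after three turns
```

Because the deque is bounded by `KEEP_LAST_N_TURNS`, each flush sends at most a fixed-size window, which is what keeps the integration free of token-limit issues regardless of conversation length.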

Key Benefits

  • Persistent Memory: Mirix helps Claude remember facts, preferences, and context across sessions
  • Intelligent Retrieval: Mirix automatically retrieves relevant memories for each conversation
  • Scalable: Works with conversations of any length without token limit issues
  • Flexible Updates: Configure how often to update memory (e.g., every N turns)

Example Usage

cd samples

# Install dependencies
pip install mirix claude-agent-sdk python-dotenv

# Set environment variables
export GEMINI_API_KEY="your-google-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"

# Run the agent
python claude_agent.py

License

Mirix is released under the Apache License 2.0. See the LICENSE file for more details.

Contact

For questions, suggestions, or issues, please open an issue on the GitHub repository or contact us at yuwang@mirix.io.

Join Our Community

Connect with other Mirix users, share your thoughts, and get support:

💬 Discord Community

Join our Discord server for real-time discussions, support, and community updates: https://discord.gg/S6CeHNrJ

🎯 Weekly Discussion Sessions

We host weekly discussion sessions where you can:

  • Discuss issues and bugs
  • Share ideas about future directions
  • Get general consultations and support
  • Connect with the development team and community

📅 Schedule: Friday nights, 8-9 PM PST
🔗 Zoom Link: https://ucsd.zoom.us/j/96278791276

📱 WeChat Group

WeChat group QR code (image)

Star History

Star History Chart

Acknowledgement

We would like to thank Letta for open-sourcing their framework, which served as the foundation for the memory system in this project.

Project details



Download files

Download the file for your platform.

Source Distribution

jl_ecms_server-0.2.7.tar.gz (1.8 MB)

Uploaded Source

Built Distribution

jl_ecms_server-0.2.7-py3-none-any.whl (546.8 kB)

Uploaded Python 3

File details

Details for the file jl_ecms_server-0.2.7.tar.gz.

File metadata

  • Download URL: jl_ecms_server-0.2.7.tar.gz
  • Upload date:
  • Size: 1.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.7

File hashes

Hashes for jl_ecms_server-0.2.7.tar.gz
Algorithm Hash digest
SHA256 098ce148f5eaf18ddeb67eb27595e9eb9355935e611e884c5a447059b6c2e68a
MD5 7fb756855e303bda49c66ad1201b1b29
BLAKE2b-256 1eb360852a28ddf6f56e4147177d2322ccd4414ba64a9776da6dfcf66c02c073

See more details on using hashes here.

File details

Details for the file jl_ecms_server-0.2.7-py3-none-any.whl.

File metadata

  • Download URL: jl_ecms_server-0.2.7-py3-none-any.whl
  • Upload date:
  • Size: 546.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.7

File hashes

Hashes for jl_ecms_server-0.2.7-py3-none-any.whl
Algorithm Hash digest
SHA256 11ed129fbd725dddd06c2afa4fec46a8bda7fb7375b513a0ced70882b26eeea7
MD5 1044fa3bfe95825c8d4c7c417b9a7ccd
BLAKE2b-256 cc3235f235023f0f752e51529755a9afcb208dd9637490ad8200d71a42edbd1f

