MIRIX Server - Multi-Agent Personal Assistant with Advanced Memory System
Your personal AI that builds memory through screen observation and natural conversation
| 🌐 Website | 📚 Documentation | 📄 Paper | 💬 Discord
Key Features 🔥
- Multi-Agent Memory System: Six specialized memory components (Core, Episodic, Semantic, Procedural, Resource, Knowledge Vault) managed by dedicated agents
- Screen Activity Tracking: Continuous visual data capture and intelligent consolidation into structured memories
- Privacy-First Design: All long-term data stored locally with user-controlled privacy settings
- Advanced Search: PostgreSQL-native BM25 full-text search with vector similarity support
- Multi-Modal Input: Text, images, voice, and screen captures processed seamlessly
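To illustrate the six-component layout (component names taken from the feature list above; the routing heuristic itself is purely hypothetical and not MIRIX's actual dispatch logic), new information could be bucketed roughly like this:

```python
# Illustrative sketch only: the six memory components named above,
# with a toy keyword router. MIRIX's real dispatch logic is not shown here.
MEMORY_COMPONENTS = (
    "core", "episodic", "semantic", "procedural", "resource", "knowledge_vault",
)

# Hypothetical keyword hints per component (assumption, for illustration).
HINTS = {
    "episodic": ("yesterday", "today", "last week", "we discussed"),
    "procedural": ("how to", "steps", "workflow"),
    "resource": ("file", "screenshot", "document"),
}

def route_memory(text: str) -> str:
    """Pick a memory component for a text snippet (toy heuristic)."""
    lowered = text.lower()
    for component, keywords in HINTS.items():
        if any(keyword in lowered for keyword in keywords):
            return component
    return "semantic"  # default bucket for general facts

print(route_memory("Here is how to deploy the server"))  # procedural
```

In MIRIX, each component is managed by its own dedicated agent rather than a single router function; the sketch only conveys the idea of specialized stores.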
Quick Start
Step 1: Backend & Dashboard (Docker):
docker compose up -d --pull always
- Dashboard: http://localhost:5173
- API: http://localhost:8531
Step 2: Create an API key in the dashboard (http://localhost:5173) and set it as the MIRIX_API_KEY environment variable.
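For example, on macOS/Linux the key can be exported in the shell before starting the client (the value below is a placeholder for the key created in the dashboard):

```shell
# Placeholder value: substitute the API key created in the dashboard.
export MIRIX_API_KEY="your-api-key"
```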
Step 3: Client (Python, mirix-client, https://pypi.org/project/mirix-client/):
pip install mirix-client
Now you are ready to go! See the example below:
from mirix import MirixClient

client = MirixClient(
    api_key="your-api-key",
    base_url="http://localhost:8531",
)

client.initialize_meta_agent(
    config={
        "llm_config": {
            "model": "gemini-2.0-flash",
            "model_endpoint_type": "google_ai",
            "api_key": "your-api-key-here",
            "model_endpoint": "https://generativelanguage.googleapis.com",
            "context_window": 1_000_000,
        },
        "embedding_config": {
            "embedding_model": "text-embedding-004",
            "embedding_endpoint_type": "google_ai",
            "api_key": "your-api-key-here",
            "embedding_endpoint": "https://generativelanguage.googleapis.com",
            "embedding_dim": 768,
        },
        "meta_agent_config": {
            "agents": [
                {
                    "core_memory_agent": {
                        "blocks": [
                            {"label": "human", "value": ""},
                            {"label": "persona", "value": "I am a helpful assistant."},
                        ]
                    }
                },
                "resource_memory_agent",
                "semantic_memory_agent",
                "episodic_memory_agent",
                "procedural_memory_agent",
                "knowledge_vault_memory_agent",
            ],
        },
    }
)

client.add(
    user_id="demo-user",
    messages=[
        {"role": "user", "content": [{"type": "text", "text": "The moon now has a president."}]},
        {"role": "assistant", "content": [{"type": "text", "text": "Noted."}]},
    ],
)

memories = client.retrieve_with_conversation(
    user_id="demo-user",
    messages=[
        {"role": "user", "content": [{"type": "text", "text": "What did we discuss on MirixDB in last 4 days?"}]},
    ],
    limit=5,
)
print(memories)
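The messages passed to add and retrieve_with_conversation share the same role/content shape; a small helper (not part of mirix-client, just a convenience sketch) can build those payloads:

```python
def text_message(role: str, text: str) -> dict:
    """Build a message dict in the role/content structure the client expects."""
    return {"role": role, "content": [{"type": "text", "text": text}]}

# Equivalent to the hand-written payloads in the example above:
messages = [
    text_message("user", "The moon now has a president."),
    text_message("assistant", "Noted."),
]
```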
For more API examples, see samples/run_client.py.
License
Mirix is released under the Apache License 2.0. See the LICENSE file for more details.
Contact
For questions, suggestions, or issues, please open an issue on the GitHub repository or contact us at founders@mirix.io
Join Our Community
Connect with other Mirix users, share your thoughts, and get support:
💬 Discord Community
Join our Discord server for real-time discussions, support, and community updates: https://discord.gg/S6CeHNrJ
🎯 Weekly Discussion Sessions
We host weekly discussion sessions where you can:
- Discuss issues and bugs
- Share ideas about future directions
- Get general consultations and support
- Connect with the development team and community
📅 Schedule: Friday nights, 8-9 PM PST
🔗 Zoom Link: https://ucsd.zoom.us/j/96278791276
📱 WeChat Group
Add the WeChat account ari_asm to be added to the group chat.
Acknowledgement
We would like to thank Letta for open-sourcing their framework, which served as the foundation for the memory system in this project.
File details
Details for the file jl_ecms_server-0.48.0.tar.gz.
File metadata
- Download URL: jl_ecms_server-0.48.0.tar.gz
- Upload date:
- Size: 541.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2bbbf3ce1b32f4e48f7a6862ce289d2281bcfb4b97cfd2db0c33c4e958daeb51 |
| MD5 | 7cb64108d17479bc38666ed83ffb630c |
| BLAKE2b-256 | b15b0e60820a45c6c04ba224fb9868cff321b7e64a1050374d69ddf5277c6636 |
File details
Details for the file jl_ecms_server-0.48.0-py3-none-any.whl.
File metadata
- Download URL: jl_ecms_server-0.48.0-py3-none-any.whl
- Upload date:
- Size: 657.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 663a692b3428e4754fac66f9fa8fe31f426d7b5d94679755dfe2fa820aac6811 |
| MD5 | ee95ba9f91d7e8b4076b05bff17d5b9a |
| BLAKE2b-256 | 80560f5a1753d4f227ae0edd3d618915d4ef459ddb41facb23298af087feab5b |