Long-term memory for AI Agents with Azure DefaultAzureCredential authentication and MySQL history database support. Async-only API.
Mem0 - Azure Enhanced Fork
This repository is an enhanced fork of mem0ai/mem0 that provides enterprise-grade improvements for Azure environments and production deployments.
Key Enhancements
1. Async-Only API
- AsyncMemory: This fork provides only the async AsyncMemory class for better performance and scalability
- Simplified Codebase: The synchronous Memory class has been removed to reduce code complexity and maintenance burden
- Modern Python: Built for async/await patterns with full asyncio support
2. Azure Entra ID Authentication
- Azure AI Search: Support for Azure Entra ID (Azure AD) authentication using DefaultAzureCredential
- Azure OpenAI: Seamless Entra ID integration for both LLM and embedding services
- Simplified Authentication: No need to manage API keys when using managed identities or service principals
3. MySQL Database Support
- Production-Ready: Replaces SQLite with enterprise-grade MySQL for scalable memory history storage
- Connection Pooling: Built-in connection pooling and SSL support for secure connections
- Thread-Safe: Thread-safe operations with proper connection management
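The pooling pattern behind the MySQL backend can be illustrated with a stdlib-only sketch. This is not the fork's actual implementation (which uses MySQL connections with SSL); it only shows how a bounded, thread-safe pool hands out and reuses connections:

```python
import queue

class ConnectionPool:
    """Illustrative thread-safe pool: hands out connections from a bounded
    queue, so at most `size` connections ever exist at once."""

    def __init__(self, factory, size=5):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())

    def acquire(self, timeout=None):
        # Blocks until a connection is free; Queue's internal lock
        # makes this safe to call from multiple threads.
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        # Return the connection for reuse instead of closing it.
        self._pool.put(conn)

# Demo with stand-in "connection" objects:
pool = ConnectionPool(factory=object, size=2)
c1 = pool.acquire()
c2 = pool.acquire()
pool.release(c1)
c3 = pool.acquire()
print(c3 is c1)  # True: the released connection is reused
```

A real pool would also validate connections on acquire and reconnect on failure; the bounded-queue structure is the core of the thread-safety claim.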
Installation
Install the enhanced package with Azure and MySQL dependencies:
pip install mem0ai-azure-mysql
Or with uv:
uv add mem0ai-azure-mysql
Quick Start
import asyncio

from mem0 import AsyncMemory

config = {
    "vector_store": {
        "provider": "azure_ai_search",
        "config": {
            "collection_name": "mem0",
            "service_name": "your-search-service",
            "embedding_model_dims": 1536,
            "azure_ad_token": "<your-token>",  # Or use DefaultAzureCredential
        },
    },
    "llm": {
        "provider": "azure_openai",
        "config": {
            "model": "gpt-4",
            "azure_kwargs": {
                "api_version": "2024-12-01-preview",
                "azure_deployment": "gpt-4",
                "azure_endpoint": "https://your-endpoint.openai.azure.com/",
                "azure_ad_token": "<your-token>",
            },
        },
    },
    "embedder": {
        "provider": "azure_openai",
        "config": {
            "model": "text-embedding-3-small",
            "embedding_dims": 1536,
            "azure_kwargs": {
                "api_version": "2024-12-01-preview",
                "azure_deployment": "text-embedding-3-small",
                "azure_endpoint": "https://your-endpoint.openai.azure.com/",
                "azure_ad_token": "<your-token>",
            },
        },
    },
    "db": {
        "provider": "mysql",
        "config": {
            "host": "your-mysql-server.mysql.database.azure.com",
            "port": 3306,
            "user": "mem0",
            "password": "<your-password>",
            "database": "mem0",
            "ssl_enabled": True,
        },
    },
}


async def main():
    memory = await AsyncMemory.from_config(config)

    # Add memories
    result = await memory.add(
        "I love playing tennis on weekends",
        user_id="user123",
    )
    print(result)

    # Search memories
    results = await memory.search("What sports do I like?", user_id="user123")
    print(results)


asyncio.run(main())
Development
Setup with uv (Recommended)
# Install dependencies with dev extras
uv sync --extra dev
# Run example
uv run python example.py
# Run tests
uv run pytest tests/
Setup with hatch
# Create environment
make install
# Run tests
make test
Available Commands
make format # Format code with ruff
make lint # Lint code with ruff
make test # Run tests
make build # Build package
make clean # Clean build artifacts
API Reference
AsyncMemory
The main class for interacting with the memory system.
from mem0 import AsyncMemory
# Create from config
memory = await AsyncMemory.from_config(config)
# Add memories
await memory.add(messages, user_id="user123")
# Search memories
await memory.search(query, user_id="user123")
# Get all memories
await memory.get_all(user_id="user123")
# Delete memories
await memory.delete_all(user_id="user123")
# Get memory by ID
await memory.get(memory_id)
# Update memory
await memory.update(memory_id, new_data)
# Get history
await memory.history(memory_id)
# Reset all
await memory.reset()
Probe
An automated availability probe validates Mem0's backing services every 5 minutes via Azure Container App Jobs. It runs 4 sequential test cases (fail-fast):
- init — AsyncMemory.from_config() (Key Vault, AI Search, MySQL, Neo4j, OpenAI)
- add_memory — memory.add() (LLM, embedder, vector write, history, graph)
- search_memory — memory.search() (embedder, vector search, graph search)
- cleanup — memory.delete_all() (vector/history/graph delete)
Running locally
# Requires Azure credentials (az login)
uv run python probe/probe.py --env test
uv run python probe/probe.py --env prod
Pipeline
The probe pipeline (devops/pipelines/deploy_probe/deploy.yaml) builds the probe image and deploys it to both regions (southeastasia + westus3) as Container App Jobs, with failure and no-data alert rules.
License
This project is licensed under the Apache License 2.0 - see the original mem0 repository for details.
File details
Details for the file mem0ai_azure_mysql-0.2.4.tar.gz.
File metadata
- Download URL: mem0ai_azure_mysql-0.2.4.tar.gz
- Upload date:
- Size: 48.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.8 {"installer":{"name":"uv","version":"0.11.8","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | bbbf4b864695995306089c5a64235369d83293e7aeac4382de21512ca77febe7 |
| MD5 | fd469856a6194ad0369baad62f1b7f7b |
| BLAKE2b-256 | f67f9b6ce9dcf62d9b12dc8d8b9e9773fd90e762ab7454ff6719d97cae37003e |
File details
Details for the file mem0ai_azure_mysql-0.2.4-py3-none-any.whl.
File metadata
- Download URL: mem0ai_azure_mysql-0.2.4-py3-none-any.whl
- Upload date:
- Size: 61.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.8 {"installer":{"name":"uv","version":"0.11.8","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | d1acba9c2db50b8e6c57f8d53338d40ba6292c1e8fc38a454f2a4fd3e6c64340 |
| MD5 | 8d908eb66c67fa416fdcd3bc3a5e6ef4 |
| BLAKE2b-256 | 90261e89510c2a66d2b381b72788414df3a66526e1870f0fcdfbfaee99d62ea9 |