
Enyal

Persistent, queryable memory for AI coding agents.

Enyal gives AI agents like Claude Code durable context that survives session restarts. Every conversation becomes accumulated institutional knowledge—facts, preferences, decisions, and conventions that persist and grow.

Features

  • Persistent Memory: Context survives restarts, crashes, and process termination
  • Semantic Search: Find relevant context using natural language queries (384-dim embeddings via all-MiniLM-L6-v2)
  • Hierarchical Scoping: Global → workspace → project → file context inheritance
  • Fully Offline: Zero network calls during operation
  • Cross-Platform: macOS (Intel + Apple Silicon), Linux, and Windows
  • MCP Compatible: Works with Claude Code, Cursor, Windsurf, Kiro, and any MCP client

Quick Start

Get up and running in under 2 minutes:

1. Install

# Using uvx (recommended - no installation needed)
uvx enyal serve

# Or install with pip
pip install enyal

2. Configure Your MCP Client

Universal configuration (works with Claude Code, Cursor, Windsurf, Kiro):

{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["enyal", "serve"]
    }
  }
}

For macOS Intel users (requires Python 3.11 or 3.12):

{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["--python", "3.12", "enyal", "serve"]
    }
  }
}

3. Start Using

You: Remember that this project uses pytest for all testing
Assistant: [calls enyal_remember] Stored context about testing framework

You: What testing framework should I use?
Assistant: [calls enyal_recall] Based on stored context, this project uses pytest.

Platform Support

| Platform | Python 3.11 | Python 3.12 | Python 3.13 |
|---|---|---|---|
| macOS Apple Silicon | uvx enyal serve | uvx enyal serve | uvx enyal serve |
| macOS Intel | uvx --python 3.11 enyal serve | uvx --python 3.12 enyal serve | Not supported* |
| Linux | uvx enyal serve | uvx enyal serve | uvx enyal serve |
| Windows | uvx enyal serve | uvx enyal serve | uvx enyal serve |

*macOS Intel + Python 3.13 is not supported due to PyTorch ecosystem constraints.

Installation Methods

Method 1: uvx (Recommended for MCP)

# Most platforms (auto-selects Python)
uvx enyal serve

# macOS Intel (explicit Python version)
uvx --python 3.12 enyal serve

# With model preloading for faster first query
uvx enyal serve --preload

Method 2: pipx

# Install globally
pipx install enyal

# Run server
enyal serve

Method 3: pip

# Using uv (recommended)
uv add enyal

# Using pip
pip install enyal

# Run server
enyal serve

MCP Integration

Enyal works with any MCP-compatible client. The configuration is identical across platforms; only the arguments differ on macOS Intel, where an explicit Python version is passed to uvx.

Claude Code

File locations:

  • Project: .mcp.json (in project root)
  • User: ~/.claude/.mcp.json

Standard configuration:

{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["enyal", "serve"],
      "env": {
        "ENYAL_DB_PATH": "~/.enyal/context.db"
      }
    }
  }
}

macOS Intel configuration:

{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["--python", "3.12", "enyal", "serve"],
      "env": {
        "ENYAL_DB_PATH": "~/.enyal/context.db"
      }
    }
  }
}

CLI setup:

# Standard
claude mcp add-json enyal '{"command":"uvx","args":["enyal","serve"]}'

# macOS Intel
claude mcp add-json enyal '{"command":"uvx","args":["--python","3.12","enyal","serve"]}'

Claude Desktop

File locations:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Configuration:

{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["enyal", "serve"],
      "env": {
        "ENYAL_DB_PATH": "~/.enyal/context.db"
      }
    }
  }
}

Cursor

File locations:

  • Global: ~/.cursor/mcp.json
  • Project: .cursor/mcp.json

Configuration:

{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["enyal", "serve"],
      "env": {
        "ENYAL_DB_PATH": "~/.enyal/context.db"
      }
    }
  }
}

UI setup: File → Preferences → Cursor Settings → MCP

Windsurf

File location: ~/.codeium/windsurf/mcp_config.json

Configuration:

{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["enyal", "serve"],
      "env": {
        "ENYAL_DB_PATH": "~/.enyal/context.db"
      }
    }
  }
}

UI setup: Windsurf Settings → Cascade → MCP, or use the Plugin Store

Kiro

File locations:

  • Global: ~/.kiro/settings/mcp.json
  • Project: .kiro/settings/mcp.json

Configuration:

{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["enyal", "serve"],
      "env": {
        "ENYAL_DB_PATH": "~/.enyal/context.db"
      },
      "autoApprove": ["enyal_recall", "enyal_stats", "enyal_get"]
    }
  }
}

UI setup: Click the Kiro ghost tab → MCP Servers → "+"

See docs/INTEGRATIONS.md for detailed platform-specific guides.

Available Tools

| Tool | Description |
|---|---|
| enyal_remember | Store new context with metadata (facts, preferences, decisions, conventions, patterns) |
| enyal_recall | Semantic search for relevant context with filtering by scope and type |
| enyal_forget | Remove or deprecate context (soft-delete by default, hard-delete optional) |
| enyal_update | Update existing entries (content, confidence, tags) |
| enyal_get | Retrieve a specific entry by ID with full metadata |
| enyal_stats | Get usage statistics and health metrics |
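
As an illustration, a client invoking enyal_remember would send a standard MCP tool call. The argument names below are assumptions chosen to mirror the CLI flags, not a documented schema:

```json
{
  "name": "enyal_remember",
  "arguments": {
    "content": "This project uses pytest for all testing",
    "type": "convention",
    "scope": "project",
    "tags": ["testing", "pytest"]
  }
}
```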

Content Types

| Type | Use For | Example |
|---|---|---|
| fact | Objective information | "The database uses PostgreSQL 15" |
| preference | User/team preferences | "Prefer tabs over spaces" |
| decision | Recorded decisions | "Chose React over Vue for frontend" |
| convention | Coding standards | "All API endpoints follow REST naming" |
| pattern | Code patterns | "Error handling uses Result<T, E> pattern" |

Scope Levels

| Scope | Applies To | Example Path |
|---|---|---|
| global | All projects | (none) |
| workspace | Directory of projects | /Users/dev/projects |
| project | Single project | /Users/dev/myproject |
| file | Specific file | /Users/dev/myproject/src/auth.py |

CLI Usage

Enyal provides a command-line interface for direct interaction:

# Store context
enyal remember "Always use pytest for testing" --type convention --scope project

# Search context
enyal recall "testing framework" --limit 5

# Get entry details
enyal get <entry-id>

# View statistics
enyal stats

# Remove context
enyal forget <entry-id>

# Run MCP server
enyal serve --preload

Options:

  • --db PATH — Custom database path
  • --json — Output in JSON format

See docs/CLI.md for complete CLI reference.

Python Library

from enyal.core.store import ContextStore
from enyal.core.retrieval import RetrievalEngine
from enyal.models.context import ContextType, ScopeLevel

# Initialize store
store = ContextStore("~/.enyal/context.db")
retrieval = RetrievalEngine(store)

# Remember something
entry_id = store.remember(
    content="Always use pytest for testing in this project",
    content_type=ContextType.CONVENTION,
    scope_level=ScopeLevel.PROJECT,
    scope_path="/Users/dev/myproject",
    tags=["testing", "pytest"]
)

# Recall relevant context
results = retrieval.search(
    query="how should I write tests?",
    limit=5,
    min_confidence=0.5
)

for result in results:
    print(f"{result.score:.2f}: {result.entry.content}")

# Update context
store.update(entry_id, confidence=0.9, tags=["testing", "pytest", "unit-tests"])

# Get specific entry
entry = store.get(entry_id)

# Get statistics
stats = store.stats()
print(f"Total entries: {stats.total_entries}")

Configuration

Environment Variables

| Variable | Default | Description |
|---|---|---|
| ENYAL_DB_PATH | ~/.enyal/context.db | Database file location |
| ENYAL_PRELOAD_MODEL | false | Pre-load embedding model at startup |
| ENYAL_LOG_LEVEL | INFO | Logging level (DEBUG, INFO, WARNING, ERROR) |
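
For a sense of how these variables behave, here is a sketch of how such settings are typically read, with defaults taken from the table above. This is illustrative; enyal's actual parsing may differ:

```python
import os

# Read each variable, falling back to the documented default.
db_path = os.path.expanduser(os.environ.get("ENYAL_DB_PATH", "~/.enyal/context.db"))
preload = os.environ.get("ENYAL_PRELOAD_MODEL", "false").lower() in ("1", "true", "yes")
log_level = os.environ.get("ENYAL_LOG_LEVEL", "INFO").upper()

print(db_path, preload, log_level)
```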

Database Location

The default database is stored at ~/.enyal/context.db. This single SQLite file contains:

  • All context entries and metadata
  • Vector embeddings for semantic search
  • Full-text search index
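
Because it is a single ordinary SQLite file, you can inspect it read-only with the Python stdlib. No table names are assumed here; the schema is whatever your installed enyal version created:

```python
import os
import sqlite3

db = os.path.expanduser("~/.enyal/context.db")
if os.path.exists(db):
    # Open read-only so a running server is not disturbed.
    conn = sqlite3.connect(f"file:{db}?mode=ro", uri=True)
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
    conn.close()
    print(tables)
else:
    print("no database yet at", db)
```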

Troubleshooting

Installation Fails on macOS Intel

Symptom: Error about torch/PyTorch wheels not found

Cause: PyTorch doesn't provide wheels for macOS Intel + Python 3.13

Solution: Use Python 3.11 or 3.12:

uvx --python 3.12 enyal serve

MCP Server Not Connecting

  1. Check uvx is installed:

    uvx --version
    
  2. Test server manually:

    uvx enyal serve
    # Should start without errors, waiting for MCP protocol
    
  3. Enable debug logging:

    {
      "mcpServers": {
        "enyal": {
          "command": "uvx",
          "args": ["enyal", "serve", "--log-level", "DEBUG"]
        }
      }
    }
    
  4. Check server status:

    • Claude Code: /mcp command
    • Cursor: Settings → MCP → check status
    • Windsurf: Cascade → Plugins
    • Kiro: Ghost tab → MCP Servers

Slow First Query

The first query loads the embedding model (~80MB). This takes ~1-2 seconds. Subsequent queries are fast (~34ms).

To pre-load the model at startup:

{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["enyal", "serve", "--preload"]
    }
  }
}

Database Locked Error

If you see "database is locked" errors, ensure only one MCP server instance is running per database file. Use different ENYAL_DB_PATH values for different projects if needed.
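
For example, a project-local configuration can point Enyal at its own database file (the path below is illustrative):

```json
{
  "mcpServers": {
    "enyal": {
      "command": "uvx",
      "args": ["enyal", "serve"],
      "env": {
        "ENYAL_DB_PATH": "~/.enyal/myproject.db"
      }
    }
  }
}
```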

Permission Errors

On macOS/Linux, ensure the database directory exists and is writable:

mkdir -p ~/.enyal
chmod 755 ~/.enyal

Architecture

Enyal uses a unified SQLite database with:

  • Relational storage for metadata and attributes
  • sqlite-vec for vector similarity search (384-dim embeddings)
  • FTS5 for keyword search
  • WAL mode for concurrent access
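
WAL mode is what lets several readers query while a writer is active. A minimal stdlib demonstration of the idea (a throwaway table, not enyal's actual schema):

```python
import os
import sqlite3
import tempfile
import threading

# Create a demo database in WAL mode with one row.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("PRAGMA journal_mode=WAL")
conn.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, content TEXT)")
conn.execute("INSERT INTO entries (content) VALUES ('Always use pytest')")
conn.commit()

def read_count(results, i):
    # Each reader opens its own connection; WAL permits concurrent reads.
    c = sqlite3.connect(path)
    results[i] = c.execute("SELECT COUNT(*) FROM entries").fetchone()[0]
    c.close()

results = [None] * 4
threads = [threading.Thread(target=read_count, args=(results, i)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # [1, 1, 1, 1]
```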

See docs/ARCHITECTURE.md for detailed design decisions.

Development

# Clone repository
git clone https://github.com/seancorkum/enyal.git
cd enyal

# Install with dev dependencies
uv sync --all-extras

# Run tests
uv run pytest

# Type checking
uv run mypy src/enyal

# Linting
uv run ruff check src/enyal

Performance

Benchmarked on Intel Mac with Python 3.12:

| Metric | Target (p95) | Measured (p95) |
|---|---|---|
| Cold start (model load + first query) | <2000ms | ~1500ms |
| Warm query latency | <50ms | ~34ms |
| Write latency | <50ms | ~34ms |
| Concurrent reads (4 threads) | <150ms | ~85ms |
| Memory (100k entries, estimated) | <500MB | ~35MB |

Embedding model: all-MiniLM-L6-v2 (22M params, 384 dimensions)

Run benchmarks:

uv run python benchmarks/benchmark_performance.py

License

MIT
