
Your portable AI context. Carry your identity across every AI tool.



✦ aura
Stop re-explaining yourself to every AI tool.


Website · Quick Start · How It Works · Supported Tools · Commands · Security


Define who you are — your stack, your style, your rules — once, in plain YAML files you own. aura serves that identity to Claude, ChatGPT, Cursor, and Gemini through the Model Context Protocol. 100% local. No cloud. No lock-in.

Highlights

  • 30-second setup — pip install aura-ctx && aura quickstart scans your machine, asks 5 questions, starts serving
  • 8 AI tools supported — Claude Desktop, Claude Code, Cursor, Windsurf, VS Code, ChatGPT, Gemini CLI, Codex
  • 14 templates — aura create -t frontend, data-scientist, founder, student, and 10 more
  • Smart token delivery — 3 levels (~50, ~500, ~1000+ tokens) so AI tools only load what they need
  • Freshness scoring — see how current each fact is (0–100), per-fact and per-pack
  • Secret scanning — auto-detects leaked API keys before they reach an LLM, redacts on serve
  • CLAUDE.md / AGENTS.md export — feed your identity to Claude Code, gstack, Codex, OpenClaw
  • File watcher — aura serve --watch hot-reloads when you edit a YAML pack
  • Self-update — aura update keeps you on the latest version
  • No cloud, no telemetry, no tracking — YAML files on your machine, fully editable

Why aura

Every AI tool starts from scratch. Claude doesn't know what ChatGPT learned. Cursor doesn't know your writing style. Gemini has no idea what framework you prefer.

The industry is building solutions for this, but at the wrong layer:

| Layer | What it solves | Examples |
| --- | --- | --- |
| Memory | What happened in past conversations | Mem0, Zep, DeltaMemory |
| Context engineering | What the AI should know right now | LACP, Claudesidian, OpenClaw |
| Identity | Who you are, across everything | aura |

Memory is session history. Context is prompt engineering. Identity is who you are — your stack, your style, your rules, your role — structured, portable, and owned by you.

aura is the identity layer.

Who this is for

You use multiple AI tools daily — Claude for thinking, Cursor for coding, ChatGPT for drafting, Gemini for research. You're tired of re-explaining your stack and style to each one.

You're a developer who values control — you want your context in plain text files you can read, edit, and version-control. Not locked inside a platform.

You're building with AI, not just using it — you care about token efficiency, MCP, and how your tools talk to each other.

If you've ever pasted your coding style into a system prompt and wished it could just follow you everywhere — that's what aura does.

Quick Start

pip install -U aura-ctx
aura quickstart

Here's what happens:

✦ aura quickstart

Step 1/5 — Scanning your machine...
  ✦ Detected 12 facts about your dev environment

Step 2/5 — Quick questions about you...
  What's your role? → Full-stack dev at Acme Corp
  How do you want AI to talk to you? → 1 (Direct, no fluff)
  What are you working on? → shipping v2 of our dashboard
  Any rules or pet peeves? → No corporate jargon, always use TypeScript
  What human languages? → English and French
  ✦ Created writer (2 facts, 3 rules)
  ✦ Created work (2 facts, 0 rules)

Step 3/5 — Configuring AI tools...
  ✦ Claude Desktop configured
  ✦ Cursor configured

Step 4/5 — Security audit...
  ✦ All clean — no secrets detected

Step 5/5 — Starting MCP server...
  ✦ http://localhost:3847/mcp
  Restart your AI tools — they know you now.

30 seconds. No Docker. No database. No cloud account.

See it work

After running aura quickstart, open Claude Desktop:

You:    What do you know about me?

Claude: I don't have any information about you!
        Memory is turned off for your account.

Now restart Claude (so it connects to aura's MCP server):

You:    What do you know about me?

Claude: Here's what I know from your aura context:

        Role: CS student
        Editor: Cursor
        Stack: TypeScript, Python, React, FastAPI, Tailwind
        Projects: aura, kipedia, hotepia
        Style: Technical, precise, no hand-holding
        Rules: Always use TypeScript strict mode, no 'any'

Same question. Completely different answer. That's aura.

How It Works

  You
   │
   ├── aura scan          Detects languages, frameworks, tools, projects
   ├── aura onboard       5 questions → writing style, role, rules
   ├── aura import        Pulls context from ChatGPT & Claude exports
   │
   ▼
  Context Packs (YAML)    ~/.aura/packs/developer.yaml
   │                      ~/.aura/packs/writer.yaml
   │                      ~/.aura/packs/work.yaml
   │
   ▼
  MCP Server              localhost:3847
   │
   ├──▶ Claude Desktop    (auto-configured)
   ├──▶ Claude Code       (auto-configured)
   ├──▶ Cursor IDE        (auto-configured)
   ├──▶ Windsurf IDE      (auto-configured)
   ├──▶ VS Code           (auto-configured)
   ├──▶ ChatGPT Desktop   (SSE, manual)
   ├──▶ Gemini CLI        (auto-configured)
   └──▶ Codex CLI         (auto-configured)

What's MCP? The Model Context Protocol is an open standard that lets AI tools connect to local data sources. aura uses it so Claude, Cursor, and others can read your context without any custom integration.
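Under the hood, every MCP exchange is a JSON-RPC 2.0 message. As an illustrative sketch (not aura's internals), here is the kind of request a client would send to invoke the get_identity_card tool from the table below:

```python
import json

def build_mcp_request(method: str, params: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 message, the wire format MCP uses."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# An MCP client asking the server to run the get_identity_card tool:
payload = build_mcp_request("tools/call", {"name": "get_identity_card", "arguments": {}})
print(payload)
```

In practice your AI tool's MCP client builds and sends these messages for you; the point is that the protocol is plain JSON over HTTP, with no custom integration required.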

Context Packs

Your identity lives in scoped YAML files. Each pack covers a domain — development, writing, work, or anything custom:

# ~/.aura/packs/developer.yaml
name: developer
scope: development
facts:
  - key: languages.primary
    value: [TypeScript, Python]
    type: skill
    confidence: high
  - key: editor
    value: Cursor
    type: preference
  - key: frameworks
    value: [Next.js, FastAPI, Tailwind, Supabase]
    type: skill
  - key: style.code
    value: "Explicit types, functional patterns, minimal comments"
    type: style
rules:
  - instruction: Always use TypeScript strict mode — no 'any'
    priority: 9
  - instruction: Dark theme by default, CSS variables for all colors
    priority: 8
  - instruction: Error handling with specific types, not generic catches
    priority: 7

You own these files. Human-readable. Git-friendly. They never leave your machine unless you choose otherwise.
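For illustration, the pack above maps naturally onto a few small records. This sketch uses stdlib dataclasses; aura's actual schema.py is built on Pydantic, and its field names and defaults may differ:

```python
from dataclasses import dataclass, field

@dataclass
class Fact:
    key: str
    value: object             # a string or a list, as in the YAML above
    type: str = "fact"        # e.g. skill / preference / style
    confidence: str = "medium"

@dataclass
class Rule:
    instruction: str
    priority: int = 5         # hypothetical semantics: higher = more important

@dataclass
class ContextPack:
    name: str
    scope: str
    facts: list = field(default_factory=list)
    rules: list = field(default_factory=list)

pack = ContextPack(
    name="developer",
    scope="development",
    facts=[Fact("editor", "Cursor", type="preference")],
    rules=[Rule("Always use TypeScript strict mode — no 'any'", priority=9)],
)
```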

Three-Level Token Delivery

AI tools have limited context windows. aura serves your identity at the right depth:

| Level | MCP Tool | Tokens | When |
| --- | --- | --- | --- |
| 1 | get_identity_card | ~50–100 | Auto-called at conversation start |
| 2 | get_user_profile | ~200–500 | When the AI needs more detail |
| 3 | get_all_context | ~1000+ | Only when explicitly asked |

The server instructs AI clients to start with the identity card and drill down only when needed. Most conversations never need the full dump.
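The level selection can be pictured as a simple lookup plus a rough size estimate. This is a hedged sketch, not aura's actual logic; the 4-characters-per-token heuristic is a common approximation for English text:

```python
def rough_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def pick_payload(level: int, card: str, profile: str, full: str) -> str:
    """Return the context slice for the requested depth (1, 2, or 3)."""
    return {1: card, 2: profile, 3: full}[level]

card = "Role: full-stack dev. Stack: TypeScript, Python."
profile = card + " Style: direct, no fluff. Rules: strict mode, no 'any'."
assert rough_tokens(card) < rough_tokens(profile)  # deeper level, more tokens
```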

Supported Tools

| Tool | Setup | Transport |
| --- | --- | --- |
| Claude Desktop | aura setup — auto | Streamable HTTP |
| Claude Code | aura setup — auto | Streamable HTTP |
| Cursor IDE | aura setup — auto | Streamable HTTP |
| Windsurf IDE | aura setup — auto | Streamable HTTP |
| VS Code | aura setup — auto | Copilot MCP |
| Gemini CLI | aura setup — auto | SSE |
| Codex CLI | aura setup — auto | Streamable HTTP |
| ChatGPT Desktop | Developer Mode → add SSE URL | SSE |
| Any MCP client | Point to localhost:3847 | HTTP or SSE |

aura setup   # auto-configures all detected tools (alias: aura install)
aura serve   # starts MCP server on localhost:3847
Claude Desktop

Auto-configured by aura setup. Manual config:

{
  "mcpServers": {
    "aura": { "url": "http://localhost:3847/mcp" }
  }
}
Cursor IDE

Auto-configured by aura setup. Manual config:

{
  "mcpServers": {
    "aura": { "url": "http://localhost:3847/mcp" }
  }
}
ChatGPT Desktop

Settings → Connectors → Advanced → Developer Mode:

SSE URL: http://localhost:3847/sse
Gemini CLI

Auto-configured by aura setup. Manual config:

{
  "mcpServers": {
    "aura": { "uri": "http://localhost:3847/sse" }
  }
}

Commands

Getting started

| Command | What it does |
| --- | --- |
| aura quickstart | Full setup: scan → onboard → setup → audit → serve |
| aura scan | Auto-detect your stack from tools, repos, and config files |
| aura onboard | 5 questions to generate your context packs |
| aura setup | Auto-configure all detected AI tools (8 supported) |
| aura install | Alias for aura setup |
| aura serve | Start the MCP server |
| aura serve --watch | Start with hot-reload on YAML changes |
| aura update | Update aura to the latest PyPI version |
| aura version | Show current version + check for updates |

Managing packs

| Command | What it does |
| --- | --- |
| aura list | List all context packs |
| aura show <pack> | Display a pack's contents |
| aura add <pack> <key> <value> | Add a fact without editing YAML |
| aura edit <pack> | Open a pack in $EDITOR |
| aura create <name> | Create a new empty pack |
| aura create <name> -t <template> | Create from a built-in template |
| aura templates | List all 14 available templates |
| aura delete <pack> | Delete a pack |
| aura diff <a> <b> | Compare two packs |

Templates

14 built-in templates to get started fast. Each includes facts and AI interaction rules tailored to the profile.

Stack-specific: frontend, backend, data-scientist, mobile, devops, ai-builder

Role-specific: founder, student, marketer, designer

General-purpose: developer, writer, researcher, work

aura templates                         # list all available templates
aura create mydev -t frontend          # create a frontend dev pack
aura create research -t data-scientist # create a data science pack
aura create study -t student           # create a student pack

Every template is a starting point. Edit the generated YAML to match your actual stack and preferences.

Health & maintenance

| Command | What it does |
| --- | --- |
| aura doctor | Check pack health — bloat, stale facts, duplicates, secrets |
| aura audit | Scan packs for leaked API keys, tokens, credentials |
| aura audit --fix | Auto-redact critical secrets |
| aura consolidate | Merge duplicate facts, find contradictions across packs |
| aura decay | Remove expired facts based on type-aware TTL |
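How type-aware expiry and freshness scoring might interact can be sketched like this; the TTL values here are hypothetical, not aura's real per-type lifetimes:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-type lifetimes in days; aura's real TTLs may differ.
TTL_DAYS = {"project": 90, "preference": 365, "skill": 730}

def freshness(last_updated: datetime, fact_type: str, now: datetime) -> int:
    """Score 0-100: 100 = just updated, 0 = at or past its type's TTL."""
    ttl = timedelta(days=TTL_DAYS.get(fact_type, 180))
    remaining = 1 - (now - last_updated) / ttl
    return round(100 * min(1.0, max(0.0, remaining)))

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
print(freshness(now - timedelta(days=45), "project", now))  # halfway through a 90-day TTL -> 50
```

A decay pass would then simply drop (or flag) facts whose score has reached 0, while a "project" fact goes stale far sooner than a "skill" fact.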

Import & export

| Command | What it does |
| --- | --- |
| aura import -s chatgpt <file> | Import from a ChatGPT data export |
| aura import -s claude <file> | Import from a Claude data export |
| aura extract <file> | Extract facts from conversations using a local LLM |
| aura export <pack> -f system-prompt | Universal LLM system prompt |
| aura export <pack> -f cursorrules | .cursorrules file |
| aura export <pack> -f chatgpt | ChatGPT custom instructions |
| aura export <pack> -f claude | Claude memory statements |
| aura export <pack> -f claude-md | CLAUDE.md section (Claude Code, gstack) |
| aura export <pack> -f agents-md | AGENTS.md section (Codex, OpenClaw) |

Security

aura is local-first. Your context never leaves your machine.

aura serve                              # localhost only, open
aura serve --token my-secret            # require Bearer token
aura serve --packs developer,writer     # expose only specific packs
aura serve --read-only                  # block all writes via MCP
aura serve --watch                      # auto-reload on pack changes

Secret detection — aura audit scans every fact and rule for leaked credentials before they reach an LLM. Catches 30+ patterns: AWS keys, GitHub tokens, OpenAI/Anthropic API keys, Slack tokens, database URLs, private keys, Bearer tokens, and more. The MCP server scrubs critical secrets automatically at serve time — even if you forget to audit.
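As a sketch of how pattern-based detection works, here are two of the well-known credential shapes (AWS access keys and GitHub personal access tokens); aura's real audit engine covers 30+ patterns and its internals may differ:

```python
import re

# Two widely documented credential formats; illustrative, not aura's full set.
PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_token": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
}

def redact(text: str) -> str:
    """Replace anything matching a known secret pattern with a placeholder."""
    for name, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{name}]", text)
    return text

print(redact("deploy key: AKIAABCDEFGHIJKLMNOP"))
# → deploy key: [REDACTED:aws_access_key]
```

Redacting at serve time rather than only at audit time means a leaked key in a pack never reaches the model, even if you skip aura audit.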

  • Binds to 127.0.0.1 only — not reachable from the network
  • Optional Bearer token auth (--token or AURA_TOKEN env var)
  • Scoped serving — control which packs each tool sees
  • Read-only mode — AI reads your context, never writes to it
  • No telemetry. No analytics. No cloud. No tracking.

Architecture

aura/
├── cli.py           # 24 commands (Typer + Rich)
├── schema.py        # ContextPack, Fact, Rule (Pydantic)
├── mcp_server.py    # FastAPI MCP server (HTTP + SSE)
├── scanner.py       # Machine scanner with incremental hashing
├── onboard.py       # Interactive onboarding
├── pack.py          # Pack CRUD + templates
├── audit.py         # Secret detection engine (30+ patterns)
├── freshness.py     # Staleness scoring (0–100 per fact)
├── version_check.py # PyPI update checker with daily cache
├── scan_cache.py    # SHA-256 content hashing for fast re-scans
├── watcher.py       # File watcher for hot-reload
├── doctor.py        # Pack health checker
├── consolidate.py   # Dedup + contradiction detection
├── extractor.py     # LLM-based extraction (Ollama / OpenAI)
├── diff.py          # Pack comparison
├── setup.py         # Auto-config for 8 AI tools
├── exporters/       # system-prompt, cursorrules, chatgpt, claude, claude-md, agents-md
└── importers/       # ChatGPT + Claude data importers

8,500+ lines of Python · 151 tests · 24 commands · 14 templates · MIT license

Roadmap

Shipped

  • Machine scanner — languages, frameworks, tools, projects, git identity
  • Context packs with typed facts, confidence levels, sources
  • MCP server — resources, tools, prompt templates
  • Auto-config for 8 AI tools (Claude Desktop, Claude Code, Cursor, Windsurf, VS Code, Gemini CLI, Codex, ChatGPT)
  • Token auth, scoped serving, read-only mode
  • Import from ChatGPT + Claude data exports
  • LLM-based extraction (Ollama, OpenAI)
  • Pack health checker + consolidation engine
  • Memory decay with type-aware TTL
  • Secret detection and auto-redaction
  • Incremental scan with content hashing
  • File watcher (aura serve --watch)
  • Three-level token delivery
  • 14 built-in templates
  • CLAUDE.md / AGENTS.md exporters (Claude Code, gstack, Codex, OpenClaw)
  • Freshness scoring — per-fact and per-pack (0–100)
  • Self-update (aura update) with daily version check

Next

  • JSON Schema spec for context packs
  • Usage-based fact priority
  • Per-agent permissions
  • Chrome extension — capture context from browser AI conversations
  • Share via GitHub Gist
  • Cloud sync (opt-in, end-to-end encrypted)
  • Team sharing

Contributing

git clone https://github.com/WozGeek/aura-ctx.git
cd aura-ctx
pip install -e ".[dev]"
pytest

Good first issues:

  • New export format — add Windsurf or Continue.dev support (guide)
  • New importer — Gemini history export parsing
  • Pack templates — create domain-specific starter packs
  • JSON Schema — publish context-pack.schema.json to formalize the pack format
  • Translations — translate this README to French, Spanish, Portuguese, or Chinese

See CONTRIBUTING.md for the full guide.

License

MIT — © Enoch Afanwoubo

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aura_ctx-0.3.3.tar.gz (86.5 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

aura_ctx-0.3.3-py3-none-any.whl (79.5 kB)

Uploaded Python 3

File details

Details for the file aura_ctx-0.3.3.tar.gz.

File metadata

  • Download URL: aura_ctx-0.3.3.tar.gz
  • Upload date:
  • Size: 86.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for aura_ctx-0.3.3.tar.gz
Algorithm Hash digest
SHA256 74c7ea89aed7780c79f36937a0653bfe6194ca299287bdbad6f21e922268817a
MD5 0522efe752744c48df52858cef3969ba
BLAKE2b-256 c266649a82f9d2864d589ec8a2986dc743d066cc2353687698ee64dc3f06b016

See more details on using hashes here.

File details

Details for the file aura_ctx-0.3.3-py3-none-any.whl.

File metadata

  • Download URL: aura_ctx-0.3.3-py3-none-any.whl
  • Upload date:
  • Size: 79.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for aura_ctx-0.3.3-py3-none-any.whl
Algorithm Hash digest
SHA256 28a2539d42def56921c29adde3c44aefd35a2582f2edc160a36ecd0e88cc6b71
MD5 d49ce364f52a596be77ef671e0ba4737
BLAKE2b-256 c2a813f2585c7064d98148ba2367586bcee5d4cdf1c7ffb1f65970c5d55dc377

See more details on using hashes here.
