Your portable AI context. Carry your identity across every AI tool.

✦ aura
Your AI identity, portable across every tool.


Website · Quick Start · How It Works · Supported Tools · Commands · Security


Why aura

Every AI tool starts from scratch. Claude doesn't know what ChatGPT learned. Cursor doesn't know your writing style. Gemini has no idea what framework you prefer. You re-explain yourself — every session, every tool, every time.

The industry is building memory and context solutions, but they solve the wrong layer:

Layer                 What it solves                        Examples
Memory                What happened in past conversations   Mem0, Zep, DeltaMemory
Context engineering   What the AI should know right now     LACP, Claudesidian, OpenClaw
Identity              Who you are, across everything        aura

Memory is session history. Context is prompt engineering. Identity is who you are — your stack, your style, your rules, your role — structured, portable, and owned by you.

aura is the identity layer. One CLI. One source of truth. Every AI tool.

Quick Start

pip install aura-ctx
aura quickstart

quickstart scans your machine, asks 5 questions, configures your AI tools, runs a security audit, and starts serving — one command.

30 seconds. No Docker. No database. No cloud account.

How It Works

You
 │
 ├── aura scan          Detects languages, frameworks, tools, projects
 ├── aura onboard       5 questions → writing style, role, rules
 ├── aura import        Pulls context from ChatGPT & Claude exports
 │
 ▼
Context Packs (YAML)    ~/.aura/packs/developer.yaml
 │                      ~/.aura/packs/writer.yaml
 │                      ~/.aura/packs/work.yaml
 │
 ▼
MCP Server              localhost:3847
 │
 ├──▶ Claude Desktop    (auto-configured)
 ├──▶ ChatGPT Desktop   (SSE)
 ├──▶ Cursor IDE        (auto-configured)
 └──▶ Gemini CLI        (auto-configured)

Context Packs

Your identity is organized into scoped YAML files. Each pack covers a domain — development, writing, work, or anything custom:

# ~/.aura/packs/developer.yaml
name: developer
scope: development
facts:
  - key: languages.primary
    value: [TypeScript, Python]
    type: skill
    confidence: high
  - key: editor
    value: Cursor
    type: preference
  - key: frameworks
    value: [Next.js, FastAPI, Tailwind, Supabase]
    type: skill
rules:
  - instruction: Always use TypeScript strict mode — no 'any'
    priority: 9
  - instruction: Dark theme by default, CSS variables for all colors
    priority: 8

You own these files. Human-readable. Git-friendly. They never leave your machine unless you choose otherwise.
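For illustration, a pack like the one above maps onto plain Python structures resembling the ContextPack, Fact, and Rule models named in the architecture section. This is a minimal stdlib sketch with dataclasses standing in for aura's actual Pydantic models; the field names and helper method are assumptions:

```python
from dataclasses import dataclass, field

# Illustrative stand-ins for aura's ContextPack / Fact / Rule models
# (the real package uses Pydantic; these field names are assumptions).
@dataclass
class Fact:
    key: str
    value: object
    type: str = "preference"
    confidence: str = "medium"

@dataclass
class Rule:
    instruction: str
    priority: int = 5

@dataclass
class ContextPack:
    name: str
    scope: str
    facts: list = field(default_factory=list)
    rules: list = field(default_factory=list)

    def top_rules(self, n=3):
        """Highest-priority rules first: what a client would inject."""
        return sorted(self.rules, key=lambda r: -r.priority)[:n]

pack = ContextPack(
    name="developer",
    scope="development",
    facts=[Fact("languages.primary", ["TypeScript", "Python"], "skill", "high")],
    rules=[
        Rule("Always use TypeScript strict mode (no 'any')", 9),
        Rule("Dark theme by default, CSS variables for all colors", 8),
    ],
)
print(pack.top_rules(1)[0].instruction)
```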

Three-Level Token Delivery

AI tools have limited context windows. aura serves your identity at the right depth:

Level   MCP Tool            Tokens     Use
1       get_identity_card   ~50–100    Auto-called at conversation start
2       get_user_profile    ~200–500   When the AI needs more detail
3       get_all_context     ~1000+     Only when explicitly asked

The server instructs AI clients to start with the identity card and drill down only when needed.
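The drill-down policy can be sketched as a simple budget check. The tool names come from the table above; the token thresholds and the explicit_request flag are illustrative assumptions, not aura's actual logic:

```python
# Sketch of the three-level delivery policy. Tool names are from the
# table above; the thresholds here are assumed for illustration.
def pick_tool(token_budget: int, explicit_request: bool = False) -> str:
    if explicit_request:
        return "get_all_context"      # Level 3: ~1000+ tokens
    if token_budget >= 500:
        return "get_user_profile"     # Level 2: ~200–500 tokens
    return "get_identity_card"        # Level 1: ~50–100 tokens

print(pick_tool(100))           # conversation start: identity card only
print(pick_tool(2000))          # roomy budget: fuller profile
print(pick_tool(2000, True))    # explicitly asked: everything
```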

Supported Tools

Tool              Setup                           Transport
Claude Desktop    aura setup — auto               Streamable HTTP
Cursor IDE        aura setup — auto               Streamable HTTP
Gemini CLI        aura setup — auto               SSE
ChatGPT Desktop   Developer Mode → add SSE URL    SSE
Any MCP client    Point to localhost:3847         HTTP or SSE

aura setup   # writes config for all detected tools
aura serve   # starts MCP server on localhost:3847

Claude Desktop

Auto-configured by aura setup. Manual config:

{
  "mcpServers": {
    "aura": { "url": "http://localhost:3847/mcp" }
  }
}

Cursor IDE

Auto-configured by aura setup. Manual config:

{
  "mcpServers": {
    "aura": { "url": "http://localhost:3847/mcp" }
  }
}

ChatGPT Desktop

Settings → Connectors → Advanced → Developer Mode:

SSE URL: http://localhost:3847/sse

Gemini CLI

Auto-configured by aura setup. Manual config:

{
  "mcpServers": {
    "aura": { "url": "http://localhost:3847/sse" }
  }
}

Commands

Getting started

Command              What it does
aura quickstart      Full setup: scan → onboard → setup → audit → serve
aura scan            Auto-detect your stack from tools, repos, and config files
aura onboard         5 questions to generate your context packs
aura setup           Auto-configure Claude Desktop, Cursor, Gemini
aura serve           Start the MCP server
aura serve --watch   Start with hot-reload on YAML changes

Managing packs

Command                            What it does
aura list                          List all context packs
aura show <pack>                   Display a pack's contents
aura add <pack> <key> <value>      Add a fact without editing YAML
aura edit <pack>                   Open a pack in $EDITOR
aura create <name>                 Create a new empty pack
aura create <name> -t <template>   Create from a built-in template
aura delete <pack>                 Delete a pack
aura diff <a> <b>                  Compare two packs
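A pack comparison like aura diff boils down to a key-wise set difference over facts. The sketch below is a hypothetical reimplementation over plain dicts, not aura's actual algorithm:

```python
def diff_packs(a: dict, b: dict):
    """Compare two packs by fact key: only in a, only in b, or conflicting."""
    fa = {f["key"]: f["value"] for f in a.get("facts", [])}
    fb = {f["key"]: f["value"] for f in b.get("facts", [])}
    only_a = sorted(fa.keys() - fb.keys())
    only_b = sorted(fb.keys() - fa.keys())
    changed = sorted(k for k in fa.keys() & fb.keys() if fa[k] != fb[k])
    return only_a, only_b, changed

dev = {"facts": [{"key": "editor", "value": "Cursor"},
                 {"key": "frameworks", "value": ["Next.js"]}]}
work = {"facts": [{"key": "editor", "value": "VS Code"},
                  {"key": "timezone", "value": "UTC"}]}
print(diff_packs(dev, work))  # → (['frameworks'], ['timezone'], ['editor'])
```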

Health & maintenance

Command            What it does
aura doctor        Check pack health — bloat, stale facts, duplicates, secrets
aura audit         Scan packs for leaked API keys, tokens, credentials
aura audit --fix   Auto-redact critical secrets
aura consolidate   Merge duplicate facts, find contradictions across packs
aura decay         Remove expired facts based on type-aware TTL
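Type-aware TTL decay can be sketched as a filter over fact timestamps: each fact type gets its own lifetime. The TTL values and the "updated" field below are illustrative assumptions, not aura's real defaults:

```python
from datetime import datetime, timedelta

# Illustrative type-aware TTLs (aura's actual defaults may differ):
# skills change slowly, preferences drift, project facts go stale fast.
TTL = {"skill": timedelta(days=365),
       "preference": timedelta(days=180),
       "project": timedelta(days=30)}

def decay(facts, now=None):
    """Drop facts whose type-specific TTL has expired."""
    now = now or datetime.now()
    return [f for f in facts
            if now - f["updated"] <= TTL.get(f["type"], timedelta(days=90))]

now = datetime(2025, 6, 1)
facts = [
    {"key": "languages.primary", "type": "skill",
     "updated": datetime(2024, 9, 1)},   # 273 days old: within skill TTL
    {"key": "current.sprint", "type": "project",
     "updated": datetime(2025, 3, 1)},   # 92 days old: past project TTL
]
print([f["key"] for f in decay(facts, now)])  # → ['languages.primary']
```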

Import & export

Command                              What it does
aura import -s chatgpt <file>        Import from a ChatGPT data export
aura import -s claude <file>         Import from a Claude data export
aura extract <file>                  Extract facts from conversations using a local LLM
aura export <pack> -f system-prompt  Universal LLM system prompt
aura export <pack> -f cursorrules    .cursorrules file
aura export <pack> -f chatgpt        ChatGPT custom instructions
aura export <pack> -f claude         Claude memory statements
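A system-prompt export is essentially string assembly over facts and rules. The output format below is a guess for illustration, not aura's actual template:

```python
def to_system_prompt(pack: dict) -> str:
    """Render a pack as a universal system prompt.
    Layout is illustrative; aura's exporter output may differ."""
    lines = [f"You are assisting a user with the '{pack['name']}' profile."]
    for f in pack.get("facts", []):
        lines.append(f"- {f['key']}: {f['value']}")
    # Highest-priority rules first, so clients that truncate keep the top ones.
    for r in sorted(pack.get("rules", []), key=lambda r: -r["priority"]):
        lines.append(f"- RULE (p{r['priority']}): {r['instruction']}")
    return "\n".join(lines)

pack = {"name": "developer",
        "facts": [{"key": "editor", "value": "Cursor"}],
        "rules": [{"instruction": "Use TypeScript strict mode", "priority": 9}]}
print(to_system_prompt(pack))
```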

Security

aura is local-first. Your context never leaves your machine.

aura serve                              # localhost only, open
aura serve --token my-secret            # require Bearer token
aura serve --packs developer,writer     # expose only specific packs
aura serve --read-only                  # block all writes via MCP
aura serve --watch                      # auto-reload on pack changes

Secret detection: aura audit scans every fact and rule for leaked credentials before they reach an LLM. It catches 30+ patterns: AWS keys, GitHub tokens, OpenAI/Anthropic API keys, Slack tokens, database URLs, private keys, Bearer tokens, and more. The MCP server scrubs critical secrets automatically at serve time, even if you forget to audit.
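A scanner of this kind boils down to a table of regexes plus a redaction pass. The three patterns below are a small illustrative subset, not aura's actual 30+ pattern set:

```python
import re

# A few of the token shapes a secret scanner can catch.
# Patterns are illustrative, not aura's actual rule set.
PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_token":   re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "private_key":    re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),
}

def scan(text: str):
    """Return the names of all patterns that match."""
    return [name for name, pat in PATTERNS.items() if pat.search(text)]

def redact(text: str):
    """Replace every match with a placeholder before serving."""
    for pat in PATTERNS.values():
        text = pat.sub("[REDACTED]", text)
    return text

fact = "deploy key is AKIA" + "A" * 16  # synthetic, not a real credential
print(scan(fact))    # → ['aws_access_key']
print(redact(fact))
```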

  • Binds to 127.0.0.1 only — not reachable from the network
  • Optional Bearer token auth (--token or AURA_TOKEN env var)
  • Scoped serving — control which packs each tool sees
  • Read-only mode — AI reads your context, never writes to it
  • No telemetry. No analytics. No cloud. No tracking.

Architecture

aura/
├── cli.py           # 22 commands (Typer + Rich)
├── schema.py        # ContextPack, Fact, Rule (Pydantic)
├── mcp_server.py    # FastAPI MCP server (HTTP + SSE)
├── scanner.py       # Machine scanner with incremental hashing
├── onboard.py       # Interactive onboarding
├── pack.py          # Pack CRUD + templates
├── audit.py         # Secret detection engine (30+ patterns)
├── scan_cache.py    # SHA-256 content hashing for fast re-scans
├── watcher.py       # File watcher for hot-reload
├── doctor.py        # Pack health checker
├── consolidate.py   # Dedup + contradiction detection
├── extractor.py     # LLM-based extraction (Ollama / OpenAI)
├── diff.py          # Pack comparison
├── setup.py         # Auto-config for Claude, Cursor, Gemini
├── exporters/       # system-prompt, cursorrules, chatgpt, claude
└── importers/       # ChatGPT + Claude data importers

7,600+ lines of Python · 151 tests · 22 commands · MIT license
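The scan cache in scan_cache.py (per the tree above) presumably hashes file contents so re-scans can skip unchanged files. A minimal sketch, assuming a simple path-to-digest dict as the cache:

```python
import hashlib
from pathlib import Path

def content_hash(path: Path) -> str:
    """SHA-256 of file contents, read in chunks to handle large files."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_files(paths, cache: dict) -> list:
    """Return only files whose digest differs from the cached one,
    updating the cache as we go."""
    out = []
    for p in paths:
        digest = content_hash(p)
        if cache.get(str(p)) != digest:
            cache[str(p)] = digest
            out.append(p)
    return out
```

On a second run with an unchanged file, changed_files returns nothing, which is what makes re-scans fast.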

Roadmap

Shipped

  • Machine scanner — languages, frameworks, tools, projects, git identity
  • Context packs with typed facts, confidence levels, sources
  • MCP server — resources, tools, prompt templates
  • Auto-config for Claude Desktop, Cursor, Gemini CLI
  • ChatGPT Desktop support via SSE
  • Token auth, scoped serving, read-only mode
  • Import from ChatGPT + Claude data exports
  • LLM-based extraction (Ollama, OpenAI)
  • Pack health checker + consolidation engine
  • Memory decay with type-aware TTL
  • Secret detection and auto-redaction
  • Incremental scan with content hashing
  • File watcher (aura serve --watch)
  • Three-level token delivery

Next

  • TypeScript / npm package — npx aura-ctx
  • JSON Schema spec for context packs
  • Usage-based fact priority
  • Per-agent permissions
  • Pack templates (--template frontend, --template data-scientist)
  • Share via GitHub Gist
  • GraphRAG local knowledge graph
  • Cloud sync (opt-in, encrypted)
  • Team sharing

Contributing

git clone https://github.com/WozGeek/BettaAura.git
cd BettaAura
pip install -e ".[dev]"
pytest

Good first contributions: new export formats (Windsurf, Copilot, AGENTS.md), new importers (Gemini), template packs, docs. See CONTRIBUTING.md.

License

MIT — © Enoch Afanwoubo
