
scion

Turn a coding-AI chat session (Claude Code or Codex CLI) into a deployable, continuously-learning subject-matter-expert agent that any MCP-aware host can talk to.

scion extracts everything from a Claude Code or Codex CLI session — every message, every tool call, every memory file, every skill, every config — distills it into a portable, git-committable Agent Bundle, and serves the bundle as a stdio MCP server. Your colleague's Claude Code or Codex CLI calls /mcp__scion-<name>__ask (or the query tool) and gets a grounded answer. The bundle learns from every interaction via a nightly dream loop that consolidates new episodes into memory, gated by an evaluation ratchet so the agent cannot silently degrade.
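The eval-ratchet idea can be sketched in a few lines: a consolidated candidate memory replaces the current one only if it scores at least as well on a fixed eval set. This is a hypothetical illustration of the gating logic, not scion's actual API; all names (`run_evals`, `ratchet`, the `notes` field) are invented for the example.

```python
# Hypothetical sketch of an evaluation ratchet: a dream-loop consolidation
# is accepted only if the candidate memory scores at least as well as the
# current memory on a fixed eval set. Names are illustrative, not scion's API.

def run_evals(memory: dict, eval_set: list[tuple[str, str]]) -> float:
    """Toy scorer: fraction of questions whose expected answer
    appears somewhere in the memory's notes."""
    hits = sum(
        1 for _question, expected in eval_set
        if any(expected in note for note in memory.get("notes", []))
    )
    return hits / len(eval_set)

def ratchet(current: dict, candidate: dict, eval_set) -> dict:
    """Accept the consolidated memory only if it does not degrade."""
    if run_evals(candidate, eval_set) >= run_evals(current, eval_set):
        return candidate   # ratchet clicks forward
    return current         # silent degradation blocked

evals = [("how do refunds work?", "refund")]
old = {"notes": ["refunds are issued via Stripe"]}
new = {"notes": ["refund policy: partial refunds allowed", "new episode"]}
print(ratchet(old, new, evals) is new)  # True: candidate kept eval coverage
```

The key property is monotonicity: a candidate that loses coverage on the eval set is rejected outright, so the served bundle can only stay level or improve on the metrics the ratchet tracks.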

Status

Pre-alpha. Building toward v0.1.0 (Phases 1–4 of the PRD).

Phase What State
0 Foundation, governance, context docs in progress
1 Claude Code extractor + bundle + CLI + LLMClient skeleton planned
2 Stdio FastMCP server + distill + memory tool wiring planned
3 Episode writeback + nightly dream loop with eval ratchet planned
4 Codex CLI extractor (closes v0.1.0) planned

Three-command quickstart (target — not yet runnable)

uv tool install scion
scion init demo
scion demo                  # full pipeline against synthetic fixture, no API key

Once Phase 1 ships, replace the third line with:

scion extract claude-code --cwd ~/proj/recon --output ./bundles/recon-sme
scion install claude-code ./bundles/recon-sme
# Restart Claude Code → /mcp__scion-recon-sme__ask "how do partial refunds work?"
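Under the hood, registering a stdio MCP server in Claude Code comes down to an `mcpServers` entry in a config file such as a project-level `.mcp.json`. The sketch below shows the general shape of such an entry; the `serve` subcommand and bundle path are assumptions about what `scion install` might write, not confirmed behavior.

```python
# Sketch of an mcpServers entry for a stdio MCP server, as Claude Code
# reads it from .mcp.json. The "serve" subcommand and the bundle path
# are hypothetical; only the mcpServers shape is standard.
import json

entry = {
    "mcpServers": {
        "scion-recon-sme": {
            "command": "scion",
            "args": ["serve", "./bundles/recon-sme"],  # hypothetical subcommand
        }
    }
}
print(json.dumps(entry, indent=2))
```

Because the server speaks stdio, the host simply spawns the `command` with `args` and exchanges MCP messages over the process's stdin/stdout; no daemon or port is involved.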

How it compares

| Capability | scion | mem0 | Letta / MemGPT | Zep CE |
| --- | --- | --- | --- | --- |
| Local-first, no daemon | ✅ | hosted | daemon | needs Neo4j |
| Git-committable bundles | ✅ | — | — | — |
| Continuous learning loop | eval ratchet (Karpathy) | LLM-extracted facts | tiered memory (managed) | temporal graph |
| Memory shape | files (markdown) | facts (KV + vector) | tiered store | knowledge graph |
| Provider portability | LiteLLM (any) | OpenAI-default | OpenAI-default | provider-agnostic |
| Plugin extractor surface | ✅ | n/a | n/a | n/a |
| Source-tool side effects | none (read-only) | n/a | n/a | n/a |

Supported on

Linux + macOS, Python 3.11/3.12/3.13. Windows is out of scope for v0.1.0.

License

Apache-2.0.

Contributing

See CONTRIBUTING.md. DCO sign-off required (git commit -s); a hook is provided.

Download files


Source Distribution

scion-0.2.0.tar.gz (140.3 kB)


Built Distribution


scion-0.2.0-py3-none-any.whl (99.6 kB)


File details

Details for the file scion-0.2.0.tar.gz.

File metadata

  • Download URL: scion-0.2.0.tar.gz
  • Size: 140.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for scion-0.2.0.tar.gz
Algorithm Hash digest
SHA256 d713ec22c277b814c5a1860474c3770d923fdbf47889c2ff4ced275004b347da
MD5 d1b1103b7022cacc9ca9ce694452042e
BLAKE2b-256 56754c2f09a44fcac1959b36f2bcec3c5cc16b3c84a996c19cf89b978dae8e68


Provenance

The following attestation bundles were made for scion-0.2.0.tar.gz:

Publisher: publish.yml on Tejas7/scion

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file scion-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: scion-0.2.0-py3-none-any.whl
  • Size: 99.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for scion-0.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 b7ef38d6a4270a040ec4cf9e38456ce3116e884aba5821ac5bcfd99c7093cfb3
MD5 e5960b129b22d94b60514537b5fbc285
BLAKE2b-256 1867592c2d8629c6fc9f7d076bfa0552d76e1701b1d010fa0833a14122ab5d89


Provenance

The following attestation bundles were made for scion-0.2.0-py3-none-any.whl:

Publisher: publish.yml on Tejas7/scion

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
