# scion

Turn a coding-AI chat session (Claude Code or Codex CLI) into a deployable, continuously-learning subject-matter-expert agent that any MCP-aware host can talk to.
scion extracts everything from a Claude Code or Codex CLI session — every message, every tool call, every memory file, every skill, every config — distills it into a portable, git-committable Agent Bundle, and serves the bundle as a stdio MCP server. Your colleague's Claude Code or Codex CLI calls /mcp__scion-<name>__ask (or the query tool) and gets a grounded answer. The bundle learns from every interaction via a nightly dream loop that consolidates new episodes into memory, gated by an evaluation ratchet so the agent cannot silently degrade.
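The eval-ratchet idea above can be sketched in a few lines. This is a hypothetical illustration, not scion's implementation: `run_eval`, the keyword-match scoring, and all names are stand-ins (a real system would use an LLM-graded eval set), but the gating logic is the point — a consolidated memory candidate is accepted only if it scores at least as well as the current baseline.

```python
# Hypothetical sketch of an eval ratchet: the nightly dream loop proposes a
# consolidated memory, and the ratchet rejects it if the eval score drops.

def run_eval(memory: dict[str, str], eval_set: list[tuple[str, str]]) -> float:
    """Stand-in scorer: fraction of eval questions whose expected keyword
    appears somewhere in the memory files (a real eval would be LLM-graded)."""
    hits = sum(
        1 for _question, expected in eval_set
        if any(expected in text.lower() for text in memory.values())
    )
    return hits / len(eval_set)

def ratchet(memory, candidate, eval_set, baseline):
    """Accept the candidate memory only if it meets or beats the baseline."""
    score = run_eval(candidate, eval_set)
    if score >= baseline:
        return candidate, score   # ratchet holds or moves up
    return memory, baseline       # reject: the agent cannot silently degrade

eval_set = [("how do partial refunds work?", "refund")]
memory = {"billing.md": "Refund flow: partial refunds are prorated..."}
baseline = run_eval(memory, eval_set)          # 1.0 on this toy eval

bad_candidate = {"billing.md": "TODO"}         # would lose the refund fact
memory, baseline = ratchet(memory, bad_candidate, eval_set, baseline)
assert "refund" in memory["billing.md"].lower()  # bad update was rejected
```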
## Status

Pre-alpha. Building toward v0.1.0 (Phases 1–4 of the PRD).
| Phase | What | State |
|---|---|---|
| 0 | Foundation, governance, context docs | in progress |
| 1 | Claude Code extractor + bundle + CLI + LLMClient skeleton | planned |
| 2 | Stdio FastMCP server + distill + memory tool wiring | planned |
| 3 | Episode writeback + nightly dream loop with eval ratchet | planned |
| 4 | Codex CLI extractor (closes v0.1.0) | planned |
## Three-command quickstart (target, not yet runnable)

```bash
uv tool install scion
scion init demo
scion demo  # full pipeline against synthetic fixture, no API key
```
Once Phase 1 ships, replace the third line with:

```bash
scion extract claude-code --cwd ~/proj/recon --output ./bundles/recon-sme
scion install claude-code ./bundles/recon-sme
# Restart Claude Code → /mcp__scion-recon-sme__ask "how do partial refunds work?"
```
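For the restart step to pick the server up, the host needs a stdio server entry for the bundle. Assuming Claude Code's project-level `.mcp.json` format, an entry like the following is plausible — the `serve` subcommand and args are illustrative assumptions, and `scion install` is expected to write this for you:

```json
{
  "mcpServers": {
    "scion-recon-sme": {
      "command": "scion",
      "args": ["serve", "./bundles/recon-sme"]
    }
  }
}
```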
## How it compares

| | scion | mem0 | Letta / MemGPT | Zep CE |
|---|---|---|---|---|
| Local-first, no daemon | ✓ | hosted | daemon | needs Neo4j |
| Git-committable bundles | ✓ | ✗ | ✗ | ✗ |
| Continuous learning loop | eval ratchet (Karpathy) | LLM-extracted facts | tiered memory (managed) | temporal graph |
| Memory shape | files (markdown) | facts (KV+vector) | tiered store | knowledge graph |
| Provider portability | LiteLLM (any) | OpenAI-default | OpenAI-default | provider-agnostic |
| Plugin extractor surface | ✓ | n/a | n/a | n/a |
| Source-tool side effects | none (read-only) | n/a | n/a | n/a |
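"Git-committable bundles" with "files (markdown)" memory implies the bundle is just a directory you can diff and review. A hypothetical layout — every file and directory name below is illustrative, not the actual on-disk format:

```text
bundles/recon-sme/
├── bundle.toml      # name, source tool, model config, eval baseline
├── memory/          # distilled markdown memory files
│   ├── overview.md
│   └── refunds.md
├── skills/          # skills captured from the source session
├── episodes/        # raw interaction episodes awaiting consolidation
└── evals/           # fixed question set used by the nightly ratchet
```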
## Supported on

Linux and macOS, Python 3.11/3.12/3.13. Windows is out of scope for v0.1.0.
## License

## Contributing

See CONTRIBUTING.md. DCO sign-off required (`git commit -s`); a hook is provided.
## Download files

Source distribution and built distribution are available; details for each follow.
## File details

Details for the file `scion-0.2.1.tar.gz`.

### File metadata

- Download URL: scion-0.2.1.tar.gz
- Upload date:
- Size: 142.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 8b4d74576c5f8189f9153757bf5741fcc860810e159b2bcda992b96d718b9944 |
| MD5 | 6cd6b0988825d84de517be7444b7f28f |
| BLAKE2b-256 | 285143a0df6ba59fee9d8201fd07ddc16eb4773f44336ec658ab75e4f02de524 |
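A downloaded artifact can be checked against the published SHA256 digest with the standard library. The helper below is a generic sketch; it is demonstrated on a stand-in file so the snippet runs as-is — point `path` at the real `scion-0.2.1.tar.gz` and compare against the digest from the table above.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in file so the example is self-contained; for the real check, use
# path = Path("scion-0.2.1.tar.gz") and compare to the published digest.
path = Path("example.bin")
path.write_bytes(b"not the real sdist")
digest = sha256_of(path)
assert digest == hashlib.sha256(b"not the real sdist").hexdigest()
path.unlink()  # clean up the stand-in file
```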
### Provenance

The following attestation bundles were made for `scion-0.2.1.tar.gz`:

Publisher: publish.yml on Tejas7/scion

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: scion-0.2.1.tar.gz
- Subject digest: 8b4d74576c5f8189f9153757bf5741fcc860810e159b2bcda992b96d718b9944
- Sigstore transparency entry: 1440013165
- Sigstore integration time:
- Permalink: Tejas7/scion@b00322fca38e89ef7f7687dcc7c54c78681b40bf
- Branch / Tag: refs/tags/v0.2.1
- Owner: https://github.com/Tejas7
- Access: private
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@b00322fca38e89ef7f7687dcc7c54c78681b40bf
- Trigger Event: push
## File details

Details for the file `scion-0.2.1-py3-none-any.whl`.

### File metadata

- Download URL: scion-0.2.1-py3-none-any.whl
- Upload date:
- Size: 100.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 4354d47e19c576559b12afa4a3e254778337bac3cb1379860de0b07c0c4eb148 |
| MD5 | e030463b8ebccfceb72f07ef2974c457 |
| BLAKE2b-256 | 2328f381efe3b41f5ec138b723f675f4a269cad1d1578f36fe1d1ed778b14201 |
### Provenance

The following attestation bundles were made for `scion-0.2.1-py3-none-any.whl`:

Publisher: publish.yml on Tejas7/scion

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: scion-0.2.1-py3-none-any.whl
- Subject digest: 4354d47e19c576559b12afa4a3e254778337bac3cb1379860de0b07c0c4eb148
- Sigstore transparency entry: 1440013173
- Sigstore integration time:
- Permalink: Tejas7/scion@b00322fca38e89ef7f7687dcc7c54c78681b40bf
- Branch / Tag: refs/tags/v0.2.1
- Owner: https://github.com/Tejas7
- Access: private
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@b00322fca38e89ef7f7687dcc7c54c78681b40bf
- Trigger Event: push