Bolt-on correction primitive for AI coding agents.
Status: alpha (v0.1.0). Core engine, Claude Code adapter, and 5 starter scars are working. Demo GIF, PyPI release, and Codex adapter are on the roadmap. Read CHANGELOG.md for the current state.
Why this exists
A junior engineer reads the textbooks and learns the fundamentals — that is the floor. What turns the junior into a senior is the weight that mistakes leave behind: the migration that ran half-applied in production, the timezone bug that shipped to a customer, the build that broke at 2am. Those scars become heavier than any chapter of the book; they bend future decisions in a way pure knowledge cannot.
AI coding agents come into your project with a strong prior — billions of tokens of training, especially on code. But the way your assistant behaves on your codebase is not just that prior; it is shaped by every correction you make along the way. The catch is that those corrections rarely survive: the next session starts from training again, and the model regresses to its statistical default in any area where the correction carries less weight than the prior. A functional scar is the anchor that gives your correction enough weight to bend the next decision.
What is a Functional Scar?
A scar is what an operator's correction becomes when you make it deterministic. Not text presented to the model — code that runs outside the model, intercepts the moment of risk, and pushes back.
| | System prompt | Memory / KB | Hook | Functional Scar |
|---|---|---|---|---|
| Where does the rule live? | In context | In context | In code outside the model | In code outside the model |
| Does the model decide whether it applies? | Yes | Yes | No | No |
| Does it survive `/compact`? | Partial | Yes | Yes | Yes |
| Does it learn from its own fires? | No | No | Manual | Yes |
| Built directly from a real correction? | No | No | Manual | Yes — by design |
Functional Scars complement memory and skills; they do not compete with them. The companion paper Lucy Syndrome in LLM Agents explains the underlying framework — five invariants that distinguish corrections that persist from those that decay.
This repository is the first installable implementation of those invariants.
Quick start
```bash
pip install fscars   # PyPI release pending — for now: pip install -e .
cd your-project
fscar init           # creates .fscars/ + wires Claude Code
fscar list           # 5 starter scars come pre-installed
```
Three quick wins to try right away:
```bash
# 1) Web dev — kill timezone regressions in handler code
fscar list | grep utc-timestamps

# 2) Data science — require explicit UTF-8 in pandas.read_csv
fscar list | grep csv-encoding

# 3) Marketing copy — block "we don't do X" framing
fscar list | grep avoid-negative-framing
```
Once installed, every Claude Code tool call passes through the engine. When a scar matches, the engine emits an `additionalContext` reminder (or blocks the call when the scar's severity is `block`) and writes one JSON line to `.fscars/logs/fires.jsonl`.
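If you want to poke at that log outside the CLI, a few lines of Python are enough. The sketch below tallies fires per scar; note that `scar_id` is an assumed field name for the log records, not something documented here — check `fscar log` output for the real schema.

```python
import json
from collections import Counter
from pathlib import Path

# Tally fires per scar straight from the JSONL log.
# NOTE: "scar_id" is an assumed field name; inspect your own
# .fscars/logs/fires.jsonl (or `fscar log`) for the actual keys.
counts = Counter()
for line in Path(".fscars/logs/fires.jsonl").read_text().splitlines():
    if line.strip():
        record = json.loads(line)
        counts[record.get("scar_id", "<unknown>")] += 1

for scar_id, n in counts.most_common():
    print(f"{scar_id}: {n}")
```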
Commands
| Command | Description |
|---|---|
| `fscar init` | Initialize `.fscars/` and register the hook entrypoint |
| `fscar list` | Show registered scars + fire counts |
| `fscar log [-n N]` | Show the most recent fires (filter by `--scar`, `--session`) |
| `fscar stats` | Compute fire counts, latency p50/p99, tokens added |
| `fscar disable <scar_id>` | Disable without deleting (use `--enable` to restore) |
| `fscar doctor` | Diagnose installation and hook wiring |
| `fscar --version` | Print the installed version |
The hook entrypoint is `python -m fscars.run_hook`: a single command across every event type, with no per-scar hook scripts.
How it works
```
┌─────────────────────────────────────────────────────────┐
│                       fscars.core                       │
│ payload · scar · engine · log · store · fire (Pydantic) │
└────────────────────────────┬────────────────────────────┘
                             │
              ┌──────────────┴──────────────┐
              │      fscars.adapters/       │
              │     claude_code (v0.1)      │
              │     codex (roadmap)         │
              │     cursor (community)      │
              └──────────────┬──────────────┘
                             │
                             ▼
     .claude/settings.json wired with one entrypoint:
                python -m fscars.run_hook
```
The engine reads stdin, parses the payload through the right adapter, dispatches to every matching scar, and emits the combined `additionalContext` plus an exit code. A failure inside any scar is swallowed — the host harness must never crash because of fscars.
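Conceptually, the dispatch loop looks something like the sketch below. This is not the actual `fscars.core` code: `adapter.parse`, `scar.check`, the `fire` object, and the exit-code convention are all assumed names used only to illustrate the flow described above.

```python
import json
import sys

def run_hook(adapter, scars, log):
    """Conceptual sketch of the hook flow, not the real implementation.

    Assumed names: adapter.parse(), scar.check(), fire.message/severity,
    log.append(), and the exit-code convention for blocking.
    """
    raw = json.load(sys.stdin)            # platform-specific hook payload
    payload = adapter.parse(raw)          # canonical HookPayload
    reminders, blocked = [], False

    for scar in scars:
        try:
            fire = scar.check(payload)    # None when the scar does not match
        except Exception:
            continue                      # a broken scar must never crash the host
        if fire is None:
            continue
        log.append(fire)                  # one JSON line in fires.jsonl
        reminders.append(fire.message)
        blocked = blocked or fire.severity == "block"

    if reminders:
        print(json.dumps({"additionalContext": "\n".join(reminders)}))
    sys.exit(2 if blocked else 0)         # blocking exit code is an assumption
```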
Cookbook
`cookbook/scars/` ships starter scars you can use directly or copy-paste:

| File | What it does |
|---|---|
| `large_write_review.py` | Reminds the operator to self-review writes over 200 lines |
| `utc_timestamps.py` | Pushes back on `time.Now()` / `new Date()` in handler files |
| `csv_encoding.py` | Requires explicit `encoding="utf-8"` in `pandas.read_csv` |
| `avoid_negative_framing.py` | Blocks "we don't do X" patterns in marketing copy |
| `subagent_coverage_report.py` | Reminds the operator to ask subagents for a coverage report |
| `_template.py` | Copy-paste starting point for new scars |

See `cookbook/scars/README.md` for the contract and the 5-invariant checklist.
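To give a feel for the shape of a scar, here is a minimal sketch in the spirit of `utc_timestamps.py`. The `check` function name, the payload fields, and the returned dict are assumptions made for illustration; the authoritative contract is in `cookbook/scars/README.md` and `_template.py`.

```python
import re

# Minimal sketch of a scar in the spirit of utc_timestamps.py.
# NOTE: the check() signature, payload fields, and returned dict are assumed
# for illustration only; follow _template.py for the real contract.
NAIVE_TIME = re.compile(r"time\.Now\(\)|new Date\(\)")

def check(payload: dict):
    """Push back on naive timestamps written into handler files."""
    if payload.get("tool") not in {"Write", "Edit"}:
        return None
    path = payload.get("file_path", "")
    content = payload.get("content", "")
    if "handler" in path and NAIVE_TIME.search(content):
        return {
            "severity": "remind",   # a "block" severity would reject the call outright
            "message": "Handlers must use explicit UTC timestamps; "
                       "avoid time.Now() / new Date() without a timezone.",
        }
    return None
```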
When NOT to use fscars
A scar only works when the correction satisfies the five invariants. If your fix is:
- Subjective ("I prefer tabs over spaces") — use `.editorconfig` or a linter.
- Proportional ("use async when it makes sense") — leave it to the model's judgment.
- One-off (the case has not repeated) — wait for the second occurrence first.
- Non-binary (cannot be checked deterministically) — keep it in your knowledge base.
These are the four cases the paper explicitly excludes. Adding a scar there creates noise without preventing anything.
Platforms
Currently supported:
- Claude Code (Anthropic) — full adapter, all event types
On the roadmap:
- Codex CLI (OpenAI) — the adapter API is public, awaiting hook stability upstream
- Cursor, Aider, Continue.dev — community adapters welcome
The core engine is platform-agnostic. Each adapter is a small glue layer (~300 LoC) that translates between platform-specific JSON shapes and the canonical `HookPayload`.
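As an illustration of how thin that glue layer can be, here is a sketch of a Claude Code-style translation. The `HookPayload` fields and the incoming JSON keys are assumptions chosen for illustration, not the actual adapter code.

```python
from dataclasses import dataclass, field
from typing import Any

# Sketch of an adapter glue layer. HookPayload fields and the incoming JSON
# keys are assumed for illustration; the real adapter lives in
# fscars/adapters/claude_code.py.
@dataclass
class HookPayload:
    event: str
    tool: str
    session_id: str
    tool_input: dict[str, Any] = field(default_factory=dict)

def parse_claude_code(raw: dict[str, Any]) -> HookPayload:
    """Translate a Claude Code hook JSON payload into the canonical shape."""
    return HookPayload(
        event=raw.get("hook_event_name", ""),
        tool=raw.get("tool_name", ""),
        session_id=raw.get("session_id", ""),
        tool_input=raw.get("tool_input", {}) or {},
    )
```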
The research behind this
Functional Scars is the reference implementation of the framework described in Lucy Syndrome in LLM Agents: A Practitioner Framework for Cross-Session Correction Persistence (Del Puerto, 2026). The paper analyzes 163 findings from 17 production session logs, identifies 5 persistence invariants, and proposes a 3-layer implementation model.
If you want the why, read the paper. If you want the how, you are in the right place.
The first follow-up essay, From Memory to Scar (May 2026), extends the four-layer progression, using Anthropic's Managed Agents Memory beta as a working example of Layer 3 industrialized.
Contributing
See CONTRIBUTING.md. New adapters and cookbook scars are especially welcome.
```bash
git clone https://github.com/Vdp89/fscars
cd fscars
pip install -e ".[dev]"
pytest -q
ruff check fscars cookbook tests
```
License
Apache 2.0 — see LICENSE.