
Time Machine for AI Agents — Cognitive Version Control for LLM context

Project description

Time Machine

🧠 Cognitive Version Control

Git for the AI Mind

Save. Branch. Rewind. Merge. — Your AI agent just got an undo button.


pip install tm-ai



🤖 Agent CLI · ✨ Features · 🚀 Quick Start · 📖 CLI Reference · 🤝 Contributing




Your AI coding agent is brilliant — for about 20 minutes.

Then it forgets what it already fixed, contradicts its own plan, and loops on the same error for eternity.

Sound familiar?



🧠 What Is This?

CVC gives AI coding agents something they've never had: memory management that actually works.

It's Git, but for the AI's brain. Instead of versioning source code, CVC versions the agent's entire context — every thought, every decision, every conversation turn — as an immutable, cryptographic Merkle DAG.

The agent can checkpoint its reasoning, branch into risky experiments, rewind when stuck, and merge back only the insights that matter.


| 💾 Save | 🌿 Branch | 🔀 Merge | ⏪ Rewind |
|---|---|---|---|
| Checkpoint the agent's brain at any stable moment. | Explore risky ideas in isolation. Main context stays clean. | Merge learnings back — not raw logs. Semantic, not syntactic. | Stuck in a loop? Time-travel back instantly. |
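The "immutable, cryptographic Merkle DAG" above can be pictured in a few lines of Python. This is a toy sketch of the idea, not CVC's actual data model: each commit is addressed by a hash of its parent pointer plus its content, so branching is just committing twice from the same parent, and rewinding is just moving HEAD to an earlier hash.

```python
import hashlib
import json

def commit_hash(parent, content):
    """Merkle-style id: identical (parent, content) pairs hash identically."""
    payload = json.dumps({"parent": parent, "content": content}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()[:12]

dag = {}  # hash -> commit record

def commit(parent, content):
    h = commit_hash(parent, content)
    dag[h] = {"parent": parent, "content": content}
    return h

c1 = commit(None, {"turn": 1, "note": "analysed the bug"})
c2 = commit(c1, {"turn": 2, "note": "proposed fix A"})
exp = commit(c1, {"turn": 2, "note": "risky refactor"})  # a branch off c1

head = c1  # "rewind" is nothing more than moving HEAD back
assert dag[c2]["parent"] == dag[exp]["parent"] == c1
```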


🤖 CVC Agent — Your Own AI Coding Assistant

Claude Code on steroids — with Time Machine built in.

Just type cvc and you're in. No setup menus, no extra commands.


cvc

That's it. One command. The agent launches.


CVC ships with a full agentic coding assistant directly in your terminal — like Claude Code, but with the ability to save, branch, rewind, and search through your entire conversation history. It's not just an AI chat — it's an AI with cognitive version control.


🔧 15 Built-in Tools

The agent has access to powerful tools that let it work directly on your codebase:

| Icon | Tool | What It Does |
|---|---|---|
| 📖 | read_file | Read files, with optional line ranges for large files |
| ✍️ | write_file | Create or overwrite files; auto-creates directories |
| 🔧 | edit_file | Precise find-and-replace edits with uniqueness validation |
| 🖥️ | bash | Run shell commands (PowerShell on Windows, bash on Unix) |
| 🔍 | glob | Find files by pattern (**/*.py, src/**/*.ts) |
| 📝 | grep | Search file contents with regex + include filters |
| 📁 | list_dir | List directory contents to explore project structure |
| 📊 | cvc_status | Show current branch, HEAD, and context state |
| 📜 | cvc_log | View commit history — snapshots of the conversation |
| 💾 | cvc_commit | Save a checkpoint of the current conversation state |
| 🌿 | cvc_branch | Create a branch to explore alternatives safely |
| ⏪ | cvc_restore | Time-travel back to any previous conversation state |
| 🔀 | cvc_merge | Merge insights from one branch into another |
| 🔎 | cvc_search | Search commit history for specific topics or discussions |
| 📋 | cvc_diff | Compare conversation states between commits |
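As a point of reference, a tool like read_file is typically declared to the model in OpenAI function-calling schema (the format tm-ai's agent module uses, per the Architecture section). The parameter names below (path, start_line, end_line) are illustrative assumptions, not tm-ai's actual definitions:

```python
# Hypothetical sketch of one tool declaration in OpenAI function-calling
# schema; the real parameter names in tm-ai may differ.
read_file_tool = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read a file, optionally restricted to a line range.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "File to read"},
                "start_line": {"type": "integer", "description": "1-based start line (optional)"},
                "end_line": {"type": "integer", "description": "1-based end line (optional)"},
            },
            "required": ["path"],
        },
    },
}
```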

โŒจ๏ธ Slash Commands

While chatting with the agent, use these slash commands for quick actions:

Command Description
/helpShow all available slash commands
/statusView branch, HEAD, context size, provider & model
/logShow last 20 conversation checkpoints
/commit <message>Save a manual checkpoint of the conversation
/branch <name>Create and switch to a new conversation branch
/restore <hash>Time-travel back to a specific checkpoint
/search <query>Search all commits for a topic (e.g., /search auth login)
/compactCompress the conversation history, keeping recent context
/clearClear conversation history (CVC state preserved)
/model <name>Switch LLM model mid-conversation
/exitSave final checkpoint and exit cleanly
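Slash dispatch is just splitting a prefixed line into a command and its arguments. A minimal sketch of that step (illustrative, not the actual implementation):

```python
def parse_slash(line):
    """Split '/commit fixed auth bug' into ('commit', 'fixed auth bug').
    Returns None for ordinary chat input."""
    if not line.startswith("/"):
        return None
    cmd, _, rest = line[1:].partition(" ")
    return cmd, rest.strip()

assert parse_slash("/commit fixed auth bug") == ("commit", "fixed auth bug")
assert parse_slash("hello there") is None
```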

🧠 What Makes It Different

| Claude Code / Codex | Aider / Cursor | 🔥 CVC Agent |
|---|---|---|
| Great tools | Great IDE integration | All of their tools, plus: |
| No memory across sessions | Session history is linear | ⏪ Time-travel to any point |
| No branching / rollback | No context time-travel | 🌿 Branch conversations |
| Single provider only | Can't search past chats | 🔎 Search across all history |
| Context lost on crash | Provider-locked | 🔀 Merge insights |
| | | 🤖 4 providers supported |
| | | 💾 Auto-checkpoint every 5 turns |
| | | 📱 Session persistence |

🎨 Agent Options

cvc                                     # Launch agent with saved config
cvc agent                               # Same thing — explicit subcommand
cvc agent --provider anthropic          # Force a specific provider
cvc agent --model claude-sonnet-4-5     # Override the model
cvc agent --api-key sk-ant-...          # Pass API key directly

🔄 Auto-Commit

The agent automatically saves checkpoints every 5 assistant turns (CVC_AGENT_AUTO_COMMIT=5). When you exit with /exit, a final checkpoint is saved. You never lose context.
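The cadence is simple to picture. A toy sketch, assuming a plain assistant-turn counter (not the actual implementation):

```python
def should_autocommit(turn, interval=5):
    """Checkpoint after every `interval` assistant turns
    (the CVC_AGENT_AUTO_COMMIT setting)."""
    return turn > 0 and turn % interval == 0

# With the default interval of 5, turns 5 and 10 trigger a checkpoint.
assert [t for t in range(1, 12) if should_autocommit(t)] == [5, 10]
```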



🔥 The Problem We're Solving

The industry keeps making context windows bigger — 4K → 32K → 128K → 1M+ tokens.

It's not progress.

Research shows that after ~60% context utilisation, LLM reasoning quality falls off a cliff. One hallucination poisons everything that follows. Error cascades compound. The agent starts fighting itself.

A bigger window doesn't fix context rot. It just gives it more room to spread.

The Real Issue

AI agents have zero ability to manage their own cognitive state. They can't save their work. They can't explore safely. They can't undo mistakes. They're solving a 500-piece puzzle while someone keeps removing pieces from the table.


📊 What the Research Shows

| Result | Source |
|---|---|
| 58.1% context reduction via branching | ContextBranch paper |
| 3.5× success rate improvement with rollback | GCC paper |
| ~90% cost reduction through caching | Prompt caching |
| ~85% latency reduction | Cached tokens skip processing |


โš™๏ธ How It Works

CVC operates in two modes: as a standalone agent (just type cvc) or as a proxy between your favourite AI tool and the LLM provider.


%%{init: {'theme': 'dark', 'themeVariables': { 'fontSize': '14px', 'primaryColor': '#2C0000', 'primaryTextColor': '#E8D0D0', 'primaryBorderColor': '#8B0000', 'lineColor': '#CC3333', 'secondaryColor': '#5C1010', 'tertiaryColor': '#3D0000', 'edgeLabelBackground': '#2C0000'}}}%%

flowchart LR
    subgraph LOCAL["🖥️  YOUR MACHINE"]
        direction TB

        subgraph AGENT_MODE["  CVC Agent (cvc)  "]
            AG["🤖 Terminal Agent\n15 tools · 4 providers"]
        end

        subgraph PROXY_MODE["  CVC Proxy · localhost:8000  "]
            direction TB
            R["⚡ LangGraph Router"]
            R -->|CVC ops| E["🧠 Cognitive Engine"]
            R -->|passthrough| FWD["📡 Forward to LLM"]
        end

        subgraph STORAGE["  .cvc/ directory  "]
            direction LR
            S1["🗄️ SQLite\nCommit Graph"]
            S2["📦 CAS Blobs\nZstandard"]
            S3["🔍 Chroma\nVectors"]
        end

        AG -- "direct API" --> CLOUD
        AG --> E
        R -- "HTTP" --> CLOUD
        E --> S1 & S2 & S3
    end

    subgraph CLOUD["  ☁️ LLM Provider  "]
        direction TB
        C1["Claude"]
        C2["GPT-5.2"]
        C3["Gemini 3 Pro"]
        C4["Ollama 🏠"]
    end

    style LOCAL fill:#1a0505,stroke:#8B0000,stroke-width:2px,color:#E8D0D0
    style AGENT_MODE fill:#2C0000,stroke:#CC3333,stroke-width:2px,color:#E8D0D0
    style PROXY_MODE fill:#2C0000,stroke:#8B0000,stroke-width:1px,color:#E8D0D0
    style STORAGE fill:#1a0505,stroke:#5C1010,stroke-width:1px,color:#E8D0D0
    style CLOUD fill:#1a0505,stroke:#CC3333,stroke-width:2px,color:#E8D0D0
    style AG fill:#8B0000,stroke:#CC3333,color:#ffffff
    style R fill:#5C1010,stroke:#CC3333,color:#ffffff
    style E fill:#3D5C10,stroke:#55AA55,color:#ffffff
    style FWD fill:#5C1010,stroke:#CC3333,color:#ffffff
    style S1 fill:#2C0000,stroke:#BB8844,color:#E8D0D0
    style S2 fill:#2C0000,stroke:#BB8844,color:#E8D0D0
    style S3 fill:#2C0000,stroke:#BB8844,color:#E8D0D0
    style C1 fill:#8B0000,stroke:#CC3333,color:#ffffff
    style C2 fill:#5C1010,stroke:#CC3333,color:#ffffff
    style C3 fill:#3D5C10,stroke:#55AA55,color:#ffffff
    style C4 fill:#444444,stroke:#888888,color:#ffffff

🎯 Three-Tiered Storage (All Local)

| Tier | What | Why |
|---|---|---|
| 🗄️ SQLite | Commit graph, branch pointers, metadata | Fast traversal, zero-config, works everywhere |
| 📦 CAS Blobs | Compressed context snapshots (Zstandard) | Content-addressable, deduplicated, efficient |
| 🔍 Chroma | Semantic embeddings (optional) | "Have I solved this before?" — search by meaning |

✨ Everything stays in .cvc/ inside your project 🔒 No cloud • No telemetry • Your agent's thoughts are yours
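The CAS tier can be sketched in a few lines: content is keyed by its own digest, so identical snapshots are stored exactly once. This toy uses zlib as a stand-in for the Zstandard compression CVC actually uses:

```python
import hashlib
import zlib  # stand-in for Zstandard in this sketch

store = {}  # content-addressable store: digest -> compressed bytes

def put(snapshot: bytes) -> str:
    digest = hashlib.sha256(snapshot).hexdigest()
    if digest not in store:          # identical content deduplicates for free
        store[digest] = zlib.compress(snapshot)
    return digest

def get(digest: str) -> bytes:
    return zlib.decompress(store[digest])

key = put(b"agent context snapshot")
put(b"agent context snapshot")       # second write is a no-op
assert get(key) == b"agent context snapshot"
assert len(store) == 1
```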



🚀 Get Started


Prerequisites

Python 3.11+ • Git (for VCS bridge features)


📦 Install

Available on PyPI — install with one command, no cloning required.

pip install tm-ai

That's it. The cvc command is now available globally.

🔧 More install options

# With uv (faster)
uv pip install tm-ai

# As an isolated uv tool (always on PATH, no venv needed)
uv tool install tm-ai

# With provider extras
pip install "tm-ai[anthropic]"     # Anthropic (Claude)
pip install "tm-ai[openai]"        # OpenAI (GPT)
pip install "tm-ai[google]"        # Google (Gemini)
pip install "tm-ai[all]"           # Everything

🛠️ For contributors / local development

git clone https://github.com/mannuking/AI-Cognitive-Version-Control.git
cd AI-Cognitive-Version-Control
uv sync --extra dev           # or: pip install -e ".[dev]"

โ–ถ๏ธ Run

The simplest way โ€” just type cvc:

cvc

This launches the CVC Agent directly. If it's your first time, you'll be guided through setup (pick your provider, model, and API key).

Or use specific commands:

# Launch the agent explicitly
cvc agent
cvc agent --provider openai --model gpt-5.2

# Launch external AI tools through CVC's proxy
cvc launch claude          # Claude Code CLI
cvc launch aider           # Aider
cvc launch codex           # OpenAI Codex CLI
cvc launch cursor          # Cursor IDE
cvc launch code            # VS Code

# One-command start (setup + init + serve proxy)
cvc up

Cross-platform: Works on Windows, macOS, and Linux. Global config is stored in the platform-appropriate location:

  • Windows: %LOCALAPPDATA%\cvc\config.json
  • macOS: ~/Library/Application Support/cvc/config.json
  • Linux: ~/.config/cvc/config.json
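Resolving that location programmatically looks roughly like this (a sketch following the paths above, not tm-ai's actual code):

```python
import os
import sys
from pathlib import Path

def global_config_path() -> Path:
    """Platform-appropriate location for the global cvc config."""
    if sys.platform == "win32":
        base = Path(os.environ.get("LOCALAPPDATA", r"~\AppData\Local")).expanduser()
    elif sys.platform == "darwin":
        base = Path("~/Library/Application Support").expanduser()
    else:  # Linux and friends: honour XDG_CONFIG_HOME, default ~/.config
        base = Path(os.environ.get("XDG_CONFIG_HOME", "~/.config")).expanduser()
    return base / "cvc" / "config.json"

print(global_config_path())
```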

🔑 Set Your API Key

| Provider | Bash / Linux / macOS | PowerShell |
|---|---|---|
| Anthropic | export ANTHROPIC_API_KEY="sk-ant-..." | $env:ANTHROPIC_API_KEY = "sk-ant-..." |
| OpenAI | export OPENAI_API_KEY="sk-..." | $env:OPENAI_API_KEY = "sk-..." |
| Google | export GOOGLE_API_KEY="AIza..." | $env:GOOGLE_API_KEY = "AIza..." |
| Ollama | No key needed — just run ollama serve and ollama pull qwen2.5-coder:7b | |

Or save your keys via cvc setup — they're stored securely on your machine.


🔌 Connect External AI Tools (Proxy Mode)

If you prefer to use your own AI tool instead of the built-in agent, CVC runs as a transparent proxy that time-machines every conversation:

API-Based Tools (Proxy Mode)

Point your AI agent's API base URL to http://127.0.0.1:8000

CVC exposes OpenAI-compatible (/v1/chat/completions) AND Anthropic-native (/v1/messages) endpoints.
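For an OpenAI-compatible client, that means sending a standard chat payload to the proxy's base URL. The request below is built with the standard library but not sent; the model name is just an example, and the proxy must actually be running (cvc serve or cvc up) before you dispatch it:

```python
import json
from urllib.request import Request

BASE = "http://127.0.0.1:8000"  # the CVC proxy

# A standard OpenAI-style chat completion payload.
payload = {
    "model": "claude-sonnet-4-5",  # example model name
    "messages": [{"role": "user", "content": "Summarise the last refactor."}],
}

# Built but deliberately not sent here; pass `req` to urllib.request.urlopen
# once the proxy is up.
req = Request(
    f"{BASE}/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
```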

Auth-Based Tools (MCP Mode)

For IDEs that use login authentication (Antigravity, Windsurf, native Copilot), CVC runs as an MCP server:

cvc mcp                 # Start MCP server (stdio transport)
cvc mcp --transport sse # Start MCP server (HTTP/SSE transport)

| Tool | Auth Type | How to Connect |
|---|---|---|
| 💎 VS Code + Copilot | GitHub Login | BYOK: Ctrl+Shift+P → Manage Models → OpenAI Compatible, or MCP: cvc mcp |
| 🚀 Antigravity | Google Login | MCP only: add cvc in MCP settings → cvc mcp |
| 🖱️ Cursor | API Key Override | Settings → Models → Override Base URL → http://127.0.0.1:8000/v1 |
| 🏄 Windsurf | Account Login | MCP only: add cvc in Cascade MCP settings → cvc mcp |
| 🟠 Claude Code CLI | API Key | export ANTHROPIC_BASE_URL=http://127.0.0.1:8000 — native /v1/messages |
| ⌨️ Codex CLI | API Key | model_provider = "cvc" in ~/.codex/config.toml |
| 🔄 Continue.dev / 🤖 Cline | API Key | Base URL → http://127.0.0.1:8000/v1, API Key → cvc |
| 🛠️ Aider / 🌐 Open WebUI | API Key | Standard OpenAI-compatible endpoint |
| 🦜 LangChain / CrewAI / AutoGen | API Key | Use CVC's function-calling tools (GET /cvc/tools) |

Auth pass-through: When Claude Code or Codex CLI sends its own API key, CVC forwards it to the upstream provider. No need to store API keys in CVC for these tools.

Run cvc connect for interactive, tool-specific setup instructions.



📟 CLI Reference

| Command | Description |
|---|---|
| cvc | Launch the CVC Agent — interactive AI coding assistant |
| cvc agent | Same as above (explicit subcommand) |
| cvc agent --provider <p> | Agent with a specific provider (anthropic, openai, google, ollama) |
| cvc agent --model <m> | Agent with a model override |
| ──── Launch External Tools ──── | |
| cvc launch <tool> | Zero-config — auto-launch any AI tool through CVC |
| cvc up | One command: setup + init + serve proxy |
| ──── Setup & Configuration ──── | |
| cvc setup | Interactive setup wizard (choose provider & model) |
| cvc init | Initialize .cvc/ in your project |
| cvc serve | Start the Cognitive Proxy (API-based tools) |
| cvc mcp | Start MCP server (auth-based IDEs) |
| cvc connect | Interactive tool connection wizard |
| ──── Time Machine ──── | |
| cvc status | Show branch, HEAD, context size |
| cvc log | View commit history |
| cvc commit -m "message" | Create a cognitive checkpoint |
| cvc branch <name> | Create an exploration branch |
| cvc merge <branch> | Semantic merge into the active branch |
| cvc restore <hash> | Time-travel to a previous state |
| cvc sessions | View Time Machine session history |
| ──── Utilities ──── | |
| cvc install-hooks | Install Git ↔ CVC sync hooks |
| cvc capture-snapshot | Link the current Git commit to CVC state |
| cvc doctor | Health-check your environment |


🔗 Git Integration

CVC doesn't replace Git — it bridges with it.


| Feature | What It Does |
|---|---|
| 🌲 Shadow Branches | CVC state lives on cvc/main, keeping your main branch clean |
| 📝 Git Notes | Every git commit is annotated with the CVC hash — "What was the AI thinking when it wrote this?" |
| 🔄 post-commit hook | Auto-captures cognitive state after every git commit |
| ⏰ post-checkout hook | Auto-restores the agent's brain when you git checkout an old commit |

📜 When you check out an old version of your code, CVC automatically restores the agent's context to what it was when that code was written.

✨ True cognitive time-travel.
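Conceptually, the post-commit hook attaches the CVC hash to the new Git commit via git notes. A sketch of the command it would run; the notes ref name is an assumption here, and the real hook installed by cvc install-hooks may differ:

```python
# Build (but don't execute) the `git notes` invocation that annotates HEAD
# with a CVC state hash. The refs/notes/cvc ref name is illustrative.
def git_note_cmd(cvc_hash: str, ref: str = "refs/notes/cvc") -> list:
    return [
        "git", "notes", "--ref", ref,
        "add", "-f", "-m", f"cvc: {cvc_hash}",
        "HEAD",
    ]

print(" ".join(git_note_cmd("a1b2c3d4")))
```

In a real hook you would hand this list to subprocess.run inside the repository.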



โฑ๏ธ Time Machine Mode

Like macOS Time Machine, but for AI agent conversations.

Every conversation is automatically saved. Nothing is ever lost.


When you use cvc (the agent) or cvc launch, Time Machine mode is enabled by default:

Feature Description
Auto-commit Every 5 assistant turns (agent) or 3 turns (proxy), configurable
Session tracking Detects which tool is connected, tracks start/end, message counts
Smart messages Auto-commits include turn number and conversation summary
Zero friction Just cvc and go โ€” or cvc launch claude for external tools
Session persistence Context restored from CVC on next launch
# View session history
cvc sessions

# Customize auto-commit interval (agent)
CVC_AGENT_AUTO_COMMIT=3 cvc agent    # Commit every 3 turns

# Customize auto-commit interval (proxy)
CVC_TIME_MACHINE_INTERVAL=5 cvc up   # Commit every 5 turns

# Disable time machine for external tools
cvc launch claude --no-time-machine

Supported External Tools

| Tool | Launch Command | How It Connects |
|---|---|---|
| Claude Code CLI | cvc launch claude | Sets ANTHROPIC_BASE_URL → native /v1/messages |
| Aider | cvc launch aider | Sets OPENAI_API_BASE + model flag |
| OpenAI Codex CLI | cvc launch codex | Sets OPENAI_API_BASE |
| Gemini CLI | cvc launch gemini | Sets GEMINI_API_BASE_URL |
| Kiro CLI | cvc launch kiro | Sets OPENAI_API_BASE |
| Cursor | cvc launch cursor | Writes .cursor/mcp.json + opens the IDE |
| VS Code | cvc launch code | Writes .vscode/mcp.json + configures BYOK |
| Windsurf | cvc launch windsurf | Writes MCP config + opens the IDE |


⚡ Why It's Cheap

CVC structures prompts so committed history becomes a cacheable prefix. When you rewind to a checkpoint, the model doesn't reprocess anything it has already seen.


| Metric | ❌ Without CVC | ✅ With CVC |
|---|---|---|
| 💰 Cost per restore | Full price | ~90% cheaper |
| ⚡ Latency per restore | Full processing | ~85% faster |
| 🔄 Checkpoint frequency | Impractical | Economically viable |

🔥 Works today with Anthropic, OpenAI, Google Gemini, and Ollama. 💡 Prompt caching optimised per provider.
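A back-of-the-envelope model shows why the cacheable prefix matters. Assume cached prefix tokens cost about 10% of fresh tokens (a stand-in consistent with the ~90% figure above; real provider pricing varies):

```python
def restore_cost(prefix_tokens, new_tokens, cached,
                 price=1.0, cache_discount=0.1):
    """Toy cost model: cached prefix tokens are billed at a discount,
    new tokens at full price (units are arbitrary)."""
    prefix_rate = price * cache_discount if cached else price
    return prefix_tokens * prefix_rate + new_tokens * price

# Restoring a 100k-token checkpoint plus 2k fresh tokens:
without = restore_cost(100_000, 2_000, cached=False)  # reprocess everything
with_cvc = restore_cost(100_000, 2_000, cached=True)  # prefix served from cache
assert with_cvc < without  # roughly 8x cheaper in this toy model
```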



🤖 Supported Providers

Pick your provider. CVC handles the rest.


| Provider | Default Model | Alternatives | Notes |
|---|---|---|---|
| Anthropic | claude-opus-4-6 | claude-opus-4-5, claude-sonnet-4-5, claude-haiku-4-5 | Prompt caching with cache_control |
| OpenAI | gpt-5.2 | gpt-5.2-codex, gpt-5-mini, gpt-4.1 | Automatic prefix caching |
| Google | gemini-3-pro-preview | gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite | OpenAI-compatible endpoint |
| Ollama | qwen2.5-coder:7b | qwen3-coder:30b, devstral:24b, deepseek-r1:8b | 100% local, no API key needed |


โš™๏ธ Configuration

All via environment variables โ€” no config files to manage


Variable Default What It Does
CVC_AGENT_ID sofia Agent identifier
CVC_DEFAULT_BRANCH main Default branch
CVC_ANCHOR_INTERVAL 10 Full snapshot every N commits (others are delta-compressed)
CVC_PROVIDER anthropic LLM provider
CVC_MODEL auto Model name (auto-detected per provider)
CVC_AGENT_AUTO_COMMIT 5 Agent auto-checkpoint interval (turns)
CVC_TIME_MACHINE_INTERVAL 3 Proxy auto-commit interval (turns)
ANTHROPIC_API_KEY โ€” Required for anthropic provider
OPENAI_API_KEY โ€” Required for openai provider
GOOGLE_API_KEY โ€” Required for google provider
CVC_HOST 127.0.0.1 Proxy host
CVC_PORT 8000 Proxy port
CVC_VECTOR_ENABLED false Enable semantic search (Chroma)
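Reading these variables with their documented defaults is ordinary environment lookup. A sketch of the pattern:

```python
import os

def env(name, default):
    """Environment variable with a fallback default."""
    return os.environ.get(name, default)

# The defaults below mirror the table above.
provider = env("CVC_PROVIDER", "anthropic")
auto_commit = int(env("CVC_AGENT_AUTO_COMMIT", "5"))
port = int(env("CVC_PORT", "8000"))
assert auto_commit > 0
```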


๐Ÿ—๏ธ Architecture


cvc/
โ”œโ”€โ”€ __init__.py            # Package root, version
โ”œโ”€โ”€ __main__.py            # python -m cvc entry point
โ”œโ”€โ”€ cli.py                 # Click CLI โ€” all commands, setup wizard, dark red theme
โ”œโ”€โ”€ proxy.py               # FastAPI proxy โ€” intercepts LLM API calls
โ”œโ”€โ”€ launcher.py            # Zero-config auto-launch for AI tools
โ”œโ”€โ”€ mcp_server.py          # Model Context Protocol server
โ”‚
โ”œโ”€โ”€ agent/                 # โ˜… Built-in AI coding agent (v0.6.0)
โ”‚   โ”œโ”€โ”€ __init__.py        # Exports run_agent()
โ”‚   โ”œโ”€โ”€ chat.py            # AgentSession REPL loop, slash commands, auto-commit
โ”‚   โ”œโ”€โ”€ llm.py             # Unified LLM client โ€” tool calling for all 4 providers
โ”‚   โ”œโ”€โ”€ tools.py           # 15 tool definitions in OpenAI function-calling schema
โ”‚   โ”œโ”€โ”€ executor.py        # Tool execution engine โ€” file ops, shell, CVC operations
โ”‚   โ”œโ”€โ”€ system_prompt.py   # Dynamic Claude Code-style system prompt builder
โ”‚   โ””โ”€โ”€ renderer.py        # Rich terminal rendering with #2C0000 dark red theme
โ”‚
โ”œโ”€โ”€ adapters/              # Provider-specific prompt formatting
โ”‚   โ”œโ”€โ”€ base.py            # Abstract BaseAdapter
โ”‚   โ”œโ”€โ”€ anthropic.py       # Anthropic adapter (prompt caching)
โ”‚   โ”œโ”€โ”€ openai.py          # OpenAI adapter
โ”‚   โ”œโ”€โ”€ google.py          # Google adapter
โ”‚   โ””โ”€โ”€ ollama.py          # Ollama adapter
โ”‚
โ”œโ”€โ”€ core/                  # Data layer
โ”‚   โ”œโ”€โ”€ models.py          # Pydantic schemas, config, Merkle DAG
โ”‚   โ””โ”€โ”€ database.py        # SQLite + CAS + Chroma storage
โ”‚
โ”œโ”€โ”€ operations/            # CVC engine
โ”‚   โ”œโ”€โ”€ engine.py          # Commit, branch, merge, restore
โ”‚   โ””โ”€โ”€ state_machine.py   # LangGraph command routing
โ”‚
โ””โ”€โ”€ vcs/                   # Git bridge
    โ””โ”€โ”€ bridge.py          # Shadow branches, Git notes, hooks


🎯 Who Is This For?


| 👤 Solo Developers | 🏢 Teams & Organizations | 🌍 Open Source |
|---|---|---|
| Your AI stops losing context mid-session. Explore multiple approaches. Undo mistakes. Never re-explain the same thing twice. | Review the AI's reasoning, not just its output. Cryptographic audit trails. Shared cognitive state across team members. Compliance-ready. | See how an AI-generated PR was produced. Inspect for hallucination patterns. Build project knowledge bases from commit embeddings. |



๐Ÿ—บ๏ธ Roadmap


Feature Status
๐Ÿค– Built-in Agent CLI โœ… Shipped in v0.6.0 โ€” 15 tools, 4 providers, slash commands
โ˜๏ธ Anthropic Adapter โœ… Claude Opus 4.6 / 4.5 / Sonnet / Haiku
โ˜๏ธ OpenAI Adapter โœ… GPT-5.2 / GPT-5.2-Codex / GPT-5-mini
โ˜๏ธ Google Gemini Adapter โœ… Gemini 3 Pro Preview / 2.5 Pro / 2.5 Flash
๐Ÿ  Ollama (Local) โœ… Qwen 2.5 Coder / Qwen 3 Coder / DeepSeek-R1 / Devstral
๐Ÿ”Œ MCP Server โœ… Native Model Context Protocol (stdio + SSE)
๐Ÿš€ Zero-config Launch โœ… cvc launch claude / aider / codex / cursor / etc.
๐Ÿ”— Git Bridge โœ… Shadow branches, Git notes, auto-hooks
๐ŸŽจ VS Code Extension ๐Ÿ”œ Visual commit graph and time-travel slider
๐Ÿ‘ฅ Multi-agent support ๐Ÿ”œ Shared CVC database with conflict resolution
โ˜๏ธ Cloud sync ๐Ÿ”œ S3/MinIO for team collaboration
๐Ÿ“Š Metrics dashboard ๐Ÿ”œ Cache hit rates, context utilisation, branch success rates


๐Ÿค Contributing

This repo is public and open to collaboration.

Whether you're fixing a typo or building an entirely new provider adapter โ€” contributions are welcome.


Fork โ†’ Branch โ†’ Commit โ†’ Push โ†’ PR


๐ŸŽฏ Areas Where Help Is Needed

Area Difficulty
๐Ÿ”Œ Additional Provider Adapters (Mistral, Cohere, etc.) ๐ŸŸก Medium
๐Ÿงช Tests & edge cases ๐ŸŸข Easyโ€“Medium
๐Ÿ–ฅ๏ธ VS Code Extension (commit graph visualisation) ๐Ÿ”ด Hard
๐Ÿ“Š Metrics & observability dashboard ๐ŸŸก Medium
๐Ÿ”’ Security audit ๐ŸŸ  Mediumโ€“Hard

๐Ÿ› ๏ธ Dev Setup

git clone https://github.com/YOUR_USERNAME/AI-Cognitive-Version-Control.git
cd AI-Cognitive-Version-Control
uv sync --extra dev


📚 Research

CVC is grounded in published research.


| Paper | Key Finding |
|---|---|
| ContextBranch | 58.1% context reduction via branching |
| GCC | 11.7% → 40.7% success rate with rollback |
| Merkle-CRDTs | Structural deduplication for DAGs |
| Prompt Caching | Anthropic/OpenAI/Google token reuse |


📜 License

MIT — see LICENSE





✨ Because AI agents deserve an undo button. ✨


โญ Star this repo if you believe in giving AI agents memory that actually works.




Made with โค๏ธ by developers who got tired of AI agents forgetting what they just did.


โญ Star ยท ๐Ÿ› Bug ยท ๐Ÿ’ก Feature ยท ๐Ÿ”€ PR

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tm_ai-1.1.5.tar.gz (588.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

tm_ai-1.1.5-py3-none-any.whl (137.1 kB)

Uploaded Python 3

File details

Details for the file tm_ai-1.1.5.tar.gz.

File metadata

  • Download URL: tm_ai-1.1.5.tar.gz
  • Upload date:
  • Size: 588.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.18

File hashes

Hashes for tm_ai-1.1.5.tar.gz
Algorithm Hash digest
SHA256 a7927a63a9ccf9078ea2ba78232ec10376717a8a33732d46ae20f625fb9f6478
MD5 19ae2eed8ad4c4f915a3438c3abb479c
BLAKE2b-256 409ffe64cc8646aa93331c93e1f1803005660c4d92f4858b34dfdd4f6af8b15d

See more details on using hashes here.

File details

Details for the file tm_ai-1.1.5-py3-none-any.whl.

File metadata

  • Download URL: tm_ai-1.1.5-py3-none-any.whl
  • Upload date:
  • Size: 137.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.18

File hashes

Hashes for tm_ai-1.1.5-py3-none-any.whl
Algorithm Hash digest
SHA256 c886f89f507af87bdc0f32d81cfd6e236bf712713cb0d8a20d84c32d38858142
MD5 63bfc445440e2766103cc466ceee04b7
BLAKE2b-256 e1db5a0a06fe5b7acd7351b93e8c9c1cc758197c8e1013915eab99d0c19ed7ea

See more details on using hashes here.
