
A Static + LLM Hybrid CLI for reverse-engineering legacy codebases.


🧠 Codetrace-ai

The Autonomous System Architect for your Terminal and IDE.

Codetrace-ai is a deeply integrated, privacy-first AI agent that understands your entire codebase.

Unlike standard AI coding assistants that rely on naive text chunking, Codetrace builds a "Hybrid Brain"—combining a semantic Vector Database (ChromaDB) with a structural Graph Database (SQLite + NetworkX) using Tree-sitter. It doesn't just read your code; it understands what calls what, who owns what, and what breaks if you change something.
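The hybrid retrieval flow can be sketched roughly as follows. This is a toy illustration, not Codetrace's actual internals: keyword overlap stands in for ChromaDB embeddings, and a plain dict stands in for the NetworkX call graph, so every function and name below is illustrative.

```python
# Toy "hybrid brain": semantic search surfaces candidate symbols,
# then the call graph expands each hit with its structural neighbors.
# Scoring and data structures are stand-ins for ChromaDB + NetworkX.

def semantic_search(query, docs, top_k=2):
    """Rank documents by naive keyword overlap (stand-in for embeddings)."""
    q = set(query.lower().split())
    ranked = sorted(docs, key=lambda name: -len(q & set(docs[name].lower().split())))
    return ranked[:top_k]

def hybrid_lookup(query, docs, call_graph, top_k=2):
    """Return semantic hits plus the functions they call (graph expansion)."""
    hits = semantic_search(query, docs, top_k)
    expanded = set(hits)
    for hit in hits:
        expanded.update(call_graph.get(hit, []))
    return hits, expanded

docs = {
    "load_user": "read user record from the database by id",
    "render_page": "render the html template for a page",
    "db_connect": "open a connection to the database",
}
call_graph = {"load_user": ["db_connect"], "render_page": ["load_user"]}

hits, context = hybrid_lookup("database user record", docs, call_graph, top_k=1)
```

The key point is the second step: a pure vector search would return `load_user` alone, while the graph expansion also pulls in `db_connect`, which `load_user` depends on.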

🎥 Demo in Action

Watch Demo


✨ What Codetrace Can Do For You

Codetrace acts as a highly knowledgeable senior engineer on your project, capable of:

  • Autonomous Code Research: Ask a question, and the agent proactively uses tools to search, read files, and analyze the codebase to find the exact answer.
  • Structural Call Graph Mapping: Maps class and function definitions across 6+ languages (Python, JS, TS, Java, C++, Go) to show exactly how your application is wired.
  • Blast Radius Analysis: Analyzes the impact of code modifications before you make them, preventing unintended breakages.
  • Proposing Code Edits: Automatically writes and presents interactive code changes in the terminal, giving you a diff preview to approve or decline before saving.
  • Smart Delta Syncing: Detects changes via SHA-256 file hashing and re-indexes only the files you touched, making syncs lightning fast.
  • IDE Context Injection (MCP): Connects its powerful hybrid brain directly into Cursor, Windsurf, or Claude Code for in-editor assistance.
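The delta-sync idea above can be sketched with the standard library alone. The manifest format and function names here are illustrative assumptions, not Codetrace's real on-disk layout:

```python
import hashlib
import tempfile
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Hash the file's bytes; a changed hash means the file needs re-indexing."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def changed_files(root: Path, manifest: dict) -> list[Path]:
    """Return files whose hash differs from the stored manifest, updating it."""
    changed = []
    for path in sorted(root.rglob("*.py")):
        digest = file_sha256(path)
        if manifest.get(str(path)) != digest:
            changed.append(path)
            manifest[str(path)] = digest
    return changed

# Demo in a throwaway directory: the first pass sees everything as new,
# the second pass finds nothing to do because no hashes changed.
root = Path(tempfile.mkdtemp())
(root / "app.py").write_text("print('hello')")
manifest = {}
first_pass = changed_files(root, manifest)
second_pass = changed_files(root, manifest)
```

Because only hashes are compared, an unchanged thousand-file project costs one pass of hashing rather than a full re-embed.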

🔒 100% Local and Air-Gapped Capable

Codetrace-ai is built with a Privacy-First architecture. All parsing, embedding, and graph mapping happens directly on your machine.

It can be used 100% offline with zero internet connection, provided the following conditions are met:

  1. Local LLM Required: You must configure a local Large Language Model via a provider like Ollama (e.g., llama3.2 or deepseek-coder).
  2. Initial Model Download: By default, the system requires a brief initial internet connection to download local HuggingFace embedding models (bge-small and e5-small).
  3. True Air-Gapped Setup: If your machine has absolute zero internet access, you must:
    • Run codetrace init once on a connected machine to cache the embedding models.
    • Transfer the cache folder (~/.cache/huggingface/hub on Mac/Linux or C:\Users\<username>\.cache\huggingface\hub on Windows) to the offline machine via USB.
  4. Offline Flag: You must explicitly pass the --offline flag to strictly block internal telemetry and external requests (codetrace init --offline and codetrace chat --offline).
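The cache folder mentioned in step 3 follows the standard HuggingFace hub layout, which can be resolved programmatically. This helper is a sketch, not part of Codetrace's CLI; it assumes the default HuggingFace convention (an optional `HF_HOME` override, otherwise `~/.cache/huggingface/hub`):

```python
import os
from pathlib import Path

def hf_hub_cache() -> Path:
    """Locate the HuggingFace hub model cache referenced in step 3 above.

    Respects the HF_HOME environment variable if set; otherwise falls back
    to ~/.cache/huggingface/hub (on Windows, ~ expands to C:\\Users\\<username>).
    """
    hf_home = os.environ.get("HF_HOME")
    base = Path(hf_home) if hf_home else Path.home() / ".cache" / "huggingface"
    return base / "hub"

cache = hf_hub_cache()
```

Copying this directory to the same location on the offline machine is all the transfer step requires.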

🚀 Installation

Requires Python 3.10+

pip install codetrace-ai

⚡ Quick Start

The ultimate frictionless setup. Navigate to any local codebase and type:

cd /path/to/your/project
codetrace init
codetrace index .   # run once at the start, then only after adding new files
codetrace chat

(Note: codetrace init configures your provider, downloads models, indexes your code, and registers the MCP server in one go!)


🛠️ CLI Command Reference

Below is a quick reference of all available Codetrace commands, with a one-line description of each:

  • codetrace init — Configures the LLM provider, downloads models, indexes the codebase, and registers MCP.
  • codetrace chat — Launches the interactive autonomous AI chat loop for your codebase.
  • codetrace index <PATH> — Forces a re-scan of a local folder or clones and indexes a GitHub URL.
  • codetrace config — View or update your LLM provider and API key configuration.
  • codetrace visualize — Generates and opens an interactive HTML graph map of your code architecture.
  • codetrace history — Lists all past architectural chat sessions for the current project.
  • codetrace export <ID> — Exports a specific chat session to the terminal or saves it as a Markdown file.

🔌 IDE Integration (MCP)

Codetrace-ai acts as a Model Context Protocol (MCP) server. Running codetrace init automatically installs this configuration into Cursor and Claude Code. You do not need to configure anything manually.

Once connected, your IDE gains access to specialized tools: search_codebase (semantic search) • get_symbol_relations (call graphs) • analyze_impact (blast radius) • write_file (propose edits) • git_diff
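The idea behind the analyze_impact (blast radius) tool can be sketched as a reverse reachability walk over the call graph: everything that transitively calls a changed symbol is potentially impacted. The graph shape and names below are illustrative, not Codetrace's internal representation:

```python
from collections import deque

def blast_radius(call_graph, changed):
    """BFS over reversed call edges: anything that (transitively)
    calls the changed symbol is potentially impacted."""
    reverse = {}
    for caller, callees in call_graph.items():
        for callee in callees:
            reverse.setdefault(callee, set()).add(caller)
    impacted, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for caller in reverse.get(node, ()):
            if caller not in impacted:
                impacted.add(caller)
                queue.append(caller)
    return impacted

# Hypothetical call graph: handler -> service -> db_query, cron_job -> db_query
graph = {
    "handler": ["service"],
    "service": ["db_query"],
    "cron_job": ["db_query"],
}
impact = blast_radius(graph, "db_query")
```

Changing `db_query` flags not just its direct callers (`service`, `cron_job`) but also `handler`, which reaches it only through `service`.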

Using Windsurf? Windsurf does not support auto-registration yet. You can add Codetrace manually.

  1. Open your mcp.json file in Windsurf.
  2. Add the following to your mcpServers block:
"codetrace": {
  "command": "python",
  "args": [
    "/absolute/path/to/your/project/codetrace_mcp/server.py",
    "--project",
    "/absolute/path/to/your/project"
  ]
}

📂 File Structure

After initialization, your project will look like this:

your-project/
├── .codetrace/            ← created by codetrace
│   ├── chroma/            ← vector embeddings (ChromaDB)
│   ├── graph_metadata.db  ← code graph (SQLite)
│   └── chat_history.db    ← chat sessions (SQLite)
├── src/
├── ...
└── your code files

(Global config is stored in ~/.codetrace/config.json)

Run codetrace chat to start the interactive AI chat loop for your codebase. Once started, you can ask questions about your code and the AI will use its tools to find the answers and present them conversationally. Use the /clear command to start a new session without exiting the chat. On first use, ask: inspect_index to verify that all files were indexed correctly, or ask: visualize to see the call graph of your code.
