Interactive Diagrams for Code

CodeBoarding

Website · Discord · GitHub

CodeBoarding generates interactive architectural diagrams from any codebase using static analysis + LLM agents. It's built for developers and AI agents that need to understand large, complex systems quickly.

  • Extracts modules and relationships via control flow graph analysis (LSP-based, no runtime required)
  • Builds layered abstractions with an LLM agent (OpenAI, Anthropic, Google Gemini, Ollama, and more)
  • Outputs Mermaid.js diagrams ready for docs, IDEs, and CI/CD pipelines

Supported languages: Python · TypeScript · JavaScript · Java · Go · PHP
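For illustration only, a generated Mermaid diagram might look like the following (the components shown here are invented stand-ins, not real CodeBoarding output):

```mermaid
graph TD
    CLI["CLI"] --> Generator["DiagramGenerator"]
    Generator --> StaticAnalysis["Static Analysis (LSP)"]
    Generator --> Agent["LLM Agent"]
    Agent --> Diagrams["Mermaid Diagrams"]
```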


Requirements

  • Python 3.12 or 3.13 — other versions are currently not supported.

Installation

The recommended way to install the CLI is with pipx, which automatically creates an isolated environment:

pipx install codeboarding --python python3.12

Alternatively, install into an existing virtual environment with pip:

pip install codeboarding

Installing into the global Python environment with pip is not recommended — it can cause dependency conflicts and will fail if the system Python is not 3.12 or 3.13.

Language server binaries are downloaded automatically on first use. To pre-install them explicitly (useful in CI or restricted environments):

codeboarding-setup

npm is required for the Python, TypeScript, JavaScript, and PHP language servers. If npm is not found, it is installed automatically during setup. Binaries are stored in ~/.codeboarding/servers/ and shared across all projects.
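If you need to locate or clean the shared binary cache, its path can be computed portably (a trivial sketch; the directory layout inside servers/ is not documented here):

```python
from pathlib import Path

# Language-server binaries live in a per-user cache shared by all projects.
servers_dir = Path.home() / ".codeboarding" / "servers"
print(servers_dir)
```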


Quick Start

CLI

# Analyze a local repository (output goes to /path/to/repo/.codeboarding/)
codeboarding full --local /path/to/repo

# Analyze a remote GitHub repository (cloned to cwd/repo_name/, output to cwd/repo_name/.codeboarding/)
codeboarding full https://github.com/user/repo

Python API

import json
from pathlib import Path
from diagram_analysis import DiagramGenerator, configure_models
from diagram_analysis.analysis_json import parse_unified_analysis

# Pass the key programmatically — shell env vars always take precedence if already set.
# Use the env-var name for whichever provider you want:
#   OPENAI_API_KEY, ANTHROPIC_API_KEY, GOOGLE_API_KEY, OLLAMA_BASE_URL, …
configure_models(api_keys={"OPENAI_API_KEY": "sk-..."})

repo_path = Path("/path/to/repo")
output_dir = repo_path / ".codeboarding"
output_dir.mkdir(parents=True, exist_ok=True)

# Generate the architectural diagram
generator = DiagramGenerator(
    repo_location=repo_path,
    temp_folder=output_dir,
    repo_name="my-project",
    output_dir=output_dir,
    depth_level=1,
)
[analysis_path] = generator.generate_analysis()

# Read and inspect the results
with open(analysis_path) as f:
    data = json.load(f)

root, sub_analyses = parse_unified_analysis(data)

print(root.description)
for comp in root.components:
    print(f"  {comp.name}: {comp.description}")
    if comp.component_id in sub_analyses:
        for sub in sub_analyses[comp.component_id].components:
            print(f"    └ {sub.name}")
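The parsed component tree maps naturally onto Mermaid source. The sketch below uses a stand-in dataclass rather than CodeBoarding's own component type (an assumption; the real objects expose at least `name`, `description`, and `component_id`, as the example above shows):

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    component_id: str
    name: str
    depends_on: list = field(default_factory=list)

def to_mermaid(components):
    # Emit one node per component and one edge per dependency.
    lines = ["graph TD"]
    for c in components:
        lines.append(f'    {c.component_id}["{c.name}"]')
        lines += [f"    {c.component_id} --> {dep}" for dep in c.depends_on]
    return "\n".join(lines)

components = [
    Component("cli", "CLI", ["generator"]),
    Component("generator", "DiagramGenerator"),
]
print(to_mermaid(components))
```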

Configuration

LLM provider keys and model overrides are stored in ~/.codeboarding/config.toml, created automatically on first run:

# ~/.codeboarding/config.toml

[provider]
# Uncomment exactly one provider key
# openai_api_key    = "sk-..."
# anthropic_api_key = "sk-ant-..."
# google_api_key    = "AIza..."
# ollama_base_url   = "http://localhost:11434"

[llm]
# Optional: override the default model for your active provider
# agent_model   = "gemini-3-flash"
# parsing_model = "gemini-3-flash"

Shell environment variables (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.) always take precedence over the config file, so CI/CD pipelines need no changes. For private repositories, set GITHUB_TOKEN in your environment.

Tip: Google Gemini 3 Pro consistently produces the best diagram quality for complex codebases.


CLI Reference

codeboarding full [REPO_URL ...]           # remote: clone + analyze
codeboarding full --local PATH             # local: analyze in-place
codeboarding incremental --local PATH      # re-analyze only changed parts
codeboarding partial --local PATH --component-id ID   # update one component
| Option | Description |
| --- | --- |
| `--local PATH` | Analyze a local repository (output: `PATH/.codeboarding/`) |
| `--depth-level INT` | Diagram depth (default: 1) |
| `--force` | (`full` only) Force full reanalysis, skipping cached static analysis |
| `--base-ref REF` / `--target-ref REF` | (`incremental` only) Git refs to diff |
| `--component-id ID` | (`partial` only) ID of the component to update |
| `--binary-location PATH` | Custom path to language-server binaries (overrides `~/.codeboarding/servers/`) |
| `--upload` | (`full`, remote only) Upload results to the GeneratedOnBoardings repo |
| `--enable-monitoring` | Enable run monitoring |

Integrations

  • VS Code Extension — browse diagrams directly in your IDE
  • GitHub Action — generate docs on every push
  • MCP Server — serve concise architecture docs to AI coding assistants (Claude Code, Cursor, etc.)
