Interactive Diagrams for Code

CodeBoarding

Website · Discord · GitHub

CodeBoarding generates interactive architectural diagrams from any codebase using static analysis + LLM agents. It's built for developers and AI agents that need to understand large, complex systems quickly.

  • Extracts modules and relationships via control flow graph analysis (LSP-based, no runtime required)
  • Builds layered abstractions with an LLM agent (OpenAI, Anthropic, Google Gemini, Ollama, and more)
  • Outputs Mermaid.js diagrams ready for docs, IDEs, and CI/CD pipelines

Supported languages: Python · TypeScript · JavaScript · Java · Go · PHP
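
For a concrete sense of the output format, here is an illustrative sketch of how a component list might be rendered as a Mermaid.js flowchart. The component names and the `to_mermaid` helper are made up for this example; CodeBoarding's real output is richer:

```python
# Hypothetical components and relationships, as static analysis might extract them.
components = ["CLI", "DiagramGenerator", "StaticAnalyzer", "LLMAgent"]
edges = [
    ("CLI", "DiagramGenerator"),
    ("DiagramGenerator", "StaticAnalyzer"),
    ("DiagramGenerator", "LLMAgent"),
]

def to_mermaid(components, edges):
    """Render a minimal Mermaid.js flowchart definition as a string."""
    lines = ["graph TD"]
    lines += [f"    {name}[{name}]" for name in components]
    lines += [f"    {src} --> {dst}" for src, dst in edges]
    return "\n".join(lines)

print(to_mermaid(components, edges))
```

The resulting text block can be pasted into any Mermaid-aware renderer (GitHub markdown, docs sites, IDE plugins).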


Requirements

  • Python 3.12 or 3.13 — other versions are currently not supported.

Installation

The recommended way to install the CLI is with pipx, which automatically creates an isolated environment:

pipx install codeboarding --python python3.12

Alternatively, install into an existing virtual environment with pip:

pip install codeboarding

Installing into the global Python environment with pip is not recommended — it can cause dependency conflicts and will fail if the system Python is not 3.12 or 3.13.

Language server binaries are downloaded automatically on first use. To pre-install them explicitly (useful in CI or restricted environments):

codeboarding-setup

npm is required for the Python, TypeScript, JavaScript, and PHP language servers. If npm is not found, it is installed automatically during setup. Binaries are stored in ~/.codeboarding/servers/ and shared across all projects.


Quick Start

CLI

# Analyze a local repository (output goes to /path/to/repo/.codeboarding/)
codeboarding --local /path/to/repo

# Analyze a remote GitHub repository (cloned to cwd/repo_name/, output to cwd/repo_name/.codeboarding/)
codeboarding https://github.com/user/repo
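
The output locations above follow a simple convention. As a sketch of that convention only (plain `pathlib`, not CodeBoarding's internals), the `.codeboarding/` directory for each mode can be derived like this:

```python
from pathlib import Path

def output_dir_for_local(repo_path: str) -> Path:
    # --local mode: results land inside the analyzed repository.
    return Path(repo_path) / ".codeboarding"

def output_dir_for_remote(repo_url: str, cwd: str = ".") -> Path:
    # Remote mode: the repo is cloned to cwd/<repo_name>/, results go inside it.
    repo_name = repo_url.rstrip("/").rsplit("/", 1)[-1].removesuffix(".git")
    return Path(cwd) / repo_name / ".codeboarding"

print(output_dir_for_local("/path/to/repo"))                  # /path/to/repo/.codeboarding
print(output_dir_for_remote("https://github.com/user/repo"))  # repo/.codeboarding
```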

Python API

import json
from pathlib import Path
from diagram_analysis import DiagramGenerator, configure_models
from diagram_analysis.analysis_json import parse_unified_analysis

# Pass the key programmatically — shell env vars always take precedence if already set.
# Use the env-var name for whichever provider you want:
#   OPENAI_API_KEY, ANTHROPIC_API_KEY, GOOGLE_API_KEY, OLLAMA_BASE_URL, …
configure_models(api_keys={"OPENAI_API_KEY": "sk-..."})

repo_path = Path("/path/to/repo")
output_dir = repo_path / ".codeboarding"
output_dir.mkdir(parents=True, exist_ok=True)

# Generate the architectural diagram
generator = DiagramGenerator(
    repo_location=repo_path,
    temp_folder=output_dir,
    repo_name="my-project",
    output_dir=output_dir,
    depth_level=1,
)
[analysis_path] = generator.generate_analysis()

# Read and inspect the results
with open(analysis_path) as f:
    data = json.load(f)

root, sub_analyses = parse_unified_analysis(data)

print(root.description)
for comp in root.components:
    print(f"  {comp.name}: {comp.description}")
    if comp.component_id in sub_analyses:
        for sub in sub_analyses[comp.component_id].components:
            print(f"    └ {sub.name}")

Configuration

LLM provider keys and model overrides are stored in ~/.codeboarding/config.toml, created automatically on first run:

# ~/.codeboarding/config.toml

[provider]
# Uncomment exactly one provider key
# openai_api_key    = "sk-..."
# anthropic_api_key = "sk-ant-..."
# google_api_key    = "AIza..."
# ollama_base_url   = "http://localhost:11434"

[llm]
# Optional: override the default model for your active provider
# agent_model   = "gemini-3-flash"
# parsing_model = "gemini-3-flash"

Shell environment variables (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.) always take precedence over the config file, so CI/CD pipelines need no changes. For private repositories, set GITHUB_TOKEN in your environment.

Tip: Google Gemini 3 Pro consistently produces the best diagram quality for complex codebases.


CLI Reference

codeboarding [REPO_URL ...]           # remote: clone + analyze
codeboarding --local PATH             # local: analyze in-place
Option                       Description
--local PATH                 Analyze a local repository (output: PATH/.codeboarding/)
--depth-level INT            Diagram depth (default: 1)
--incremental                Smart incremental update (only re-analyze changed files)
--full                       Force a full reanalysis, skipping incremental detection
--partial-component-id ID    Update a single component by its ID
--binary-location PATH       Custom path to language server binaries (overrides ~/.codeboarding/servers/)
--upload                     Upload results to the GeneratedOnBoardings repo (remote only)
--enable-monitoring          Enable run monitoring

Integrations

  • VS Code Extension — browse diagrams directly in your IDE
  • GitHub Action — generate docs on every push
  • MCP Server — serve concise architecture docs to AI coding assistants (Claude Code, Cursor, etc.)



Download files

Download the file for your platform.

Source Distribution

codeboarding-0.10.0.tar.gz (254.4 kB)

Built Distribution


codeboarding-0.10.0-py3-none-any.whl (306.6 kB)

File details

Details for the file codeboarding-0.10.0.tar.gz.

File metadata

  • Download URL: codeboarding-0.10.0.tar.gz
  • Upload date:
  • Size: 254.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.11

File hashes

Hashes for codeboarding-0.10.0.tar.gz:

  • SHA256: 172a850cd21b97da75ae066b02ba30bab9da419b8d5975b5ae8e4896cb09b7ef
  • MD5: 53862a6241b7321c6644f36cc0155a67
  • BLAKE2b-256: 5c6e4f650e01419e570b886e4dddb47a9f6c4edea0069acbe7565a87415b4a40


File details

Details for the file codeboarding-0.10.0-py3-none-any.whl.

File metadata

  • Download URL: codeboarding-0.10.0-py3-none-any.whl
  • Upload date:
  • Size: 306.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.11

File hashes

Hashes for codeboarding-0.10.0-py3-none-any.whl:

  • SHA256: 167c9c1b322b6d35d89ac1f42d5982daad9742d4983e3ea8551788df8f1808c1
  • MD5: 379bdff4da6beb1bf6dee6e45fe245d0
  • BLAKE2b-256: ca2f689c116ad4d02d5682d8c0e331e046268a5e3422014da74d0360c3a3d4e4
