
⚒ RepoForge

AI-powered code analysis tool that generates technical documentation and AI agent skills from any codebase — works with any LLM.

PyPI · Python 3.10+ · License: MIT


What it does

RepoForge analyzes your codebase and generates two types of output:

1. repoforge docs — Technical Documentation

Generates a complete Docsify-ready documentation site adapted to your project type:

| Project type | Specific chapters |
| --- | --- |
| Web service | Data Models · API Reference |
| Frontend SPA | Components · State Management |
| CLI tool | Commands · Configuration |
| Data science | Data Pipeline · Models & Training · Experiments |
| Library/SDK | Public API · Integration Guide |
| Mobile app | Screens & Navigation · Native Integrations |
| Infra/DevOps | Resources · Variables · Deployment Guide |
| Monorepo | Global chapters + per-layer subdocs (each layer gets its own set) |

All types share: Overview · Quick Start · Architecture · Core Mechanisms · Dev Guide

Output is a docs/ folder ready for GitHub Pages — zero extra config.
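
Docsify sites are driven by a single index.html that loads the markdown chapters at runtime, which is why no build step is needed. As a rough illustration (the generated file may differ), a minimal Docsify entry point of the kind such a docs/ folder contains looks like this:

```html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <div id="app"></div>
    <script>
      // Docsify renders the markdown files client-side; no build step.
      window.$docsify = { name: 'My Project', loadSidebar: true };
    </script>
    <script src="//cdn.jsdelivr.net/npm/docsify@4"></script>
  </body>
</html>
```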

2. repoforge skills — AI Agent Skills

Generates SKILL.md and AGENT.md compatible with:

  • Claude Code (.claude/skills/, .claude/agents/)
  • OpenCode (mirrored to .opencode/)
  • agent-teams-lite (skill registry at .atl/skill-registry.md)
  • Gentleman-Skills format (YAML frontmatter, Trigger:, Critical Patterns)
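
The exact field set depends on the target format. As a rough sketch of the Gentleman-Skills style named above (the skill name, description, and pattern text here are invented for illustration):

```markdown
---
name: backend-auth
description: How to work safely in the authentication module
---

Trigger: when the task touches login, sessions, or tokens.

Critical Patterns:
- Validate tokens server-side before trusting any claim.
- Never log raw credentials or session secrets.
```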

Installation

pip install repoforge-ai

Note: the PyPI package name is repoforge-ai (the name repoforge was already taken), but the installed CLI command is still repoforge.

Recommended: install ripgrep for 10-100x faster scanning:

brew install ripgrep          # macOS
sudo apt install ripgrep      # Ubuntu/Debian
scoop install ripgrep         # Windows

Quick start

# Generate docs (auto-detects language, Docsify-ready)
repoforge docs -w /path/to/repo --lang Spanish -o docs

# Preview locally (opens browser automatically)
repoforge docs -w . --serve

# Generate SKILL.md + AGENT.md for Claude Code
repoforge skills -w /path/to/repo

# Open skills browser
repoforge skills --serve-only

# See what would be generated (no LLM calls, free)
repoforge docs --dry-run
repoforge skills --dry-run

Model setup

RepoForge auto-detects your API key from env vars and picks the best available model.

GitHub Models — free with GitHub Copilot ⭐

export GITHUB_TOKEN=$(gh auth token)
repoforge docs -w . --model github/gpt-4o-mini --lang Spanish

Groq — free tier, very fast

export GROQ_API_KEY=gsk_...
repoforge docs -w . --model groq/llama-3.3-70b-versatile

Ollama — 100% local, free

ollama pull qwen2.5-coder:14b
repoforge docs -w . --model ollama/qwen2.5-coder:14b

Claude Haiku — cheap, ~$0.05/run

export ANTHROPIC_API_KEY=sk-ant-...
repoforge docs -w . --model claude-haiku-3-5

OpenAI

export OPENAI_API_KEY=sk-...
repoforge docs -w . --model gpt-4o-mini

docs command

repoforge docs [OPTIONS]

  -w, --working-dir DIR     Repo to analyze  [default: .]
  -o, --output-dir DIR      Output directory  [default: docs]
  --model TEXT              LLM model (auto from env if not set)
  --lang LANGUAGE           Documentation language  [default: English]
                            English|Spanish|French|German|Portuguese|
                            Chinese|Japanese|Korean|Russian|Italian|Dutch
  --name TEXT               Project name (auto-detected by default)
  --theme [vue|dark|buble]  Docsify theme  [default: vue]
  --serve                   Serve docs after generating (opens browser)
  --serve-only              Skip generation, serve existing docs
  --port INT                Server port  [default: 8000]
  --dry-run                 Plan only, no LLM calls, no files written
  -q, --quiet               Suppress progress output

Publish to GitHub Pages

RepoForge can publish docs automatically with GitHub Actions.

Version pinning for workflow stability

When using the RepoForge GitHub Action, pin to a tagged release instead of @main:

uses: JNZader/repoforge@v0.2.0

This keeps consumer workflows stable and reproducible. Update the tag explicitly when you want new behavior.

Safe defaults (recommended):

  1. Add .github/workflows/docs.yml to your repository.
  2. Push to main.
  3. By default, it runs in generate-only mode (deploy_mode=none) so it does not touch an existing Pages site.

To enable deploy (explicit opt-in):

  1. Set repository variable REPOFORGE_DOCS_DEPLOY_MODE to one of:
    • auto (if no live site -> deploy root, if live site -> deploy subpath)
    • main (force root deploy; may replace existing site)
    • subpath (deploy to /repoforge/ on gh-pages, preserving existing files)
  2. Set repository variable REPOFORGE_DOCS_CONFIRM_DEPLOY=true.
  3. Optional: set REPOFORGE_DOCS_SUBPATH_PREFIX (default: repoforge).

You can also run it manually from Actions (workflow_dispatch) with deploy_mode, confirm_deploy, and subpath_prefix inputs.

Note: subpath preservation uses gh-pages branch deploy. If your repo uses Pages build_type=workflow, the workflow will fall back to generate-only for safety.

Required Pages configuration by deploy mode:

| deploy_mode | Deployment mechanism | Required Pages setting |
| --- | --- | --- |
| none | Generate only (no publish) | Any |
| main | actions/deploy-pages (artifact) | Build and deployment: GitHub Actions |
| subpath | peaceiris/actions-gh-pages (branch, keep_files) | Deploy from a branch: gh-pages / (root) |
| auto | Chooses main or subpath | Must match chosen target (main => GitHub Actions, subpath => gh-pages) |

If auto selects subpath but Pages is configured as GitHub Actions, the subpath branch publish may succeed but not be publicly visible.

After deploy, your docs are available at:

https://<your-user>.github.io/<your-repo>/

If deployed in subpath mode:

https://<your-user>.github.io/<your-repo>/<subpath-prefix>/

Example: adding docs to a repo with existing Pages

Suppose your repo already has a live site at https://youruser.github.io/yourrepo/ and you want to add RepoForge docs without breaking it.

# 1. Copy the docs workflow to your repo
cp .github/workflows/docs.yml <your-repo>/.github/workflows/docs.yml

# 2. Set repo variables for safe subpath deploy
gh variable set REPOFORGE_DOCS_DEPLOY_MODE --body "auto" --repo youruser/yourrepo
gh variable set REPOFORGE_DOCS_CONFIRM_DEPLOY --body "true" --repo youruser/yourrepo
gh variable set REPOFORGE_DOCS_SUBPATH_PREFIX --body "docs" --repo youruser/yourrepo

# 3. Set Pages source to gh-pages branch (required for subpath mode)
#    Settings → Pages → Build and deployment → Deploy from a branch → gh-pages / (root)

# 4. Add the GH_MODELS_TOKEN secret (PAT with models:read scope)
gh secret set GH_MODELS_TOKEN --repo youruser/yourrepo

# 5. Push and let the workflow run
git add .github/workflows/docs.yml && git commit -m "ci: add repoforge docs" && git push

Result:

  • Your existing site stays at https://youruser.github.io/yourrepo/ (unchanged).
  • RepoForge docs appear at https://youruser.github.io/yourrepo/docs/.

Manual flow

Still supported if you prefer not to use the workflow:

repoforge docs -w . -o docs --lang English
git add docs/ && git commit -m "docs: generate documentation"
git push
# Settings → Pages → Source: /docs on main branch

skills command

repoforge skills [OPTIONS]

  -w, --working-dir DIR     Repo to analyze  [default: .]
  -o, --output-dir DIR      Output directory  [default: .claude]
  --model TEXT              LLM model
  --no-opencode             Skip mirroring to .opencode/
  --serve                   Open skills browser after generating
  --serve-only              Skip generation, open existing skills browser
  --port INT                Server port  [default: 8765]
  --dry-run                 Plan only, no LLM calls
  -q, --quiet               Suppress progress output

Output layout

.claude/
├── skills/
│   ├── backend/
│   │   ├── SKILL.md              ← layer-level skill
│   │   ├── auth/SKILL.md         ← per-module skill
│   │   └── reports/SKILL.md
│   └── frontend/
│       ├── SKILL.md
│       └── useGEELayers/SKILL.md
├── agents/
│   ├── orchestrator/AGENT.md     ← delegate-only orchestrator
│   ├── backend-agent/AGENT.md
│   └── frontend-agent/AGENT.md
└── SKILLS_INDEX.md

.opencode/                        ← identical mirror
.atl/skill-registry.md            ← agent-teams-lite registry
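
The layout above is just a directory convention, so it is easy to consume programmatically. As an illustration only (not RepoForge's own code), here is a sketch that walks such a tree and groups every SKILL.md by its top-level layer:

```python
# Sketch: index every SKILL.md under a .claude/skills tree like the one
# shown above. Illustrative only; RepoForge's implementation may differ.
from pathlib import Path
import tempfile

def index_skills(root: Path) -> dict[str, list[str]]:
    """Group SKILL.md files by their top-level layer directory."""
    index: dict[str, list[str]] = {}
    for skill in sorted(root.rglob("SKILL.md")):
        rel = skill.relative_to(root)
        index.setdefault(rel.parts[0], []).append(rel.as_posix())
    return index

# Build a tiny tree matching the layout above, then index it.
with tempfile.TemporaryDirectory() as tmp:
    skills = Path(tmp) / ".claude" / "skills"
    for rel in ["backend/SKILL.md", "backend/auth/SKILL.md", "frontend/SKILL.md"]:
        path = skills / rel
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text("# skill\n")
    idx = index_skills(skills)
```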

Monorepo support

Auto-detects layers from directory structure. Docs are generated hierarchically:

docs/
├── index.md              ← monorepo home + layer links
├── 01-overview.md        ← global tech stack, all layers
├── 03-architecture.md    ← how layers interact + Mermaid diagram
├── 06b-service-map.md    ← inter-service contracts
├── frontend/             ← classified as frontend_app
│   ├── index.md
│   ├── 05-components.md
│   └── 06-state.md
└── backend/              ← classified as web_service
    ├── index.md
    ├── 05-data-models.md
    └── 06-api-reference.md

repoforge.yaml — per-repo config

# repoforge.yaml (place in repo root)

# Override project name (default: from package.json / pyproject.toml)
project_name: "My App"

# Force project type (default: auto-detected)
# web_service | frontend_app | cli_tool | library_sdk | data_science
# mobile_app | desktop_app | infra_devops | monorepo | generic
project_type: web_service

# Override layer detection (default: auto from directory names)
layers:
  frontend: apps/web
  backend: apps/api
  shared: packages/shared

# Default language for docs
language: Spanish

# Default model
model: github/gpt-4o-mini
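
Once parsed, a config like this only needs light validation: unknown project types should fail fast, and missing keys fall back to defaults. A minimal sketch of that logic (the defaults and function are assumptions for illustration, not RepoForge's actual code):

```python
# Sketch: validate a parsed repoforge.yaml config dict. The allowed
# project types come from the list above; everything else is illustrative.
PROJECT_TYPES = {
    "web_service", "frontend_app", "cli_tool", "library_sdk", "data_science",
    "mobile_app", "desktop_app", "infra_devops", "monorepo", "generic",
}

DEFAULTS = {"language": "English", "project_type": None, "layers": {}}

def validate_config(cfg: dict) -> dict:
    """Merge user config over defaults and reject unknown project types."""
    merged = {**DEFAULTS, **cfg}
    pt = merged["project_type"]
    if pt is not None and pt not in PROJECT_TYPES:
        raise ValueError(f"unknown project_type: {pt!r}")
    return merged

cfg = validate_config({
    "project_name": "My App",
    "project_type": "web_service",
    "layers": {"frontend": "apps/web"},
})
```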

Python API

from repoforge import generate_artifacts, generate_docs

# Skills + agents
generate_artifacts(
    working_dir="/path/to/repo",
    output_dir=".claude",
    model="github/gpt-4o-mini",
    also_opencode=True,
)

# Documentation
generate_docs(
    working_dir="/path/to/repo",
    output_dir="docs",
    model="claude-haiku-3-5",
    language="Spanish",
)

How it works

1. SCAN   (free, no LLM) — detect layers, extract exports/imports, detect stack
2. PLAN   (free, no LLM) — select chapters by project type, rank modules
3. GENERATE (LLM calls)  — one call per chapter or skill
4. WRITE                 — Docsify-ready docs/ or .claude/ skills

The LLM only generates text. All structural analysis is deterministic.
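
The scan → plan → generate split above can be sketched in a few lines. Every function name and rule here is invented for illustration; the point is only that the first two stages are pure functions of the file list, and only the last stage would touch a model:

```python
# Sketch of the scan -> plan -> generate split described above.
def scan(files: list[str]) -> dict:
    """Deterministic: classify the repo from file names alone (toy rule)."""
    stack = "python" if any(f.endswith(".py") for f in files) else "unknown"
    return {"stack": stack, "modules": [f for f in files if not f.startswith("test")]}

def plan(analysis: dict) -> list[str]:
    """Deterministic: shared chapters plus type-specific ones."""
    chapters = ["overview", "quick-start", "architecture"]
    if analysis["stack"] == "python":
        chapters.append("public-api")
    return chapters

def generate(chapters: list[str], llm=None) -> dict[str, str]:
    """One call per chapter; a stub writer stands in when no model is set."""
    write = llm or (lambda c: f"# {c}\n(placeholder)")
    return {c: write(c) for c in chapters}

docs = generate(plan(scan(["app.py", "test_app.py"])))
```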


Cost estimate

| Model | ~Cost for a medium repo |
| --- | --- |
| GitHub Models (Copilot) | free |
| Groq | free (rate-limited) |
| Ollama (local) | free |
| Claude Haiku 3.5 | ~$0.05 |
| GPT-4o-mini | ~$0.04 |
| Claude Sonnet | ~$0.50 |
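
These are rough per-run figures. As an illustration of the arithmetic behind the Claude Haiku 3.5 row, here is a back-of-envelope sketch; the token counts and prices are assumptions for the example, not measurements from RepoForge:

```python
# Back-of-envelope sketch of a per-run cost estimate. Assumed list prices
# and token counts; adjust both for your model and repo size.
INPUT_PRICE_PER_M = 0.80   # USD per 1M input tokens (assumed)
OUTPUT_PRICE_PER_M = 4.00  # USD per 1M output tokens (assumed)

def run_cost(input_tokens: int, output_tokens: int) -> float:
    """Total USD cost for one generation run."""
    return ((input_tokens / 1e6) * INPUT_PRICE_PER_M
            + (output_tokens / 1e6) * OUTPUT_PRICE_PER_M)

# Assume a medium repo: ~30k prompt tokens and ~6k generated tokens total.
cost = run_cost(30_000, 6_000)
print(f"~${cost:.2f} per run")
```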

Supported stacks

Language-agnostic — tested with Python, TypeScript, JavaScript, Go, Java, Kotlin, Rust, Ruby, PHP, and any monorepo combination.


License

MIT


Inspired by CodeViewX · Skill format: Gentleman-Skills · Agent pattern: agent-teams-lite
