Autodoc - AI-Powered Code Intelligence
Autodoc is an AI-powered code intelligence tool that analyzes Python and TypeScript codebases and enables semantic search over them. It parses code using AST (Abstract Syntax Tree) analysis to extract functions, classes, and their relationships, then generates embeddings (local ChromaDB by default, or OpenAI) for intelligent code search.
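The AST extraction described above can be sketched with Python's built-in ast module. This is an illustrative sketch only; Autodoc's actual SimpleASTAnalyzer captures more metadata (signatures, relationships, etc.):

```python
import ast


def extract_entities(source: str) -> list:
    """Walk a module's AST and collect functions and classes."""
    entities = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            kind = "function"
        elif isinstance(node, ast.ClassDef):
            kind = "class"
        else:
            continue
        entities.append({
            "type": kind,
            "name": node.name,
            "line": node.lineno,
            "doc": ast.get_docstring(node),
        })
    return entities
```

Each entity dict carries just enough (name, line, docstring) to be embedded and searched later.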
Features
- Semantic Code Search - Search your codebase using natural language queries
- Python & TypeScript Support - Full AST analysis for both languages
- Comprehensive Analysis - Extract and analyze functions, classes, and their relationships
- AI-Powered - Optional OpenAI embeddings for enhanced search capabilities
- LLM Code Enrichment - Generate detailed descriptions using OpenAI, Anthropic/Claude, or Ollama
- Rich Documentation - Generate detailed codebase documentation in Markdown or JSON
- Fast & Efficient - Caches analysis results for quick repeated searches
- API Server - REST API for integration with other tools
- Graph Database - Neo4j integration for relationship visualization
- Easy Integration - Use as CLI tool or Python library
- Beautiful Output - Rich terminal UI with syntax highlighting
Quick Start
# Install from PyPI
pip install ai-code-autodoc
# Or install for development (requires uv)
git clone https://github.com/Emberfield/autodoc.git
cd autodoc
make setup
source .venv/bin/activate
Basic Usage
Command Line
# Quick workflow
autodoc analyze ./src # Analyze your codebase
autodoc generate # Create AUTODOC.md documentation
autodoc vector # Generate embeddings for search
autodoc search "auth logic" # Search with natural language
# LLM Enrichment (NEW!)
autodoc init # Create .autodoc.yaml config
autodoc enrich --limit 50 # Enrich code with AI descriptions
autodoc generate # Now includes enriched content!
# Additional commands
autodoc check # Check setup and configuration
autodoc graph --visualize # Build graph database with visualizations
autodoc serve # Start REST API server
Context Packs
Context packs group related code by feature for focused search and AI context:
# Auto-detect and suggest packs based on codebase structure
autodoc pack auto-generate --save
# List all defined packs
autodoc pack list
# Build pack with embeddings for semantic search
autodoc pack build auth --embeddings
# Build all packs with AI summaries (requires API key)
autodoc pack build --all --embeddings --summary
# Search within a specific pack
autodoc pack query auth "user login flow"
# See pack dependencies
autodoc pack deps auth --transitive
# Check what changed since last index
autodoc pack diff auth
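Conceptually, assigning files to a pack is glob matching against the pack's file patterns. A hypothetical stdlib sketch (Autodoc's own pattern handling may differ; the patterns below mirror the auth pack shown in the Configuration section):

```python
from fnmatch import fnmatch


def matches(path: str, pattern: str) -> bool:
    """Glob match that treats '**/' as 'zero or more directories'."""
    if fnmatch(path, pattern):
        return True
    # fnmatch has no native '**'; also try the pattern with '**/'
    # stripped, so 'src/auth/**/*.py' matches files directly under
    # 'src/auth/' as well as in nested directories.
    return "**/" in pattern and fnmatch(path, pattern.replace("**/", ""))


def files_in_pack(pack_patterns, repo_files):
    """Return repo files matching any of the pack's glob patterns."""
    return [f for f in repo_files
            if any(matches(f, p) for p in pack_patterns)]


auth_patterns = ["src/auth/**/*.py", "api/routes/auth.py"]
```

With these patterns, both `src/auth/models.py` and `src/auth/tokens/jwt.py` land in the auth pack, while unrelated modules are left out.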
Impact Analysis
Analyze how file changes affect your codebase:
# Analyze impact of changed files
autodoc impact api/auth.py api/users.py --json
# Check pack indexing status
autodoc pack status
MCP Server
Autodoc includes an MCP (Model Context Protocol) server for AI assistant integration:
# Start MCP server
autodoc mcp-server
Available MCP Tools:
- pack_list - List all context packs
- pack_info - Get details about a pack
- pack_query - Semantic search within a pack
- pack_files - List files in a pack
- pack_entities - List code entities in a pack
- impact_analysis - Analyze file change impact
- pack_status - Get indexing status
- pack_deps - Get pack dependencies
- pack_diff - Check what changed since last index
Python API
from autodoc import SimpleAutodoc
import asyncio

async def main():
    # Initialize autodoc
    autodoc = SimpleAutodoc()

    # Analyze a directory
    summary = await autodoc.analyze_directory("./src")
    print(f"Found {summary['total_entities']} code entities")

    # Search with natural language
    results = await autodoc.search("validation logic", limit=5)
    for result in results:
        print(f"{result['entity']['name']} - {result['similarity']:.2f}")

asyncio.run(main())
Configuration
Create a .autodoc.yaml file in your project root:
# LLM provider settings
llm:
  provider: anthropic  # or openai, ollama
  model: claude-sonnet-4-20250514
  temperature: 0.3

# Embeddings - use chromadb for free local embeddings
embeddings:
  provider: chromadb
  chromadb_model: all-MiniLM-L6-v2
  dimensions: 384

# Cost controls for LLM operations
cost_control:
  summary_model: claude-3-haiku-20240307  # Cheaper model for summaries
  warn_entity_threshold: 100              # Warn on large packs
  cache_summaries: true                   # Cache to avoid regenerating
  dry_run_by_default: false

# Context packs for feature-based code grouping
context_packs:
  - name: auth
    display_name: Authentication
    description: User authentication and authorization
    files:
      - "src/auth/**/*.py"
      - "api/routes/auth.py"
    security_level: critical
    tags: [security, core]
API Keys (Optional)
For LLM-powered features (enrichment, summaries):
# Anthropic (recommended for summaries)
export ANTHROPIC_API_KEY=sk-ant-...
# Or OpenAI
export OPENAI_API_KEY=sk-...
Note: Embeddings use local ChromaDB by default - no API key needed for semantic search!
Development
Prerequisites
First, install uv - the fast Python package manager:
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# Or via Homebrew
brew install uv
Setup Development Environment
# Clone repository
git clone https://github.com/Emberfield/autodoc.git
cd autodoc
# Setup environment with uv
make setup
# Activate virtual environment
source .venv/bin/activate
# Run tests
make test
# Format code
make format
# Build package
make build
Available Make Commands
make help # Show all available commands
make setup # Setup development environment with uv
make setup-graph # Setup with graph dependencies
make analyze # Analyze current directory
make search QUERY="your search" # Search code
make test # Run all tests
make test-core # Run core tests only
make test-graph # Run graph tests only
make lint # Check code quality
make format # Format code
make build # Build package
# Graph commands (require graph dependencies)
make build-graph # Build code relationship graph
make visualize-graph # Create graph visualizations
make query-graph # Query graph insights
# Quick workflows
make dev # Quick development setup
make dev-graph # Development setup with graph features
Publishing & Deployment
Autodoc is published to PyPI with automated releases via GitHub Actions:
# Build package locally
make build
# Create a GitHub release to trigger automatic PyPI publish
# Or manually trigger the workflow from GitHub Actions
The package is available at pypi.org/project/ai-code-autodoc.
Architecture
Core Components
- SimpleASTAnalyzer - Parses Python files using AST to extract code entities
- OpenAIEmbedder - Handles embedding generation for semantic search
- SimpleAutodoc - Main orchestrator combining analysis and search
- CLI Interface - Rich command-line interface built with Click
Data Flow
- Analysis Phase: Python files → AST parsing → CodeEntity objects → Optional embeddings → Cache
- Search Phase: Query → Embedding (if available) → Similarity computation → Ranked results
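The similarity computation in the search phase boils down to comparing the query embedding against each cached entity embedding and ranking the results. A minimal cosine-similarity sketch (illustrative only; the short vectors stand in for real embedding vectors):

```python
import math


def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def rank(query_vec, entities, limit=5):
    """Score cached entities against the query and return the top matches."""
    scored = [(cosine_similarity(query_vec, e["embedding"]), e["name"])
              for e in entities]
    return sorted(scored, reverse=True)[:limit]
```

Ranking by cosine similarity is what makes a natural-language query like "auth logic" surface `authenticate` even though the word "logic" never appears in the code.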
Advanced Features
Generate Comprehensive Documentation
# Generate markdown documentation
autodoc generate-summary --format markdown --output codebase-docs.md
# Generate JSON for programmatic use
autodoc generate-summary --format json --output codebase-data.json
Code Graph Analysis (Optional)
With additional dependencies, you can build and query a code relationship graph:
# Setup with graph dependencies
make setup-graph
source .venv/bin/activate
# Build graph (requires Neo4j running)
autodoc build-graph --clear
# Create visualizations
autodoc visualize-graph --all
# Query insights
autodoc query-graph --all
# Or use make commands
make build-graph
make visualize-graph
make query-graph
Graph Dependencies
The graph features require additional packages:
- neo4j - Graph database driver
- matplotlib - Static graph visualization
- networkx - Graph analysis
- plotly - Interactive visualizations
- pyvis - Interactive network graphs
Install them with: make setup-graph or uv sync --extra graph
Example Output
Search Results
Search Results for 'authentication'
| Type     | Name         | File            | Line | Similarity |
|----------|--------------|-----------------|------|------------|
| function | authenticate | auth/handler.py | 45   | 0.92       |
| class    | AuthManager  | auth/manager.py | 12   | 0.87       |
| function | check_token  | auth/tokens.py  | 78   | 0.83       |
Analysis Summary
Analysis Summary:
files_analyzed: 42
total_entities: 237
functions: 189
classes: 48
has_embeddings: True
Contributing
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Make your changes
- Run tests (make test)
- Format code (make format)
- Commit changes (git commit -m 'Add amazing feature')
- Push to branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
- Issues: GitHub Issues
- Documentation: CLAUDE.md for AI assistant guidance
- PyPI Package: ai-code-autodoc