# Wikis for nerds, by nerds
## How it works
```mermaid
flowchart LR
    subgraph Input["Code Source"]
        Source1[GitHub Repository]
        Source2[Local Directory]
    end
    subgraph Pipeline["Salt Docs Pipeline"]
        Crawl[Crawl & Analyze]
        Identify[LLM Identify Abstractions]
        Generate[Generate Markdown Docs]
    end
    subgraph Output["Local Wiki"]
        Docs[Markdown Files]
        MCP[MCP Server]
    end
    subgraph Assistants["AI Assistants"]
        Cursor[Cursor]
        Claude[Claude]
        Continue[Continue]
    end
    Source1 --> Crawl
    Source2 --> Crawl
    Crawl --> Identify --> Generate --> Docs --> MCP
    MCP --> Cursor
    MCP --> Claude
    MCP --> Continue
```
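The crawl step filters files before anything is sent to the LLM. As a minimal sketch of how include/exclude patterns and a size cap might interact (the helper below is illustrative, assuming glob-style patterns; it is not the actual salt-docs implementation):

```python
# Illustrative sketch of the crawl step's pattern and size filtering.
# The function name and defaults are assumptions, not salt-docs internals.
from fnmatch import fnmatch

def should_crawl(path, size, include=("*",), exclude=(), max_size=100_000):
    """Return True if a file passes the size cap and pattern filters."""
    if size > max_size:
        return False
    if any(fnmatch(path, pat) for pat in exclude):
        return False
    return any(fnmatch(path, pat) for pat in include)

# Keep Python sources outside tests/, under the 100 KB default cap:
print(should_crawl("src/app.py", 2_048, include=("*.py",), exclude=("tests/*",)))       # True
print(should_crawl("tests/test_app.py", 512, include=("*.py",), exclude=("tests/*",)))  # False
```

Exclude patterns win over include patterns here, which matches the usual expectation that `--exclude "tests/*"` removes files even when they match `--include "*.py"`.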
## Installation

### Option 1: Install from PyPI

```bash
pip install salt-docs
```

### Option 2: Install from source

```bash
git clone https://github.com/usesalt/salt-docs.git
cd salt-docs
pip install -e .
```
## Quick Start

### 1. Initial Setup

Run the setup wizard to configure your API keys and preferences:

```bash
salt-docs init
```

### 2. Generate Documentation

```bash
# Analyze a GitHub repository
salt-docs run https://github.com/username/repo

# Analyze a local directory
salt-docs run /path/to/your/codebase

# With custom options
salt-docs run https://github.com/username/repo --output /custom/path --language spanish --max-abstractions 10
```
## Configuration

Salt Docs stores configuration in a per-user config file and uses your system's keyring for secure API key storage.

- macOS/Linux: `~/.config/saltdocs/config.json` (or `$XDG_CONFIG_HOME/saltdocs/config.json`)
- Windows: `%APPDATA%\saltdocs\config.json`
### Configuration Options

- `llm_provider`: LLM provider to use (gemini, openai, anthropic, openrouter, ollama) - default: gemini
- `llm_model`: Model name to use (e.g., "gemini-2.5-flash", "gpt-4o-mini", "claude-3-5-sonnet-20241022") - default: gemini-2.5-flash
- `output_dir`: Default output directory
- `language`: Default language for generated docs
- `max_abstractions`: Default number of abstractions to identify
- `max_file_size`: Maximum file size in bytes
- `use_cache`: Enable/disable LLM response caching
- `include_patterns`: Default file patterns to include
- `exclude_patterns`: Default file patterns to exclude
- `ollama_base_url`: Custom Ollama base URL (optional, default: http://localhost:11434)
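Assuming the option names above map directly to JSON keys (the values shown are illustrative defaults, not guaranteed to match the file salt-docs actually writes), a `config.json` might look like:

```json
{
  "llm_provider": "gemini",
  "llm_model": "gemini-2.5-flash",
  "output_dir": "~/docs/salt",
  "language": "english",
  "max_abstractions": 10,
  "max_file_size": 100000,
  "use_cache": true,
  "include_patterns": ["*.py", "*.js"],
  "exclude_patterns": ["tests/*"],
  "ollama_base_url": "http://localhost:11434"
}
```

API keys are not stored in this file; they live in your system keyring.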
### Managing Configuration

#### View Current Configuration

```bash
salt-docs config show
```

#### Update API Keys

```bash
# Update API key for any provider (interactive)
salt-docs config update-api-key gemini
salt-docs config update-api-key openai
salt-docs config update-api-key anthropic
salt-docs config update-api-key openrouter

# Legacy command (still works, redirects to update-api-key)
salt-docs config update-gemini-key

# Update GitHub token (interactive)
salt-docs config update-github-token

# Update GitHub token directly
salt-docs config update-github-token "your-token-here"
```

#### Update Other Settings

```bash
# Change LLM provider
salt-docs config set llm-provider openai

# Change LLM model
salt-docs config set llm-model gpt-4o-mini

# Change default language
salt-docs config set language spanish

# Change max abstractions
salt-docs config set max_abstractions 15

# Disable caching
salt-docs config set use_cache false

# Update output directory
salt-docs config set output_dir /custom/path
```
## MCP Server Setup

Salt Docs includes an MCP (Model Context Protocol) server that exposes your generated documentation to AI assistants in IDEs like Cursor, Continue.dev, and Claude Desktop.

### MCP Tools Available

The MCP server provides these tools:

- `list_docs` - List all available documentation files
- `get_docs` - Fetch the full content of a documentation file (by resource name or absolute path)
- `search_docs` - Full-text search across documentation (paths, names, and resource names)
- `index_directories` - Index directories for fast searching
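Under the hood, MCP tools are invoked via JSON-RPC 2.0 `tools/call` requests. As an illustration only (the `query` argument name is an assumption; ask the server for each tool's actual input schema via `tools/list`), a call to `search_docs` might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": { "query": "authentication flow" }
  }
}
```

Your IDE's MCP client sends these messages for you; you only ever phrase the request in natural language.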
### Setup Instructions

#### Cursor

1. Open or create your MCP configuration file:
   - macOS/Linux: `~/.cursor/mcp.json`
   - Windows: `%APPDATA%\Cursor\mcp.json`

2. Add the salt-docs server configuration:

   ```json
   {
     "mcpServers": {
       "salt-docs": {
         "command": "salt-docs",
         "args": ["mcp"]
       }
     }
   }
   ```

3. Restart Cursor to load the MCP server.

4. The AI assistant in Cursor can now access your documentation with prompts like:
   - "What documentation do we have?"
   - "Get me the documentation for the SALT project"
   - "Read the README documentation"
#### Claude Desktop

1. Open or create your Claude configuration file:
   - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
   - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
   - Linux: `~/.config/Claude/claude_desktop_config.json`

2. Add the salt-docs server configuration:

   ```json
   {
     "mcpServers": {
       "salt-docs": {
         "command": "salt-docs",
         "args": ["mcp"]
       }
     }
   }
   ```

3. Restart Claude Desktop to load the MCP server.
### Troubleshooting

- **Command not found**: Make sure `salt-docs` is in your PATH. You can verify by running `salt-docs --version` in your terminal.
- **Server not starting**: Ensure you've run `salt-docs init` and have generated at least one documentation project.
- **No docs found**: The MCP server discovers docs from your configured `output_dir`. Run `salt-docs config show` to check your output directory.
### Testing the MCP Server

You can test the MCP server directly:

```bash
salt-docs mcp
```

This starts the server in stdio mode (for MCP clients). To test locally, you can use the test scripts in the `tests/` directory.
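In stdio mode, a client talks to the server by writing newline-delimited JSON-RPC 2.0 messages to its stdin. A minimal sketch of that framing, assuming the standard MCP handshake (the protocol version and client info below are placeholders, not salt-docs specifics):

```python
# Sketch of MCP stdio message framing: one JSON-RPC object per line.
import json

def make_request(req_id, method, params=None):
    """Serialize a JSON-RPC 2.0 request as a newline-terminated line."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg) + "\n"

# The first message any MCP client sends is `initialize`:
line = make_request(0, "initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "demo-client", "version": "0.1"},
})
print(line, end="")
```

A real client would pipe these lines into the `salt-docs mcp` process and read responses line by line from its stdout.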
## LLM Provider Support

Salt Docs supports multiple LLM providers, allowing you to choose the best option for your needs:

### Supported Providers

- **Google Gemini** (default)
  - Recommended models: gemini-2.5-pro, gemini-2.5-flash, gemini-1.5-pro, gemini-1.5-flash
  - API key required: Yes (`GEMINI_API_KEY`)
- **OpenAI**
  - Recommended models: gpt-4o-mini, gpt-4.1-mini, gpt-5-mini, gpt-5-nano
  - API key required: Yes (`OPENAI_API_KEY`)
  - Supports o1 models with reasoning capabilities
- **Anthropic Claude**
  - Recommended models: claude-3-5-sonnet, claude-3-5-haiku, claude-3-7-sonnet (with extended thinking), claude-3-opus
  - API key required: Yes (`ANTHROPIC_API_KEY`)
- **OpenRouter**
  - Recommended models: google/gemini-2.5-flash:free, meta-llama/llama-3.1-8b-instruct:free, openai/gpt-4o-mini, anthropic/claude-3.5-sonnet
  - API key required: Yes (`OPENROUTER_API_KEY`)
  - Access multiple models through a single API
- **Ollama (Local)**
  - Recommended models: llama3.2, llama3.1, mistral, codellama, phi3
  - API key required: No (runs locally)
  - Default URL: http://localhost:11434
  - Perfect for privacy-sensitive projects or offline usage
### Switching Providers

You can switch between providers at any time:

```bash
# Switch to OpenAI
salt-docs config set llm-provider openai
salt-docs config set llm-model gpt-4o-mini
salt-docs config update-api-key openai

# Switch to Ollama (local)
salt-docs config set llm-provider ollama
salt-docs config set llm-model llama3.2
# No API key needed for Ollama!
```
## CLI Options

### Required

- `run` - GitHub repo URL, the currently open directory, or a local directory path
- `--repo` or `--dir` - GitHub repo URL or local directory path (deprecated)

### Optional

- `-n, --name` - Project name (derived from repo/directory if omitted)
- `-t, --token` - GitHub personal access token
- `-o, --output` - Output directory (overrides config default)
- `-i, --include` - File patterns to include (e.g., "*.py", "*.js")
- `-e, --exclude` - File patterns to exclude (e.g., "tests/*", "docs/*")
- `-s, --max-size` - Maximum file size in bytes (default: 100KB)
- `--language` - Language for generated docs (default: "english")
- `--no-cache` - Disable LLM response caching
- `--max-abstractions` - Maximum number of abstractions to identify (default: 10)