# MkDocs AI Summary Plugin
An intelligent MkDocs plugin that automatically generates AI-powered summaries for your documentation pages using multiple AI services including OpenAI, DeepSeek, Google Gemini, and GLM.
## Features
- 🤖 Multiple AI Services: Support for OpenAI, DeepSeek, Google Gemini, and GLM
- 🚀 Smart Caching: Intelligent caching system to reduce API calls and costs
- 🎯 Flexible Configuration: Fine-grained control over which pages get summaries
- 🌍 Multi-language Support: Generate summaries in different languages
- 🔧 CI/CD Ready: Seamless integration with GitHub Actions and other CI/CD systems
- 📱 Responsive Design: Beautiful summary cards that work on all devices
- ⚡ Performance Optimized: Minimal impact on build times with smart caching
## Installation

### From PyPI (Recommended)

```bash
pip install mkdocs-ai-summary-wcowin
```

### From Source

```bash
git clone https://github.com/Wcowin/Mkdocs-AI-Summary-Plus.git
cd Mkdocs-AI-Summary-Plus
pip install -e .
```
## Quick Start

### 1. Configure your MkDocs

Add the plugin to your `mkdocs.yml`:

```yaml
plugins:
  - ai-summary:
      ai_service: "deepseek"    # or "openai", "gemini", "glm"
      summary_language: "zh"    # or "en"
      cache_enabled: true
      cache_expire_days: 30
      enabled_folders:
        - "docs"
      exclude_patterns:
        - "**/api/**"
        - "**/reference/**"
```
### 2. Set up Environment Variables

Create a `.env` file in your project root:

```bash
# Choose one or more AI services
DEEPSEEK_API_KEY=your_deepseek_api_key
OPENAI_API_KEY=your_openai_api_key
GEMINI_API_KEY=your_gemini_api_key
GLM_API_KEY=your_glm_api_key
```
### 3. Build Your Documentation

```bash
mkdocs build
```
The plugin will automatically generate AI summaries for your pages and inject them into the content.
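Conceptually, the injection works the way any MkDocs plugin modifies page content: the page's Markdown is intercepted in a build hook and the summary is prepended. The sketch below illustrates the idea using MkDocs' `on_page_markdown` hook; it is not the plugin's actual implementation, and `generate_summary` is a hypothetical stand-in for the real AI call.

```python
# Sketch of summary injection via MkDocs' on_page_markdown plugin hook.
# generate_summary() is a hypothetical placeholder for the AI service call.

def generate_summary(markdown: str) -> str:
    # Placeholder: the real plugin would send the page content to the
    # configured AI service and return its summary.
    first_line = markdown.strip().splitlines()[0] if markdown.strip() else ""
    return f"Summary of: {first_line}"

def on_page_markdown(markdown: str, **kwargs) -> str:
    """Prepend an AI summary card (Material admonition) to the page."""
    summary = generate_summary(markdown)
    card = f'!!! info "AI Summary"\n\n    {summary}\n\n'
    return card + markdown
```

The `!!! info` block renders as a styled admonition card under the Material theme, which matches the "summary card" behavior described above.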
## Configuration

### Basic Configuration

```yaml
plugins:
  - ai-summary:
      # AI Service Configuration
      ai_service: "deepseek"        # Primary AI service
      fallback_services:            # Fallback services if primary fails
        - "openai"
        - "gemini"

      # Summary Configuration
      summary_language: "zh"        # Summary language (zh/en)
      summary_length: "medium"      # Summary length (short/medium/long)

      # Caching Configuration
      cache_enabled: true           # Enable caching
      cache_expire_days: 30         # Cache expiration in days

      # File Selection
      enabled_folders:              # Folders to process
        - "docs"
        - "guides"
      exclude_patterns:             # Patterns to exclude
        - "**/api/**"
        - "**/reference/**"
      exclude_files:                # Specific files to exclude
        - "index.md"
        - "404.md"

      # Environment Configuration
      local_enabled: true           # Enable in local development
      ci_enabled: true              # Enable in CI/CD
      ci_cache_only: false          # Only use cache in CI (no new API calls)
      ci_fallback_summary: true     # Use fallback summary in CI if no cache
```
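The four environment options interact: caching always wins, and the CI-only flags decide what happens when there is no cache entry. A minimal sketch of that decision logic, under the assumption that the options behave as their comments describe (this is illustrative, not the plugin's actual code):

```python
# Hypothetical decision logic for the environment options above.
# Returns what to do for one page: use the API, the cache, a fallback
# summary, or skip the page entirely.

def summary_action(in_ci: bool, cached: bool,
                   ci_enabled: bool = True,
                   ci_cache_only: bool = False,
                   ci_fallback_summary: bool = True) -> str:
    if in_ci and not ci_enabled:
        return "skip"            # plugin disabled in CI/CD
    if cached:
        return "cache"           # a valid cache entry always wins
    if in_ci and ci_cache_only:
        # No cache and new API calls are forbidden in CI:
        # either emit a fallback summary or skip the page.
        return "fallback" if ci_fallback_summary else "skip"
    return "api"                 # call the AI service normally
```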
### Advanced Configuration

```yaml
plugins:
  - ai-summary:
      # Custom API Endpoints
      custom_endpoints:
        deepseek:
          base_url: "https://api.deepseek.com"
          model: "deepseek-chat"
        openai:
          base_url: "https://api.openai.com/v1"
          model: "gpt-3.5-turbo"

      # Content Processing
      max_content_length: 8000      # Maximum content length for AI processing
      summary_position: "top"       # Position of summary (top/bottom)

      # Styling
      summary_style:
        theme: "material"           # Summary card theme
        show_icon: true             # Show AI service icon
        show_language: true         # Show summary language
```
## Environment Variables

### Required API Keys

| Variable | Description | Required |
|---|---|---|
| `DEEPSEEK_API_KEY` | DeepSeek API key | If using DeepSeek |
| `OPENAI_API_KEY` | OpenAI API key | If using OpenAI |
| `GEMINI_API_KEY` | Google Gemini API key | If using Gemini |
| `GLM_API_KEY` | GLM API key | If using GLM |
### Optional Configuration

| Variable | Description | Default |
|---|---|---|
| `AI_SUMMARY_DEBUG` | Enable debug logging | `false` |
| `AI_SUMMARY_TIMEOUT` | API request timeout (seconds) | `30` |
| `AI_SUMMARY_MAX_RETRIES` | Maximum API retry attempts | `3` |
## CI/CD Integration

### GitHub Actions

Add your API keys to GitHub Secrets and use them in your workflow:

```yaml
name: Deploy Documentation

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.x"

      - name: Install dependencies
        run: |
          pip install mkdocs-material mkdocs-ai-summary-wcowin

      - name: Build documentation
        env:
          DEEPSEEK_API_KEY: ${{ secrets.DEEPSEEK_API_KEY }}
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
        run: mkdocs build

      - name: Deploy to GitHub Pages
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./site
```
## AI Services

### Supported Services
| Service | Model | Languages | Rate Limits |
|---|---|---|---|
| DeepSeek | deepseek-chat | zh, en | High |
| OpenAI | gpt-3.5-turbo, gpt-4 | zh, en | Medium |
| Google Gemini | gemini-pro | zh, en | High |
| GLM | glm-4 | zh, en | Medium |
### Service Selection Strategy
- Primary Service: The main AI service specified in configuration
- Fallback Services: Used if primary service fails or is unavailable
- Automatic Retry: Built-in retry mechanism with exponential backoff
- Cost Optimization: Intelligent service selection based on content length
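The fallback-plus-backoff behavior described above can be sketched as a simple loop: try each service in order, retrying each with exponentially increasing delays before moving on. This is an illustrative implementation under stated assumptions, not the plugin's actual retry code; `request_fn` is a hypothetical callable wrapping the real API request.

```python
import time

def call_with_fallback(services, request_fn, max_retries=3, base_delay=1.0):
    """Try the primary service first, then each fallback in order.

    Each service is retried up to max_retries times with exponential
    backoff (base_delay, 2*base_delay, 4*base_delay, ...) before the
    next service is attempted.
    """
    last_error = None
    for service in services:
        for attempt in range(max_retries):
            try:
                return request_fn(service)
            except Exception as exc:  # e.g. rate limit or timeout
                last_error = exc
                if attempt < max_retries - 1:
                    time.sleep(base_delay * 2 ** attempt)
    raise RuntimeError(f"All services failed: {last_error}")
```

For example, with `services=["deepseek", "openai", "gemini"]`, a rate-limited DeepSeek would be retried with backoff, then OpenAI would be tried, and so on, matching the warning message shown in the Troubleshooting section.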
## Caching System

### How It Works
- Content Hashing: Each page's content is hashed to detect changes
- Service Configuration: Cache is invalidated when AI service settings change
- Expiration: Configurable cache expiration (default: 30 days)
- CI Optimization: Special caching behavior for CI/CD environments
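The first three points above amount to: hash the page content together with the service settings, and treat entries older than the expiry window as stale. A minimal sketch of that scheme (illustrative only; the plugin's actual cache format and key derivation may differ):

```python
import hashlib
import json
import time

def cache_key(content: str, service_config: dict) -> str:
    """Derive a cache key from page content plus AI service settings.

    Hashing both together means the entry is automatically invalidated
    when either the page changes or the service configuration changes.
    """
    payload = content + json.dumps(service_config, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def is_expired(created_at: float, expire_days: int = 30) -> bool:
    """True if a cache entry is older than the configured expiry window."""
    return (time.time() - created_at) > expire_days * 86400
```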
### Cache Management

```bash
# Clear all cache
rm -rf .ai_cache/

# Expired entries are cleared automatically during the build;
# no manual action is needed.
```
## Troubleshooting

### Common Issues

**1. API Key Not Found**

```
Error: No valid API key found for service 'deepseek'
```

Solution: Ensure your API key is set in the `.env` file or in your environment variables.

**2. Rate Limit Exceeded**

```
Warning: Rate limit exceeded for OpenAI, trying fallback service
```

Solution: Configure fallback services or reduce the number of pages being processed.

**3. Content Too Long**

```
Warning: Content too long for AI processing, truncating...
```

Solution: Increase `max_content_length` or split large pages into smaller ones.
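Truncation of over-long pages can be sketched as clipping at the configured limit, preferably at a paragraph boundary so the model is not handed a half-finished sentence. This is an assumption about how such truncation typically works, not the plugin's actual algorithm:

```python
def truncate_for_ai(content: str, max_content_length: int = 8000) -> str:
    """Clip content to max_content_length, cutting at the last paragraph
    break inside the limit when one exists."""
    if len(content) <= max_content_length:
        return content
    clipped = content[:max_content_length]
    cut = clipped.rfind("\n\n")  # prefer a clean paragraph boundary
    return clipped[:cut] if cut > 0 else clipped
```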
### Debug Mode

Enable debug logging:

```bash
export AI_SUMMARY_DEBUG=true
mkdocs build
```
## Contributing

We welcome contributions! Please see our Contributing Guide for details.

### Development Setup

```bash
git clone https://github.com/Wcowin/Mkdocs-AI-Summary-Plus.git
cd Mkdocs-AI-Summary-Plus
pip install -e ".[dev]"
```

### Running Tests

```bash
pytest
```

### Code Quality

```bash
black .
flake8 .
mypy .
```
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Changelog
See CHANGELOG.md for a list of changes and version history.
## Acknowledgments
- MkDocs - The static site generator this plugin extends
- MkDocs Material - The beautiful theme that inspired our design
- All the AI service providers for making this plugin possible
## File details

### Source distribution: mkdocs-ai-summary-wcowin-1.0.2.tar.gz

- Download URL: mkdocs-ai-summary-wcowin-1.0.2.tar.gz
- Upload date:
- Size: 21.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.6

| Algorithm | Hash digest |
|---|---|
| SHA256 | `73294c67e2b30356f0aa75156ab6781f83e33f533cc2ad6f62207e808db03e84` |
| MD5 | `97f9891328622a43d96457f2e951f75d` |
| BLAKE2b-256 | `948ba4f01cb380d1528531a409bd48bdffab942b29a8d3eee3df552011e37405` |
### Built distribution: mkdocs_ai_summary_wcowin-1.0.2-py3-none-any.whl

- Download URL: mkdocs_ai_summary_wcowin-1.0.2-py3-none-any.whl
- Upload date:
- Size: 19.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.6

| Algorithm | Hash digest |
|---|---|
| SHA256 | `3b43bc78b458b818d979ae43118a340574cbbfd8b34786417551eef08dc3d8d9` |
| MD5 | `3ccf6f726f14480b8f16b2cab463a3bd` |
| BLAKE2b-256 | `8f0be7f7e2520cfe144586fb2e033d82331a4eaafd39e95e72d7c5499bc05397` |