Automated documentation generator for dbt projects using Google Gemini AI


dbt-autodoc Documentation

dbt-autodoc automates documentation and logging for your dbt projects. It combines Google Gemini AI with a database-backed logging system to keep your documentation up to date, accurate, and auditable.

🌟 Why dbt-autodoc?

  • 🤖 Automatic AI Documentation: Generates comprehensive descriptions for your tables and columns.
  • 💾 Database Logging & History: Every description is stored in a database (duckdb or postgres). This acts as a "Source of Truth" and provides a full history of changes.
  • 🔄 Full Synchronization: Seamlessly integrates with dbt-osmosis to keep your YAML files in sync with your SQL models.
  • 🔒 Protect Manual Work: Respects human-written documentation. If you write it, we lock it.
  • 👥 Team Ready: Use Postgres to share documentation cache across your entire team.

🛠️ Setup

  1. Install:

    pip install dbt-autodoc
    
  2. Configuration: Run dbt-autodoc --help to generate dbt-autodoc.yml. Important: Edit company_context in this file to give the AI knowledge about your business logic.
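    A minimal sketch of `dbt-autodoc.yml` — only the `company_context` key is described in this README, and the value below is illustrative, not a real configuration:

    ```yaml
    # dbt-autodoc.yml — illustrative content; adjust to your business.
    company_context: >
      Acme Analytics is an e-commerce company. Staging models clean raw
      Shopify and Stripe exports; marts models serve finance dashboards.
    ```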

  3. Environment Variables:

    GEMINI_API_KEY=your_api_key_here
    POSTGRES_URL=postgresql://user:pass@host:port/db (optional)
    
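These variables can be exported in a POSIX shell before running the tool (the values below are placeholders, not real credentials):

```shell
# Required: API key for Google Gemini.
export GEMINI_API_KEY="your_api_key_here"

# Optional: Postgres connection string for a shared team cache.
export POSTGRES_URL="postgresql://user:pass@host:5432/db"
```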

📋 Recommended Workflow

For the best results, follow this step-by-step workflow to ensure accuracy and control:

  1. Preparation: Update your dbt project, generate the manifest, and context.

    dbt run && dbt docs generate
    # Edit dbt-autodoc.yml with company_context
    
  2. Sync Structure (No AI): Regenerate YAML files to match the SQL models. This ensures all new columns are present.

    dbt-autodoc --regenerate-yml
    
  3. Generate Model Descriptions (YAML): Generate AI descriptions for your models (tables/views).

    dbt-autodoc --generate-docs-model-ai --model-path models/staging
    
  4. Manual Review (Important): Open your YAML files. Review the structure and any existing descriptions. If you manually update a description here, it will be protected from AI overwrites in the next step.

  5. Generate Model Column Descriptions (YAML): Use AI to fill in the missing column descriptions.

    dbt-autodoc --generate-docs-model-columns-ai --model-path models/staging
    
  6. Propagate & Save: Run inheritance rules on the entire dbt project, then run the tool again to save the final state (including inherited descriptions) to the database.

    dbt-autodoc --regenerate-yml-with-inheritance
    dbt-autodoc --generate-docs-model-columns-ai --model-path models/staging
    
  7. Next Layer: Repeat steps 2-6 for models/intermediate, models/marts, etc.
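The per-layer loop in steps 2–6 can be sketched as a small shell function. The `document_layer` name is hypothetical; the flags are the ones listed in the Arguments Reference:

```shell
# Hypothetical helper: run steps 2-6 for one model directory.
document_layer() {
  dbt-autodoc --regenerate-yml                                      # step 2: sync structure, no AI
  dbt-autodoc --generate-docs-model-ai --model-path "$1"            # step 3: model descriptions
  # step 4: pause here and review the YAML files by hand
  dbt-autodoc --generate-docs-model-columns-ai --model-path "$1"    # step 5: column descriptions
  dbt-autodoc --regenerate-yml-with-inheritance                     # step 6: propagate inheritance
  dbt-autodoc --generate-docs-model-columns-ai --model-path "$1"    # step 6: save final state
}

# Step 7: call it once per layer, e.g.
# document_layer models/staging
# document_layer models/intermediate
# document_layer models/marts
```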

🚀 Quick Start (Automated)

If you trust the process and just want to run everything at once:

dbt-autodoc --generate-docs-ai

🧠 How the AI Works

When generating a description for a column or table, the AI considers multiple inputs to produce the most accurate result:

  1. Company Context: The high-level business logic defined in your config.
  2. Model SQL: The actual code of the model being documented.
  3. Existing Descriptions: Any existing documentation or comments in the file.
  4. Upstream Logic: (Implicitly via Osmosis inheritance) Context from upstream models.

It synthesizes all these inputs to write a concise, technical description.

📖 Arguments Reference

| Argument | Description |
| --- | --- |
| `--regenerate-yml` | Structure only. Regenerates YAML files from dbt models. Does not sync to the DB or call AI. |
| `--regenerate-yml-with-inheritance` | Structure + inheritance. Regenerates YAML files with inheritance enabled. Use this to propagate descriptions from upstream models. |
| `--model-path` | Restricts processing to a specific directory (e.g. `models/staging`). |
| `--generate-docs-model-ai` | Generates model descriptions in `.yml` files using AI. |
| `--generate-docs-model-columns-ai` | Generates column descriptions in `.yml` files using AI. |
| `--generate-docs-model` | Syncs model descriptions in `.yml` files from the cache (no AI). |
| `--generate-docs-model-columns` | Syncs column descriptions in `.yml` files from the cache (no AI). |
| `--generate-docs-ai` | 🔥 Full auto. Runs the complete workflow: model generation, YAML sync, and column generation using AI. |
| `--generate-docs` | 🔄 Full sync. Runs the complete workflow using only the database cache (no AI). |
| `--cleanup-db` | Reset database. Wipes the description cache and history. |
| `--concurrency` | Max threads for AI/DB requests (default: 10). |
| `--sort-yml` | Sorts keys in YAML files (`name`, `description`, `columns` for models; `name`, `description` for columns). |

📄 License

MIT License - see LICENSE for details.

🙏 Attribution

Brought to you by JustDataPlease.
