
Project description

Code documentation generation with LLMs


Focus on writing your code, let LLMs write the documentation for you.
With just a few keystrokes in your terminal by using OpenAI or 100% local LLMs without any data leaks.

Built with langchain, treesitter, llama.cpp and ollama

[Demo GIF: doc_comments_ai_demo]

✨ Features

  • 📝  Generate documentation comment blocks for all methods in a file
    • e.g. Javadoc, JSDoc, Docstring, Rustdoc etc.
  • ✍️   Generate inline documentation comments in method bodies
  • 🌳  Treesitter integration
  • 💻  Local LLM support
  • 🌐  Azure OpenAI support

[!NOTE]
Documentation will only be added to files without unstaged changes, so nothing is overwritten.
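
For example, based on the wording of this note, staging or committing pending changes in the target file first should be enough; the git commands below are shown purely as an illustration and are not part of doc-comments-ai:

git status <RELATIVE_FILE_PATH>   # check whether the file has unstaged changes
git add <RELATIVE_FILE_PATH>      # stage (or commit) pending changes before generating docs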

🚀 Usage

Create documentation for every method in the file specified by <RELATIVE_FILE_PATH> using the GPT-3.5-Turbo model:

aicomment <RELATIVE_FILE_PATH>

Also create inline documentation comments in the method bodies:

aicomment <RELATIVE_FILE_PATH> --inline

Guided mode: confirm documentation generation for each method:

aicomment <RELATIVE_FILE_PATH> --guided

Use GPT-4 model:

aicomment <RELATIVE_FILE_PATH> --gpt4

Use GPT-3.5-Turbo-16k model:

aicomment <RELATIVE_FILE_PATH> --gpt3_5-16k

Use Azure OpenAI:

aicomment <RELATIVE_FILE_PATH> --azure-deployment <DEPLOYMENT_NAME>

Use local Llama.cpp:

aicomment <RELATIVE_FILE_PATH> --local_model <MODEL_PATH>

Use local Ollama:

aicomment <RELATIVE_FILE_PATH> --ollama-model <OLLAMA_MODEL>
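
For example, assuming Ollama is already installed and running, you can pull a code model first and then pass its name to the tool (the model name below is only an example):

ollama pull codellama
aicomment <RELATIVE_FILE_PATH> --ollama-model codellama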

[!NOTE]
For how to download models from Hugging Face for local usage, see Local LLM usage.

[!NOTE]
If very extensive and descriptive documentation is needed, consider using GPT-4, GPT-3.5-Turbo-16k, or a comparably capable local model.

[!IMPORTANT]
The results from a local LLM depend heavily on the selected model. To get results comparable to GPT-3.5/GPT-4 you need to select very large models, which require powerful hardware.

📚 Supported Languages

  • Python
  • Typescript
  • Javascript
  • Java
  • Rust
  • Kotlin
  • Go
  • C++
  • C
  • C#
  • Haskell

📋 Requirements

  • Python >= 3.9

📦 Installation

Install with pipx:

pipx install doc-comments-ai
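
If pipx is not available on your system yet, it can be bootstrapped with pip first (standard pipx setup, not specific to this project):

python -m pip install --user pipx
python -m pipx ensurepath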

1. OpenAI usage

Create your personal OpenAI API key and add it as $OPENAI_API_KEY to your environment with:

export OPENAI_API_KEY=<YOUR_API_KEY>

2. Azure OpenAI usage

Add the following variables to your environment:

export AZURE_API_BASE="https://<your-endpoint>.openai.azure.com/"
export AZURE_API_KEY=<YOUR_AZURE_OPENAI_API_KEY>
export AZURE_API_VERSION="2023-05-15"

3. Local LLM usage with Llama.cpp

When using a local LLM, no API key is required. On first use of --local_model you will be asked to confirm the installation of llama-cpp-python and its dependencies. The installation process takes care of a hardware-accelerated build tailored to your hardware and OS. For further details see: installation-with-hardware-acceleration
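
As a rough illustration, a hardware-accelerated build of llama-cpp-python can also be requested manually via CMAKE_ARGS; the exact flags depend on your platform and llama-cpp-python version, so treat the following as examples rather than a definitive recipe:

CMAKE_ARGS="-DGGML_METAL=on" pip install llama-cpp-python    # Apple Silicon (Metal)
CMAKE_ARGS="-DGGML_CUDA=on" pip install llama-cpp-python     # NVIDIA GPU (CUDA)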

The most convenient way to download a model from Hugging Face for local usage is the huggingface-cli:

huggingface-cli download TheBloke/CodeLlama-13B-Python-GGUF codellama-13b-python.Q5_K_M.gguf

This downloads the codellama-13b-python.Q5_K_M model to ~/.cache/huggingface/. After the download has finished, the absolute path of the .gguf file is printed to the console and can be used as the value for --local_model.
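
The printed path can then be passed directly to --local_model. The path below only illustrates the typical Hugging Face cache layout; your actual path will differ:

aicomment <RELATIVE_FILE_PATH> --local_model ~/.cache/huggingface/hub/models--TheBloke--CodeLlama-13B-Python-GGUF/snapshots/<revision>/codellama-13b-python.Q5_K_M.gguf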

[!IMPORTANT]
Since llama.cpp is used, the model must be in the .gguf format.

✨ Contributing

If you are missing a feature or facing a bug, don't hesitate to open an issue or raise a PR. Any kind of contribution is highly appreciated!



Download files

Download the file for your platform.

Source Distribution

doc_comments_ai-0.1.15.tar.gz (13.2 kB)


Built Distribution

doc_comments_ai-0.1.15-py3-none-any.whl (19.3 kB)


File details

Details for the file doc_comments_ai-0.1.15.tar.gz.

File metadata

  • Download URL: doc_comments_ai-0.1.15.tar.gz
  • Upload date:
  • Size: 13.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.9.18 Linux/6.2.0-1018-azure

File hashes

Hashes for doc_comments_ai-0.1.15.tar.gz
  • SHA256: 2a22192fc0164f8ea1ab754170184f5eff3f0df3e8f5d80a7422a9d427b7524e
  • MD5: f0f16fc9625aa0ac8450c1b17a6613f6
  • BLAKE2b-256: 9bdb9ad974709d927abf68d6d716ca7cea74e5cc5854543fbcbbfa702b5ca0e1


File details

Details for the file doc_comments_ai-0.1.15-py3-none-any.whl.

File metadata

  • Download URL: doc_comments_ai-0.1.15-py3-none-any.whl
  • Upload date:
  • Size: 19.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.9.18 Linux/6.2.0-1018-azure

File hashes

Hashes for doc_comments_ai-0.1.15-py3-none-any.whl
  • SHA256: 9e5a4f4858f7db98dceca07aa5c56d06db73c7f48e0707ba60ac4947fc7d1c25
  • MD5: 242f5f1b60b8384d380013e7eed2f4a9
  • BLAKE2b-256: ea0301291c65cf6c47cf57c4253af6def7b6a6729ccd3ff97450ec61e41ecbcd

