
Code documentation generation with LLMs


Focus on writing your code, let LLMs write the documentation for you.
With just a few keystrokes in your terminal, using OpenAI or 100% local LLMs without any data leaks.

Built with LangChain, Tree-sitter, llama.cpp and Ollama

(Demo GIF: doc_comments_ai_demo)

✨ Features

  • 📝  Generate documentation comment blocks for all methods in a file
    • e.g. Javadoc, JSDoc, Docstring, Rustdoc etc.
  • ✍️   Generate inline documentation comments in method bodies
  • 🌳  Tree-sitter integration
  • 💻  Local LLM support
  • 🌐  Azure OpenAI support
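To illustrate the first feature: for a Python file, the generated comment blocks are standard docstrings. The function below is a hand-written example of the style of output, not actual tool output:

```python
def parse_version(version: str) -> tuple:
    """Parse a dotted version string into a tuple of integers.

    Args:
        version: A version string such as "0.1.16".

    Returns:
        A tuple of ints, e.g. (0, 1, 16).
    """
    return tuple(int(part) for part in version.split("."))
```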

[!NOTE]
Documentation will only be added to files without unstaged changes, so nothing is overwritten.
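The guard described in the note can be sketched with a `git diff` check. This is our own illustrative helper, not the tool's actual implementation:

```python
import subprocess


def has_unstaged_changes(repo_dir: str, file_path: str) -> bool:
    """Return True if the file has unstaged modifications in its repo."""
    result = subprocess.run(
        ["git", "diff", "--name-only", "--", file_path],
        cwd=repo_dir,
        capture_output=True,
        text=True,
        check=True,
    )
    # `git diff --name-only` lists modified-but-unstaged files; empty output
    # means the working-tree copy matches the index, so it is safe to edit.
    return bool(result.stdout.strip())
```

A tool using this check would skip any file for which it returns True.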

🚀 Usage

Create documentation for every method in a file specified by <RELATIVE_FILE_PATH> with the GPT-3.5-Turbo model:

aicomment <RELATIVE_FILE_PATH>

Also create inline documentation comments in the method bodies:

aicomment <RELATIVE_FILE_PATH> --inline

Guided mode, confirm documentation generation for each method:

aicomment <RELATIVE_FILE_PATH> --guided

Use GPT-4 model:

aicomment <RELATIVE_FILE_PATH> --gpt4

Use GPT-3.5-Turbo-16k model:

aicomment <RELATIVE_FILE_PATH> --gpt3_5-16k

Use Azure OpenAI:

aicomment <RELATIVE_FILE_PATH> --azure-deployment <DEPLOYMENT_NAME>

Use local Llama.cpp:

aicomment <RELATIVE_FILE_PATH> --local_model <MODEL_PATH>

Use local Ollama:

aicomment <RELATIVE_FILE_PATH> --ollama-model <OLLAMA_MODEL>

[!NOTE]
For instructions on downloading models from Hugging Face for local usage, see Local LLM usage.

[!NOTE]
If very extensive and descriptive documentation is needed, consider using GPT-4, GPT-3.5-Turbo-16k, or a comparable local model.

[!IMPORTANT]
The results of using a local LLM depend heavily on the selected model. To get results comparable to GPT-3.5/4, you need to select very large models, which require powerful hardware.

📚 Supported Languages

  • Python
  • TypeScript
  • JavaScript
  • Java
  • Rust
  • Kotlin
  • Go
  • C++
  • C
  • C#
  • Haskell

📋 Requirements

  • Python >= 3.9

📦 Installation

Install in an isolated environment with pipx:

pipx install doc-comments-ai

If you are facing issues using pipx, you can also install directly from PyPI with

pip install doc-comments-ai

However, it is recommended to use pipx instead to benefit from isolated environments for the dependencies.
For further help visit the Troubleshooting section.

1. OpenAI usage

Create your personal OpenAI API key and add it as $OPENAI_API_KEY to your environment with:

export OPENAI_API_KEY=<YOUR_API_KEY>

2. Azure OpenAI usage

Add the following variables to your environment:

export AZURE_API_BASE="https://<your-endpoint>.openai.azure.com/"
export AZURE_API_KEY=<YOUR_AZURE_OPENAI_API_KEY>
export AZURE_API_VERSION="2023-05-15"
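A tool reading this configuration would pick the three variables up from the environment. The helper below is a hypothetical sketch of such validation (ours, not part of doc-comments-ai); only the variable names come from the setup above:

```python
import os


def load_azure_config(env=None) -> dict:
    """Collect the Azure OpenAI settings, failing fast if any are missing."""
    env = os.environ if env is None else env
    required = ("AZURE_API_BASE", "AZURE_API_KEY", "AZURE_API_VERSION")
    missing = [name for name in required if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing Azure settings: {', '.join(missing)}")
    return {name: env[name] for name in required}
```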

3. Local LLM usage with Llama.cpp

When using a local LLM, no API key is required. On first usage of --local_model you will be asked for confirmation to install llama-cpp-python and its dependencies. The installation process takes care of a hardware-accelerated build tailored to your hardware and OS. For further details see: installation-with-hardware-acceleration

To download a model from Hugging Face for local usage, the most convenient way is the huggingface-cli:

huggingface-cli download TheBloke/CodeLlama-13B-Python-GGUF codellama-13b-python.Q5_K_M.gguf

This will download the codellama-13b-python.Q5_K_M model to ~/.cache/huggingface/. After the download has finished, the absolute path of the .gguf file is printed to the console, which can be used as the value for --local_model.
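Putting the two steps together, the printed path can be captured and passed straight to --local_model. This is a sketch that assumes huggingface-cli writes only the file path to stdout (progress output goes to stderr); the target source file name is a placeholder:

```shell
MODEL_PATH=$(huggingface-cli download TheBloke/CodeLlama-13B-Python-GGUF codellama-13b-python.Q5_K_M.gguf)
aicomment my_module.py --local_model "$MODEL_PATH"
```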

[!IMPORTANT]
Since llama.cpp is used the model must be in the .gguf format.
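A minimal guard for this constraint might look like the following (a hypothetical helper, not part of the tool):

```python
def validate_gguf_path(model_path: str) -> str:
    """Reject model files llama.cpp cannot load before attempting a run."""
    # llama.cpp only reads models in the GGUF format, so fail early otherwise.
    if not model_path.endswith(".gguf"):
        raise ValueError(f"llama.cpp requires a .gguf model file, got: {model_path}")
    return model_path
```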

🛟 Troubleshooting

  • During installation with pipx

    pip failed to build package: tiktoken
    
    Some possibly relevant errors from pip install:
      error: subprocess-exited-with-error
      error: can't find Rust compiler
    
    Make sure the Rust compiler is installed on your system; you can install it from here.

🌟 Contributing

If you are missing a feature or facing a bug, don't hesitate to open an issue or raise a PR. Any kind of contribution is highly appreciated!



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

doc_comments_ai-0.1.16.tar.gz (13.9 kB view details)

Uploaded Source

Built Distribution

doc_comments_ai-0.1.16-py3-none-any.whl (19.7 kB view details)

Uploaded Python 3

File details

Details for the file doc_comments_ai-0.1.16.tar.gz.

File metadata

  • Download URL: doc_comments_ai-0.1.16.tar.gz
  • Upload date:
  • Size: 13.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.9.18 Linux/6.2.0-1019-azure

File hashes

Hashes for doc_comments_ai-0.1.16.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 6cbf7e7d5afe90c80283af443224821819547756d5b708ade86923a4dadcae62 |
| MD5 | a900b87cd1cce349b97a726dd8f54f0d |
| BLAKE2b-256 | 83c49027910b531daa3f1ae417c8239b827603647df8fef2eb431160d3fab5ef |


File details

Details for the file doc_comments_ai-0.1.16-py3-none-any.whl.

File metadata

  • Download URL: doc_comments_ai-0.1.16-py3-none-any.whl
  • Upload date:
  • Size: 19.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.9.18 Linux/6.2.0-1019-azure

File hashes

Hashes for doc_comments_ai-0.1.16-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | cfe7581fdfb8a7bbb0515fd9b7ea76b3603d6b310aeb13c60b609eb942696a02 |
| MD5 | a7f1f4d155e4a9da5387880a8616ae1e |
| BLAKE2b-256 | ddd1a9764e1d3bbf1845c78ef94696ee3c0b67b066e27251fddff50aed58486a |

