
Automated README file generator, powered by AI.

Project description

README-AI, Your AI-Powered Documentation Assistant

Designed for simplicity, customization, and developer productivity.



🔗 Quick Links

  1. ⚡️ Introduction
  2. 👾 Demo
  3. ☄️ Features
  4. 🛸 Quickstart
  5. 🔡 Configuration
  6. 🤖 Examples
  7. 🔰 Contributing

[!IMPORTANT] ✨ See the Official Documentation for more details.


⚡️ Introduction

Objective

README-AI is a developer tool for automatically generating README markdown files using a robust repository processor engine and generative AI. Simply provide a repository URL or local path to your codebase, and a well-structured and detailed README file will be generated for you.

Motivation

This project aims to streamline the documentation process for developers, ensuring projects are properly documented and easy to understand. Whether you're working on an open-source project, enterprise software, or a personal project, README-AI is here to help you create high-quality documentation quickly and efficiently.


👾 Demo

Running from the command line:

readmeai-cli-demo

Running directly in your browser:

readmeai-streamlit-demo


☄️ Features

  • Automated Documentation: Synchronizes data from third-party sources and generates documentation automatically.
  • Customizable Output: Dozens of options for styling/formatting, badges, header designs, and more.
  • Language Agnostic: Works across a wide range of programming languages and project types.
  • Multi-LLM Support: Compatible with OpenAI, Ollama, Anthropic, Google Gemini, and Offline Mode.
  • Offline Mode: Generate a boilerplate README without calling an external API.
  • Markdown Best Practices: Leverage best practices in Markdown formatting for clean, professional-looking docs.

A few combinations of README styles and configurations:

  • custom-project-logo: --image custom --badge-color FF4B4B --badge-style flat-square --header-style classic
  • --image cloud --header-style compact --toc-style fold
  • cloud-db-logo: --align left --badge-style flat-square --image cloud
  • gradient-markdown-logo: --align left --badge-style flat --image gradient
  • custom-logo: --badge-style flat --image custom
  • skills-light: --badge-style skills-light --image grey
  • readme-ai-header: --badge-style flat-square
  • black-logo: --badge-style flat --image black
  • default-header: --image custom --badge-color 00ffe9 --badge-style flat-square --header-style classic
  • default-header: --image llm --badge-style plastic --header-style classic
  • default-header: --image custom --badge-color BA0098 --badge-style flat-square --header-style modern --toc-style fold

See the Configuration section for a complete list of CLI options.

📍 Overview
Overview

◎ High-level introduction of the project, focused on the value proposition and use-cases, rather than technical aspects.

llm-overview
✨ Features
Features Table

◎ Generated markdown table that highlights the key technical features and components of the codebase. This table is generated using a structured prompt template.

llm-features
📃 Codebase Documentation
Directory Tree

◎ The project's directory structure is generated using pure Python and embedded in the README. See readmeai.generators.tree for more details.

directory-tree
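
The tree generation described above can be sketched in a few lines of pure Python. This is an illustrative sketch only, not the actual readmeai.generators.tree implementation; the function name and output format are assumptions for the example.

```python
from pathlib import Path

def build_tree(root: Path, depth: int = 2, prefix: str = "") -> str:
    """Render a directory tree down to a maximum depth (illustrative sketch)."""
    if depth == 0:
        return ""
    lines = []
    # List directories before files, then alphabetically, for a stable layout.
    entries = sorted(root.iterdir(), key=lambda p: (p.is_file(), p.name))
    for i, entry in enumerate(entries):
        last = i == len(entries) - 1
        connector = "└── " if last else "├── "
        lines.append(f"{prefix}{connector}{entry.name}")
        if entry.is_dir():
            extension = "    " if last else "│   "
            subtree = build_tree(entry, depth - 1, prefix + extension)
            if subtree:
                lines.append(subtree)
    return "\n".join(lines)
```

The depth parameter plays the same role as the --tree-depth CLI option described later in this page.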
File Summaries

◎ Summarizes key modules of the project, which are also used as context for downstream prompts.

file-summaries
🚀 Quickstart Instructions
Getting Started Guides

◎ Prerequisites and system requirements are extracted from the codebase during preprocessing. The parsers currently handle the majority of this logic.

prerequisites
Installation Guide

◎ Installation, Usage, and Testing guides are generated based on the project's dependency files and codebase configuration.

installation
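
To illustrate the kind of dependency-file processing described above, here is a minimal sketch of extracting package names from requirements.txt content. The function is hypothetical; readme-ai's actual parsers handle many more formats (poetry, conda, package.json, and so on).

```python
import re

def parse_requirements(text: str) -> list[str]:
    """Extract bare package names from requirements.txt content (sketch only)."""
    packages = []
    for line in text.splitlines():
        # Drop inline comments and surrounding whitespace.
        line = line.split("#", 1)[0].strip()
        # Skip blank lines and pip flags such as -r or -e.
        if not line or line.startswith("-"):
            continue
        match = re.match(r"[A-Za-z0-9][A-Za-z0-9._-]*", line)
        if match:
            packages.append(match.group(0))
    return packages
```

A parser like this yields the dependency list that the Installation and Usage sections can then be generated from.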
🔰 Contributing Guidelines
Contributing Guide

◎ Dropdown section that outlines the general process for contributing to your project.

◎ Provides links to your contributing guidelines, issues page, and more resources.

◎ Graph of contributors is also included.

contributing-guidelines
Additional Sections

◎ Project Roadmap, Contributing Guidelines, License, and Acknowledgements are included by default.

footer

🛸 Getting Started

System Requirements:

  • Python 3.9+
  • Package Manager/Container: pip, pipx, docker
  • LLM API Service: OpenAI, Ollama, Anthropic, Google Gemini, Offline Mode

Repository URL or Path:

Make sure to have a repository URL or local directory path ready for the CLI.

LLM API Service:

  • OpenAI: Recommended, requires an account setup and API key.
  • Ollama: Free and open-source, potentially slower and more resource-intensive.
  • Anthropic: Requires an Anthropic account and API key.
  • Google Gemini: Requires a Google Cloud account and API key.
  • Offline Mode: Generates a boilerplate README without making API calls.

🔩 Installation

Install readme-ai using your preferred package manager, container, or directly from the source.

Using pip

pip

 pip install readmeai

Using pipx

pipx

 pipx install readmeai

[!TIP]

Use pipx to install and run Python command-line applications without causing dependency conflicts with other packages!

Using docker

Pull the latest Docker image from the Docker Hub repository.

docker

 docker pull zeroxeli/readme-ai:latest

From source

Build readme-ai

Using bash

bash

 bash setup/setup.sh

Using poetry

Poetry

  1. Clone the repository:
 git clone https://github.com/eli64s/readme-ai
  2. Navigate to the readme-ai directory:
 cd readme-ai
  3. Install dependencies using poetry:
 poetry install
  4. Enter the poetry shell environment:
 poetry shell

Installing Optional Dependencies

To use the Anthropic and Google Gemini clients, install the optional dependencies.

Anthropic:

 pip install readmeai[anthropic]

Google Gemini:

 pip install readmeai[gemini]

⚙️ Usage

Environment Variables

OpenAI

Generate an OpenAI API key and set it as the OPENAI_API_KEY environment variable.

 # Using Linux or macOS
 export OPENAI_API_KEY=<your_api_key>

 # Using Windows
 set OPENAI_API_KEY=<your_api_key>

Ollama

Pull your model of choice from the Ollama repository:

 ollama pull mistral:latest

Start the Ollama server:

 export OLLAMA_HOST=127.0.0.1 && ollama serve

See all available models from Ollama here.

Anthropic

Generate an Anthropic API key and set the following environment variable:

 export ANTHROPIC_API_KEY=<your_api_key>

Google Gemini

Generate a Google API key and set the following environment variable:

 export GOOGLE_API_KEY=<your_api_key>
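
Each provider above reads its key from an environment variable. A minimal sketch of how a client might resolve the right variable and fail fast when it is unset (the helper and mapping names are hypothetical, not readme-ai's API):

```python
import os

# Provider name -> environment variable, matching the docs above.
PROVIDER_ENV_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GOOGLE_API_KEY",
}

def resolve_api_key(provider: str) -> str:
    """Return the API key for a provider, raising if it is not exported."""
    env_var = PROVIDER_ENV_KEYS[provider]
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; export it before running the CLI.")
    return key
```

Failing fast like this surfaces a missing key before any repository processing begins.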

Running the CLI

Using pip

pip

With OpenAI:

 readmeai --api openai --repository https://github.com/eli64s/readme-ai

[!IMPORTANT] By default, the gpt-3.5-turbo model is used. Higher costs may be incurred when using more advanced models.

With Ollama:

 readmeai --api ollama --model llama3 --repository https://github.com/eli64s/readme-ai

With Anthropic:

 readmeai --api anthropic -m claude-3-5-sonnet-20240620 -r https://github.com/eli64s/readme-ai

With Gemini:

 readmeai --api gemini -m gemini-1.5-flash -r https://github.com/eli64s/readme-ai

Adding more customization options:

 readmeai --repository https://github.com/eli64s/readme-ai \
           --output readmeai.md \
           --api openai \
           --model gpt-4 \
           --badge-color A931EC \
           --badge-style flat-square \
           --header-style compact \
           --toc-style fold \
           --temperature 0.9 \
           --tree-depth 2 \
           --image LLM \
           --emojis

Using docker

Running the Docker container with the OpenAI API:

docker

 docker run -it \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
-r https://github.com/eli64s/readme-ai

Using streamlit

Try readme-ai directly in your browser, no installation required. See the readme-ai-streamlit repository for more details.

From source

Using readme-ai

Using bash

bash

   conda activate readmeai
   python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai

Using poetry

Poetry

   poetry shell
   poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai

🧪 Testing

The pytest framework and nox automation tool are used for testing the application.

Using pytest

pytest

 make test

Using nox

 make test-nox

[!TIP] Use nox to test the application against multiple Python environments and dependencies!


🔡 Configuration

Customize your README generation using these CLI options:

| Option | Description | Default |
|---|---|---|
| --align | Text alignment in header | center |
| --api | LLM API service provider | offline |
| --badge-color | Badge color name or hex code | 0080ff |
| --badge-style | Badge icon style type | flat |
| --base-url | Base URL for the LLM API | v1/chat/completions |
| --context-window | Maximum context window of the LLM API | 3900 |
| --emojis | Adds emojis to the README header sections | False |
| --header-style | Header template style | classic |
| --image | Project logo image | blue |
| --model | Specific LLM model to use | gpt-3.5-turbo |
| --output | Output filename | readme-ai.md |
| --rate-limit | Maximum API requests per minute | 10 |
| --repository | Repository URL or local directory path | None |
| --temperature | Creativity level for content generation | 0.1 |
| --toc-style | Table of contents template style | bullet |
| --top-p | Probability of the top-p sampling method | 0.9 |
| --tree-depth | Maximum depth of the directory tree structure | 2 |

[!TIP] For a full list of options, run readmeai --help in your terminal.
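
The defaults in the table above can be mirrored with a stdlib argparse parser. This is purely illustrative of how the options and defaults fit together; it is not how the real readmeai CLI is implemented.

```python
import argparse

# A few of the documented options, with the defaults from the table above.
parser = argparse.ArgumentParser(prog="readmeai")
parser.add_argument("--api", default="offline")
parser.add_argument("--badge-style", default="flat")
parser.add_argument("--model", default="gpt-3.5-turbo")
parser.add_argument("--temperature", type=float, default=0.1)
parser.add_argument("--tree-depth", type=int, default=2)
parser.add_argument("--repository", default=None)

# With no flags given, every option falls back to its documented default.
args = parser.parse_args([])
```

Note that with no --api flag the tool defaults to offline mode, so a first run needs no API key at all.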

🎨 Customization

To see the full list of customization options, check out the Configuration section in the official documentation. This section provides a detailed overview of all available CLI options and how to use them, including badge styles, header templates, and more.


🤖 Examples

| Language/Framework | Output File | Input Repository | Description |
|---|---|---|---|
| Python | readme-python.md | readme-ai | Core readme-ai project |
| TypeScript & React | readme-typescript.md | ChatGPT App | React Native ChatGPT app |
| PostgreSQL & DuckDB | readme-postgres.md | Buenavista | Postgres proxy server |
| Kotlin & Android | readme-kotlin.md | file.io Client | Android file sharing app |
| Streamlit | readme-streamlit.md | readme-ai-streamlit | Streamlit UI for readme-ai app |
| Rust & C | readme-rust-c.md | CallMon | System call monitoring tool |
| Docker & Go | readme-go.md | docker-gs-ping | Dockerized Go app |
| Java | readme-java.md | Minimal-Todo | Minimalist todo Java app |
| FastAPI & Redis | readme-fastapi-redis.md | async-ml-inference | Async ML inference service |
| Jupyter Notebook | readme-mlops.md | mlops-course | MLOps course repository |
| Apache Flink | readme-local.md | Local Directory | Example using a local directory |

See additional README files generated by readme-ai here.


🏎💨 Project Roadmap

  • Release readmeai 1.0.0 with enhanced documentation management features.
  • Develop a VS Code extension to generate README files directly in the editor.
  • Develop GitHub Actions to automate documentation updates.
  • Add badge packs to provide additional badge styles and options.
    • Code coverage, CI/CD status, project version, and more.

🔰 Contributing

Contributions are welcome and encouraged! If interested, please begin by reviewing the resources below:

  • 💡 Contributing Guide: Learn about our contribution process, coding standards, and how to submit your ideas.
  • 💬 Start a Discussion: Have questions or suggestions? Join our community discussions to share your thoughts and engage with others.
  • 🐛 Report an Issue: Found a bug or have a feature request? Let us know by opening an issue so we can address it promptly.


📒 Changelog

Changelog


🎗 License

MIT License


🙌 Acknowledgments

⬆️ Top


Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

readmeai-0.5.90.tar.gz (148.2 kB)


Built Distribution

readmeai-0.5.90-py3-none-any.whl (166.2 kB)


File details

Details for the file readmeai-0.5.90.tar.gz.

File metadata

  • Download URL: readmeai-0.5.90.tar.gz
  • Upload date:
  • Size: 148.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.6

File hashes

Hashes for readmeai-0.5.90.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 78f8f9f85356c4fb9e04f6627ff52f62b3c3e7270c5817e6cf46adb8fdd6b10d |
| MD5 | 3c6a1e94c0c538b46161c8a45eaacdd0 |
| BLAKE2b-256 | d978996ec20470d258b6a608b4743fbf61926e4039e566e1c2de7b6451297dfb |

See more details on using hashes here.
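
A downloaded distribution can be checked against the published SHA256 digest above using only the standard library. The helper name is illustrative:

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading in streaming chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the published digest for the file you downloaded, e.g.:
# expected = "78f8f9f85356c4fb9e04f6627ff52f62b3c3e7270c5817e6cf46adb8fdd6b10d"
# assert sha256_of_file("readmeai-0.5.90.tar.gz") == expected
```

Reading in chunks keeps memory use constant regardless of archive size.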

File details

Details for the file readmeai-0.5.90-py3-none-any.whl.

File metadata

  • Download URL: readmeai-0.5.90-py3-none-any.whl
  • Upload date:
  • Size: 166.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.6

File hashes

Hashes for readmeai-0.5.90-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 62c9b0a18a05ba0d2ff3d818de7b8d234d0224bacb7f14b7b6a5e2bcb0dfa9b4 |
| MD5 | 824bff002993dcfb2e2227fff8e403ea |
| BLAKE2b-256 | 4f4780c72dcf4d43d713bf459c51d82b5cbb50868fca52cfcaf4d573bd43552f |

See more details on using hashes here.
