
Automated README file generator, powered by AI.


Designed for simplicity, customization, and developer productivity.



🔗 Quick Links

  1. Overview
  2. Demo
  3. Features
  4. Getting Started
  5. Configuration
  6. Examples
  7. Contributing

[!IMPORTANT] ✨ Visit the Official Documentation for detailed guides and tutorials.


🔮 Overview

README-AI is a developer tool that automatically generates README markdown files using a robust repository processing engine and advanced language models. Simply provide a URL or path to your codebase, and a well-structured and detailed README will be generated.

Why README-AI?

This tool is designed to streamline the documentation process for developers, saving time and effort while ensuring high-quality README files. Key benefits include:

  • AI-Powered: Leverage language models for intelligent content generation.
  • Consistency: Ensure clean, standardized documentation across projects.
  • Customization: Tailor the output to fit your project's requirements.
  • Language Agnostic: Works with most programming languages/frameworks.
  • Save Time: Generate comprehensive READMEs in less than a minute.

👾 Demo

Running from the command line:

readmeai-cli-demo

Running directly in your browser:

readmeai-streamlit-demo


☄️ Features

  • 🚀 Automated Documentation: Generate comprehensive README files automatically from your codebase.
  • 🎨 Customizable Output: Tailor the styling, formatting, badges, header designs, and more.
  • 🌐 Language Agnostic: Compatible with a wide range of programming languages and project types.
  • 🤖 Multi-LLM Support: Currently supports OpenAI, Ollama, Anthropic, and Google Gemini.
  • 📑 Offline Mode: Create boilerplate README files offline, without any external API calls.
  • 📝 Best Practices: Ensures clean, professional documentation, adhering to markdown best practices.

Let's take a look at some possible customizations created by readme-ai:

  • custom-dragon-project-logo: `--image custom --badge-color FF4B4B --badge-style flat-square --header-style classic`
  • compact-readme-header: `--image cloud --header-style compact --toc-style fold`
  • svg-: `--badge-style for-the-badge --header-style svg`
  • readme-header-with-cloud-logo: `--align left --badge-style flat-square --image cloud`
  • readme-header-with-gradient-markdown-logo: `--align left --badge-style flat --image gradient`
  • custom-balloon-project-logo: `--badge-style flat --image custom`
  • readme-header-with-skill-icons-light: `--badge-style skills-light --image grey`
  • readme-header-with-blue-markdown-logo: `--badge-style flat-square`
  • readme-header-with-black-readme-logo: `--badge-style flat --image black`
  • custom-database-project-logo: `--image custom --badge-color 00ffe9 --badge-style flat-square --header-style classic`
  • llm-generated-project-logo: `--image llm --badge-style plastic --header-style classic`
  • readme-header-style-modern: `--image custom --badge-color BA0098 --badge-style flat-square --header-style modern --toc-style fold`
  • ascii-readme-header-style: `--header-style ascii`
  • ascii-box-readme-header-style: `--header-style ascii_box`

See the Configuration section for a complete list of CLI options.

Additional Generated Sections:

📍 Overview
Overview

◎ A high-level introduction to the project, focused on the value proposition and use cases rather than technical details.

readme-overview-section
✨ Features
Features Table

◎ Generated markdown table that highlights the key technical features and components of the codebase. This table is generated using a structured prompt template.

readme-features-section
📃 Codebase Documentation
Directory Tree

◎ The project's directory structure is generated using pure Python and embedded in the README. See readmeai.generators.tree for more details.

directory-tree
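The actual implementation lives in readmeai.generators.tree; purely as an illustration (this function and its formatting are a sketch, not readme-ai's API), a directory tree like the one embedded in the README can be rendered with pure Python:

```python
from pathlib import Path

def build_tree(root: Path, prefix: str = "") -> list[str]:
    """Recursively render a directory as tree-style lines (illustrative sketch)."""
    lines = []
    # Sort directories before files, then alphabetically, for a stable layout.
    entries = sorted(root.iterdir(), key=lambda p: (p.is_file(), p.name))
    for i, entry in enumerate(entries):
        last = i == len(entries) - 1
        connector = "└── " if last else "├── "
        lines.append(f"{prefix}{connector}{entry.name}")
        if entry.is_dir():
            # Continue the vertical guide only if more siblings follow.
            lines.extend(build_tree(entry, prefix + ("    " if last else "│   ")))
    return lines
```

The `--tree-depth` CLI option (see the Configuration section) caps how deep such a traversal goes; a depth parameter is omitted here for brevity.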
File Summaries

◎ Summarizes key modules of the project, which are also used as context for downstream prompts.

file-summaries
🚀 Quickstart Instructions
Getting Started Guides

◎ Prerequisites and system requirements are extracted from the codebase during preprocessing. The parsers currently handle most of this logic.

getting-started-section-prerequisites
Installation Guide

◎ Installation, Usage, and Testing guides are generated based on the project's dependency files and codebase configuration.

getting-started-section-usage-and-testing
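As a rough illustration of the idea above (not readme-ai's actual parser logic), detecting a project's dependency files might look like this, where the file-to-manager mapping is an assumed subset:

```python
from pathlib import Path

# Common dependency manifests across ecosystems (illustrative subset only).
DEPENDENCY_FILES = {
    "requirements.txt": "pip",
    "pyproject.toml": "poetry/pip",
    "package.json": "npm",
    "Cargo.toml": "cargo",
    "go.mod": "go modules",
}

def find_dependency_files(repo_root: Path) -> dict[str, str]:
    """Map each dependency manifest found at the repo root to its package manager."""
    return {
        name: manager
        for name, manager in DEPENDENCY_FILES.items()
        if (repo_root / name).is_file()
    }
```

A real implementation would also scan subdirectories and parse the files' contents to produce the install and usage commands shown in the generated guide.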
🔰 Contributing Guidelines
Contributing Guide

◎ Dropdown section that outlines the general process for contributing to your project.

◎ Provides links to your contributing guidelines, issues page, and more resources.

◎ Graph of contributors is also included.

contributing-guidelines-section
Additional Sections

Project Roadmap, Contributing Guidelines, License, and Acknowledgements are included by default.

footer-readme-section

🛸 Getting Started

System Requirements

  • Python: 3.9+
  • Package Manager/Container: pip, pipx, or docker.

Supported Sources

The following git hosting services are supported for source code retrieval, along with your local file system:

Supported LLM APIs

To enable the full functionality of readmeai, configure one of the following providers (most require an account and API key):

  • OpenAI: Recommended for general use. Requires an OpenAI account and API key.
  • Ollama: Free and open-source. No API key required.
  • Anthropic: Requires an Anthropic account and API key.
  • Google Gemini: Requires a Google Cloud account and API key.
  • Offline Mode: Generates a boilerplate README without making API calls.

For more information on setting up an API key, refer to the provider's documentation.
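The provider-specific environment variables used later in this guide (OPENAI_API_KEY, ANTHROPIC_API_KEY, GOOGLE_API_KEY, OLLAMA_HOST) can be checked up front. The helper below is a hypothetical convenience for your own scripts, not part of readme-ai:

```python
import os

# Environment variable each provider expects (per the CLI setup section of this guide).
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GOOGLE_API_KEY",
    "ollama": "OLLAMA_HOST",  # host address rather than an API key
}

def check_provider(provider: str) -> bool:
    """Return True if the credential/host variable for the given provider is set."""
    var = PROVIDER_ENV_VARS.get(provider)
    if var is None:
        raise ValueError(f"unknown provider: {provider}")
    return bool(os.environ.get(var))
```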

⚙️ Installation

Choose your preferred installation method:

 Pip

 pip install readmeai

 Pipx

 pipx install readmeai

[!TIP] Using pipx allows you to install and run Python command-line applications in isolated environments, which helps prevent dependency conflicts with other Python projects.

 Docker

Pull the latest Docker image from the Docker Hub repository.

 docker pull zeroxeli/readme-ai:latest

 From source

  1. Clone the repository:

     git clone https://github.com/eli64s/readme-ai
    
  2. Navigate to the readme-ai directory:

     cd readme-ai
    
  3. Install dependencies:

     pip install -r setup/requirements.txt
    

Alternatively, the project can be set up using the bash script below:

 Bash

  1. Run the setup script:

     bash setup/setup.sh
    

Or, use poetry to build the project:

 Poetry

  1. Install dependencies using Poetry:

     poetry install
    

[!IMPORTANT] To use the Anthropic and Google Gemini clients, additional dependencies are required. See the following installation commands:

  • Anthropic:
     pip install "readmeai[anthropic]"
    
  • Google Gemini:
     pip install "readmeai[google-generativeai]"
    

🤖 Running the CLI

1. Set Up Environment Variables

With OpenAI:

 export OPENAI_API_KEY=<your_api_key>

# Or for Windows users: set OPENAI_API_KEY=<your_api_key>
Additional Providers (Ollama, Anthropic, Google Gemini)
Ollama

Refer to the Ollama documentation for more information on setting up the Ollama API. Here is a basic example:

  1. Pull your model of choice from the Ollama repository:

     ollama pull mistral:latest
    
  2. Start the Ollama server and set the OLLAMA_HOST environment variable:

     export OLLAMA_HOST=127.0.0.1 && ollama serve
    
Anthropic
  1. Export your Anthropic API key:

     export ANTHROPIC_API_KEY=<your_api_key>
    
Google Gemini
  1. Export your Google Gemini API key:

     export GOOGLE_API_KEY=<your_api_key>
    

2. Generate a README

Run the following command, replacing the repository URL with your own:

 readmeai --repository https://github.com/eli64s/readme-ai --api openai

[!IMPORTANT] By default, the gpt-3.5-turbo model is used. Higher costs may be incurred when using more advanced models.

Run with Ollama and set llama3 as the model:

 readmeai --api ollama --model llama3 --repository https://github.com/eli64s/readme-ai

Run with Anthropic:

 readmeai --api anthropic -m claude-3-5-sonnet-20240620 -r https://github.com/eli64s/readme-ai

Run with Google Gemini:

 readmeai --api gemini -m gemini-1.5-flash -r https://github.com/eli64s/readme-ai

Use a local directory path:

readmeai --repository /path/to/your/project

Add more customization options:

 readmeai --repository https://github.com/eli64s/readme-ai \
           --output readmeai.md \
           --api openai \
           --model gpt-4 \
           --badge-color A931EC \
           --badge-style flat-square \
           --header-style compact \
           --toc-style fold \
           --temperature 0.9 \
           --tree-depth 2 \
           --image LLM \
           --emojis

 Docker

Run the Docker container with the OpenAI client:

 docker run -it --rm \
	-e OPENAI_API_KEY=$OPENAI_API_KEY \
	-v "$(pwd)":/app zeroxeli/readme-ai:latest \
	-r https://github.com/eli64s/readme-ai \
	--api openai

 From source


 Bash

If you installed the project from source with the bash script, run the following command:

  1. Activate the virtual environment:

     conda activate readmeai
    
  2. Run the CLI:

     python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
    

 Poetry

  1. Activate the virtual environment:

     poetry shell
    
  2. Run the CLI:

     poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai
    

 Streamlit

Try readme-ai directly in your browser; no installation required. See the readme-ai-streamlit repository for more details.


🧪 Testing

The pytest and nox frameworks are used for development and testing.

Install the dependencies using Poetry:

 poetry install --with dev,test

Run the unit test suite using Pytest:

 make test

Run the test suite against Python 3.9, 3.10, 3.11, and 3.12 using Nox:

 make test-nox

[!TIP] Nox automates testing across multiple Python environments, ensuring compatibility across the supported Python versions.
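The multi-version run above can be driven by a noxfile; the sketch below is a minimal example (the session name and install arguments are assumptions, not readme-ai's actual noxfile):

```python
# noxfile.py -- minimal sketch for running pytest on each supported interpreter.
import nox

@nox.session(python=["3.9", "3.10", "3.11", "3.12"])
def tests(session):
    """Install the package plus pytest, then run the unit test suite."""
    session.install(".", "pytest")
    session.run("pytest")
```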


🔡 Configuration

Customize your README generation using these CLI options:

| Option | Description | Default |
|------------------|-----------------------------------------------|-----------------|
| `--align` | Text alignment in header | `center` |
| `--api` | LLM API service provider | `offline` |
| `--badge-color` | Badge color name or hex code | `0080ff` |
| `--badge-style` | Badge icon style type | `flat` |
| `--header-style` | Header template style | `classic` |
| `--toc-style` | Table of contents style | `bullet` |
| `--emojis` | Adds emojis to the README header sections | `False` |
| `--image` | Project logo image | `blue` |
| `--model` | Specific LLM model to use | `gpt-3.5-turbo` |
| `--output` | Output filename | `readme-ai.md` |
| `--repository` | Repository URL or local directory path | `None` |
| `--temperature` | Creativity level for content generation | `0.1` |
| `--tree-depth` | Maximum depth of the directory tree structure | `2` |

For a full list of options, run:

readmeai --help

Visit the Official Documentation for more detailed information on configuration options, examples, and best practices.


🎨 Examples

| Language/Framework | Output File | Input Repository | Description |
|---------------------|-------------------------|---------------------|---------------------------------|
| Python | readme-python.md | readme-ai | Core readme-ai project |
| TypeScript & React | readme-typescript.md | ChatGPT App | React Native ChatGPT app |
| PostgreSQL & DuckDB | readme-postgres.md | Buenavista | Postgres proxy server |
| Kotlin & Android | readme-kotlin.md | file.io Client | Android file sharing app |
| Streamlit | readme-streamlit.md | readme-ai-streamlit | Streamlit UI for readme-ai app |
| Rust & C | readme-rust-c.md | CallMon | System call monitoring tool |
| Docker & Go | readme-go.md | docker-gs-ping | Dockerized Go app |
| Java | readme-java.md | Minimal-Todo | Minimalist todo Java app |
| FastAPI & Redis | readme-fastapi-redis.md | async-ml-inference | Async ML inference service |
| Jupyter Notebook | readme-mlops.md | mlops-course | MLOps course repository |
| Apache Flink | readme-local.md | Local Directory | Example using a local directory |

Find more examples here.


🏎💨 Roadmap

  • Release readmeai 1.0.0 with enhanced documentation management features.
  • Develop a VS Code extension to generate README files directly in the editor.
  • Develop GitHub Actions to automate documentation updates.
  • Add badge packs to provide additional badge styles and options.
    • Code coverage, CI/CD status, project version, and more.

🔰 Contributing

Contributions are welcome! Please read the Contributing Guide to get started.



🎗 License

README-AI is released under the terms of the MIT License.


🙌 Acknowledgments
