
🚀 Generate beautiful README.md files from the terminal. Powered by OpenAI's GPT LLMs 💫


Translations: zh-CN · fr


README-AI

◦ Generate beautiful and informative README files

◦ Developed with OpenAI's GPT language model APIs





📍 Overview

README-AI is a powerful command-line tool that generates robust README.md files for your software and data projects. By simply providing a remote repository URL or a path to your codebase, this tool auto-generates documentation for your entire project, leveraging the capabilities of OpenAI's GPT language model APIs.
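In practice, a single command pointed at a repository is all that's needed (a minimal sketch; installation and the full list of options are covered in the sections below):

# Generate a README for a remote repository using your OpenAI API key.
readmeai --api-key "YOUR_API_KEY" --repository https://github.com/eli64s/readme-ai --output readme-ai.md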

🎯 Motivation

README-AI simplifies the process of writing and maintaining high-quality project documentation, enhancing developer productivity and workflow. The ultimate goal of readme-ai is to improve the adoption and usability of open-source software, enabling users of all skill levels to better understand complex codebases and easily use open-source tools.

⚠️ Disclaimer

This project is currently under development and has an opinionated configuration. While readme-ai provides an excellent starting point for documentation, it's important to review all text generated by the OpenAI API to ensure it accurately represents your codebase.


🎈 Demos

Command-Line Interface

‣ Run readme-ai in your terminal via PyPI, Docker, and more!

cli-demo


Streamlit Community Cloud

‣ Use readme-ai directly in your browser! Zero installation, zero code!

streamlit-demo


🤖 Features


▶ ❶ Project Badges

Project Slogan and Badges

‣ A slogan to highlight your project is generated by prompting OpenAI's GPT engine.

‣ Codebase dependencies and metadata are visualized using Shields.io badges.

badges

▶ ❷ Codebase Documentation

Directory Tree and File Summaries

‣ Your project's directory structure is visualized using a custom tree function.

‣ Each file in the codebase is summarized by OpenAI's GPT model.

repository-tree code-summaries

▶ ❸ Overview and Features Table

Prompted Text Generation

‣ An overview paragraph and features table are generated using detailed prompts, embedded with project metadata.

feature-table

▶ ❹ Dynamic Usage Instructions

Installation, Running, and Testing

‣ Generates instructions for installing, running, and testing your project. Instructions are created by identifying the codebase's top language and referring to our language_setup.toml configuration file.

usage-instructions

▶ ❺ Contributing Guidelines and more!

roadmap
license

▶ ❻ Custom Templates - coming soon
  • Developing a CLI option that lets users select from a variety of README styles
  • Templates for use-cases such as data, machine learning, web development, and more!

▶ ❼ Example README Files
| | Output File | Repository | Languages |
|---|---|---|---|
| 1️⃣ | readme-python.md | readme-ai | Python |
| 2️⃣ | readme-typescript.md | chatgpt-app-react-typescript | TypeScript, React |
| 3️⃣ | readme-javascript.md | assistant-chat-gpt-javascript | JavaScript, React |
| 4️⃣ | readme-kotlin.md | file.io-android-client | Kotlin, Java, Android |
| 5️⃣ | readme-rust-c.md | rust-c-app | C, Rust |
| 6️⃣ | readme-go.md | go-docker-app | Go |
| 7️⃣ | readme-java.md | java-minimal-todo | Java |
| 8️⃣ | readme-fastapi-redis.md | async-ml-inference | Python, FastAPI, Redis |
| 9️⃣ | readme-mlops.md | mlops-course | Python, Jupyter |
| 🔟 | readme-pyflink.md | flink-flow | PyFlink |

🔝 Return


👩‍💻 Usage

Dependencies

Please ensure you have the following dependencies installed on your system:

  • Python version 3.9 or higher
  • Package manager (e.g., pip, conda, poetry) or Docker
  • OpenAI API paid account and API key
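A quick way to verify the tooling is in place before proceeding (a minimal check; substitute the package manager you actually use):

# Confirm Python 3.9+ and your package manager of choice are available.
python --version     # should report 3.9 or higher
pip --version        # or: conda --version / poetry --version / docker --version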

Repository

A remote repository URL or path to your local project's directory is needed to use readme-ai. The following repository types are currently supported:

  • GitHub
  • GitLab
  • File System
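For example, any of the following inputs can be passed to the repository flag (the GitLab URL and local path below are hypothetical placeholders):

readmeai -r https://github.com/eli64s/readme-ai      # GitHub repository URL
readmeai -r https://gitlab.com/your-user/your-repo   # GitLab repository URL (placeholder)
readmeai -r /path/to/your/local/project              # local file system path (placeholder)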

OpenAI API

An OpenAI API account and API key are needed to use readme-ai. The steps below outline this process:

🔐 OpenAI API - Setup Instructions
  1. Go to the OpenAI website.
  2. Click the "Sign up for free" button.
  3. Fill out the registration form with your information and agree to the terms of service.
  4. Once logged in, click on the "API" tab.
  5. Follow the instructions to create a new API key.
  6. Copy the API key and keep it in a secure place.
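Once you have a key, a common pattern is to export it as an environment variable so it doesn't need to be passed on every run (a minimal sketch; the same OPENAI_API_KEY variable is used in the examples later in this guide):

# Make the key available to readme-ai for the current shell session.
export OPENAI_API_KEY="YOUR_API_KEY"

# Optionally persist it across sessions (bash shown here; adapt for your shell).
echo 'export OPENAI_API_KEY="YOUR_API_KEY"' >> ~/.bashrc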
⚠️ OpenAI API - Cautionary Guidelines
  1. Review Sensitive Information: Before running the application, ensure that all content in your repository is free of sensitive information. Please note that readme-ai does not filter out sensitive data from the README file, and it does not modify any files in your repository.

  2. API Usage Costs: The OpenAI API is not free, and you will be charged for each request made. Costs can accumulate rapidly, so it's essential to be aware of your usage. You can monitor your API usage and associated costs by visiting the OpenAI API Usage Dashboard.

  3. Paid Account Recommended: Setting up a paid account with OpenAI is highly recommended to avoid potential issues. Without a payment method on file, your API usage will be restricted to base GPT-3 models. This limitation can result in less accurate README file generation and may lead to API errors due to request limits.

  4. Runtime Considerations: README file generation typically takes less than a minute. If the process exceeds a few minutes (e.g., 3 minutes), it's advisable to terminate readme-ai to prevent extended processing times.
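On Linux and macOS, one way to enforce such a cap is GNU coreutils' timeout command (a sketch, not a readme-ai feature; the 180-second limit is an arbitrary example):

# Abort the run automatically if it exceeds three minutes.
timeout 180 readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai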


🛠 Installation

Using Pip

Pip is the recommended installation method for most users.

pip install --upgrade readmeai
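To confirm the package installed correctly, you can inspect it with pip (a minimal check):

pip show readmeai   # prints the installed version and location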

Using Docker

Docker is recommended for users wanting to run the application in a containerized environment.

docker pull zeroxeli/readme-ai:latest
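To confirm the image was pulled, list it locally (a minimal check):

docker images zeroxeli/readme-ai   # should show the latest tag you just pulled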

Manually Install

1️⃣ Clone the readme-ai repository.

git clone https://github.com/eli64s/readme-ai

2️⃣ Navigate to readme-ai directory.

cd readme-ai

3️⃣ Install dependencies using a method below.

Using Bash

bash setup/setup.sh

Using Conda

conda create -n readmeai python=3.9 -y && \
conda activate readmeai && \
pip install -r requirements.txt

Using Poetry

poetry install

⚙️ Configuration


Command-Line Arguments

To generate a README.md file, use the readmeai command in your terminal, along with the arguments below.

| Short Flag | Long Flag | Description | Status |
|---|---|---|---|
| -k | --api-key | Your OpenAI API secret key. | Optional |
| -c | --encoding | The encoding that specifies how text is converted into tokens. | Optional |
| -e | --engine | The OpenAI GPT language model engine (gpt-3.5-turbo). | Optional |
| -f | --offline-mode | Run offline without calling the OpenAI API. | Optional |
| -o | --output | The output path for your README.md file. | Optional |
| -r | --repository | The URL or path to your code repository. | Required |
| -t | --temperature | The temperature (randomness) of the model. | Optional |
| -l | --language | The language in which the README file is written. | Coming Soon! |
| -s | --style | The README template format to use. | Coming Soon! |
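For instance, several of these flags can be combined in a single invocation (a sketch; the engine and temperature values shown are arbitrary examples):

readmeai \
--api-key "YOUR_API_KEY" \
--engine gpt-3.5-turbo \
--temperature 0.5 \
--output readme-ai.md \
--repository https://github.com/eli64s/readme-ai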

Custom Settings

To customize the README file generation process, you can modify the following sections of the configuration file:

  • api - OpenAI language model API configuration settings.
  • git - Default git repository settings used if no repository is provided.
  • paths - Directory paths and files used by the readme-ai application.
  • prompts - Large language model prompts used to generate the README file.
  • md - Dynamic Markdown section code templates used to build the README file.

🚀 Running README-AI


Using Streamlit

Use the app directly in your browser via Streamlit Community Cloud.


Using Pip

# Option 1: Run readmeai command with all required command-line arguments.
readmeai --api-key "YOUR_API_KEY" --output readme-ai.md --repository https://github.com/eli64s/readme-ai
# Option 2: Run readmeai command with OpenAI API key set as environment variable.
export OPENAI_API_KEY="YOUR_API_KEY"
readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai

Using Docker

# Option 1: Run Docker container with all required command-line arguments.
docker run -it \
-e OPENAI_API_KEY="YOUR_API_KEY" \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai
# Option 2: Run Docker container with OpenAI API key set as environment variable.
export OPENAI_API_KEY="YOUR_API_KEY"
docker run -it \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
readmeai -o readme-ai.md -r https://github.com/eli64s/readme-ai

Manually Run

Using Conda

conda activate readmeai
export OPENAI_API_KEY="YOUR_API_KEY"
python readmeai/main.py -o readme-ai.md -r https://github.com/eli64s/readme-ai

Using Poetry

poetry shell
export OPENAI_API_KEY="YOUR_API_KEY"
poetry run python readmeai/main.py -o readme-ai.md -r https://github.com/eli64s/readme-ai

🧪 Tests

Execute the test suite using the command below.

bash scripts/test.sh

🛣 Roadmap

  • Publish project as a Python library via PyPI and a Docker image on Docker Hub.
  • Integrate and deploy the app with Streamlit to provide a simple user interface for the tool.
  • Develop a GitHub Actions script to automatically update the README file when new code is pushed.
  • Design README output templates for a variety of use-cases (e.g., data, web-dev, minimal).
  • Add support for generating README files in any language (e.g., CN, ES, FR, JA, KO, RU).

📒 Changelog

Changelog


🤝 Contributing

Contributing Guidelines


📄 License

MIT


👏 Acknowledgments

Badges

🔝 Return

