
🚀 Auto-generate beautiful README files from the terminal using GPT LLM APIs 💫

Project description

README-AI

Auto-generate informative README.md files using GPT language model APIs 👾

Badges: GitHub Workflow Status · Codecov · PyPI version · PyPI Python versions · License: MIT


🔗 Quick Links


📍 Overview

Objective

README-AI is a developer tool that automatically generates detailed README.md files using GPT language model APIs. Simply provide a repository URL or local directory path to your source code, and README-AI handles the documentation for you!

Motivation

README-AI streamlines documentation creation and maintenance, enhancing developer productivity. It aims to enable developers of all skill levels, across all domains, to better understand, use, and contribute to open-source projects.

[!IMPORTANT]

README-AI is currently under development with an opinionated configuration and setup. It is vital to review all text generated by the LLM APIs to ensure it accurately represents your project.


🤖 Demo

README-AI CLI: Standard usage of the tool from the terminal.

readmeai-cli-demo

README-AI CLI Offline Mode: You can also run the CLI without an API key by passing the --offline flag.

readmeai-cli-offline-demo

[!TIP]

Offline mode is useful for quickly generating a boilerplate README without incurring API usage costs. The README file created in the video above can be viewed here.
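As a minimal example, offline mode can be combined with the documented --repository and --output flags; the local path below is a placeholder for your own project:

readmeai --offline --repository /path/to/your/project --output readme-ai.md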


📦 Features

❶ Badge Icons

Project Slogan and Badges

‣ A slogan to highlight your project is generated by prompting OpenAI's GPT engine.

‣ Codebase dependencies and metadata are visualized using Shields.io badges.

badges

‣ Use the CLI option --badges to select the style of badges for your README!
‣ Six options are currently supported: flat (default), flat-square, plastic, for-the-badge, social, and square. A few examples are shown below.

1. Shieldsio flat badge style

Command: none, as it's the default style for readme-ai

badges-shieldsio-default

2. Shieldsio for-the-badge style

Command: --badges for-the-badge

badges-shieldsio-flat

3. Square iOS style badges

Command: --badges square

badges-square

❷ Code Documentation

Directory Tree and File Summaries

‣ Your project's directory structure is visualized using a custom tree function.

‣ Each file in the codebase is summarized by OpenAI's GPT model.

repository-tree code-summaries

❸ Features Table

Prompted Text Generation

‣ An overview paragraph and features table are generated using detailed prompts, embedded with project metadata.

feature-table

❹ Dynamic Quick Start Section

Installation, Running, and Testing

‣ Generates instructions for installing, running, and testing your project. Instructions are created by identifying the codebase's primary language and referencing the language_setup.toml configuration file.

usage-instructions

❺ Additional README Sections

roadmap
license

❻ Templates (coming soon)

‣ A CLI option is in development that will let users select from a variety of README styles

‣ Templates for use-cases such as data, machine learning, web development, and more!

AI & ML README Template Concept


❼ Example README Files
| Output File | Repository | Languages |
|---|---|---|
| 1️⃣ readme-python.md | readme-ai | Python |
| 2️⃣ readme-typescript.md | chatgpt-app-react-typescript | TypeScript, React |
| 3️⃣ readme-javascript.md | (repository deleted) | JavaScript, React |
| 4️⃣ readme-kotlin.md | file.io-android-client | Kotlin, Java, Android |
| 5️⃣ readme-rust-c.md | rust-c-app | C, Rust |
| 6️⃣ readme-go.md | go-docker-app | Go |
| 7️⃣ readme-java.md | java-minimal-todo | Java |
| 8️⃣ readme-fastapi-redis.md | async-ml-inference | Python, FastAPI, Redis |
| 9️⃣ readme-mlops.md | mlops-course | Python, Jupyter |
| 🔟 readme-pyflink.md | flink-flow | PyFlink |

Return


🚀 Getting Started

Prerequisites

Ensure that you have the following dependencies installed on your system.

  • Python 3.9, 3.10, 3.11, or 3.12
  • Package manager or container runtime (pip, docker, conda, etc.)
  • OpenAI API account and API key (Additional API endpoints coming soon!)

Code Repository

A remote repository URL or local directory path to your project is required to generate a README file. The following repository sources are currently supported.

  • GitHub
  • GitLab
  • Bitbucket
  • File System
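Each source is passed to the CLI through the same --repository/-r flag. The GitLab, Bitbucket, and local paths below are illustrative placeholders:

readmeai -r https://github.com/eli64s/readme-ai
readmeai -r https://gitlab.com/your-username/your-project
readmeai -r https://bitbucket.org/your-username/your-project
readmeai -r /path/to/local/project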

OpenAI API

An OpenAI API account and API key are needed to use readme-ai. The following steps outline the process.

🔐 OpenAI API Account Setup
  1. Go to the OpenAI website.
  2. Click the "Sign up for free" button.
  3. Fill out the registration form with your information and agree to the terms of service.
  4. Once logged in, click on the "API" tab.
  5. Follow the instructions to create a new API key.
  6. Copy the API key and keep it in a secure place.

Additionally, it is essential to understand the potential risks and costs associated with using LLM APIs.

[!WARNING]

Please review the following information before using readme-ai.

  1. Review Sensitive Information: Before running readme-ai, ensure all content in your repository is free of sensitive information. This tool does not remove sensitive data from your codebase or from the generated README file. readme-ai also does not alter your codebase; it is your responsibility to review your codebase before generating a README, and to review the generated README file before publishing it.

  2. API Usage Costs: The OpenAI API is not free. You will be charged for each request made by README-AI. Costs can accumulate rapidly, so please monitor your API usage and associated costs by visiting the OpenAI API Usage Dashboard.

  3. Paid Account Recommended: Setting up a paid OpenAI account is highly recommended. Without a payment method on file, your API usage is restricted to OpenAI's base models, which may result in less accurate README content and potential errors. See the OpenAI Pricing Page for more details.

Return


⚙️ Installation

Using pip

pip install readmeai

Using docker

docker pull zeroxeli/readme-ai:latest

Using conda

conda install -c conda-forge readmeai

Alternatively, clone the readme-ai repository and build from source.

git clone https://github.com/eli64s/readme-ai && \
cd readme-ai

Then use one of the methods below to install the project's dependencies (Bash, Conda, Pipenv, or Poetry).

Using bash

bash setup/setup.sh

Using pipenv

pipenv install && \
pipenv shell

Using poetry

poetry install && \
poetry shell

👩‍💻 Running README-AI

Before running the application, ensure that you have an OpenAI API key and that it is set as an environment variable.

On Linux or macOS

export OPENAI_API_KEY=YOUR_API_KEY

On Windows

set OPENAI_API_KEY=YOUR_API_KEY
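If you use PowerShell rather than the classic command prompt, the equivalent is (standard PowerShell syntax, not specific to readme-ai):

$env:OPENAI_API_KEY = "YOUR_API_KEY"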

Use one of the methods below to run the application (pip, Docker, conda, Streamlit, etc.).

Using pip

readmeai --repository https://github.com/eli64s/readme-ai

Using docker

docker run -it \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-v "$(pwd)":/app zeroxeli/readme-ai:latest \
-r https://github.com/eli64s/readme-ai

Using conda

readmeai -r https://github.com/eli64s/readme-ai

Using streamlit

Streamlit App

[!NOTE]

The web app is hosted on Streamlit Community Cloud, a free service for sharing Streamlit apps. Thus, the app may be unstable or unavailable at times. See the readme-ai-streamlit repository for more details.

Alternatively, run the application locally from the cloned repository.

Using pipenv

pipenv shell && \
python3 -m readmeai.cli.commands -o readme-ai.md -r https://github.com/eli64s/readme-ai

Using poetry

poetry shell && \
poetry run python3 -m readmeai.cli.commands -o readme-ai.md -r https://github.com/eli64s/readme-ai

See the Configuration section below for the complete list of CLI options and settings.


🧪 Tests

Use pytest to run the default test suite.

make test

Use nox to run the test suite against multiple Python versions (3.9, 3.10, 3.11, and 3.12).

nox -f noxfile.py

🧩 Configuration

Run the readmeai command in your terminal with the following options to tailor your README file.

CLI Options

| Flag (Long/Short) | Default | Description | Type | Status |
|---|---|---|---|---|
| --align/-a | center | Set header text alignment (left, center). | String | Optional |
| --api-key/-k | OPENAI_API_KEY env var | Your GPT model API key. | String | Optional |
| --badges/-b | flat | Badge style options for your README file. | String | Optional |
| --emojis/-e | False | Add emojis to section header titles. | Boolean | Optional |
| --image/-i | blue | Project logo image displayed in README header. | String | Optional |
| --max-tokens | 3999 | Max number of tokens that can be generated. | Integer | Optional |
| --model/-m | gpt-3.5-turbo | Select GPT model for content generation. | String | Optional |
| --offline | False | Generate a README without an API key. | Boolean | Optional |
| --output/-o | readme-ai.md | README output file name. | Path/String | Optional |
| --repository/-r | None | Repository URL or local path. | URL/String | Required |
| --temperature/-t | 1.0 | LLM API creativity level. | Float | Optional |
| --template | None | Choose README template. | String | WIP |
| --language/-l | English (en) | Language for content. | String | WIP |

WIP = work in progress, or a feature currently under development.
For more details about each option, run readmeai --help in your terminal.
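To illustrate how these options compose, here is a hypothetical invocation combining several of the flags above; all values are placeholders you should adjust for your own project:

readmeai --repository https://github.com/eli64s/readme-ai \
--output readme-ai.md \
--badges flat-square \
--image gradient \
--model gpt-3.5-turbo \
--temperature 0.7 \
--max-tokens 3999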

Badge Icons

Select your preferred badge icon style for your README file using the --badges flag. The following options are currently supported.

| Badge | Preview |
|---|---|
| flat | flat |
| flat-square | flat-square |
| for-the-badge | for-the-badge |
| plastic | plastic |
| skills | Skills |
| skills-light | Skills-Light |
| social | social |
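For example, the skills-light style is selected the same way as any other style via the documented --badges/-b flag:

readmeai -r https://github.com/eli64s/readme-ai -b skills-light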

Project Logo

Select an image to display in your README header section using the --image flag. The following options are currently supported.

| Image | Preview |
|---|---|
| custom | Provide a custom image URL. |
| black | readme-blue |
| blue | default |
| gradient | gradient |
| purple | purple |
| yellow | yellow |
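For example, to use the bundled gradient header image (the repository URL is the project's own; for the custom option, readme-ai expects you to provide your own image URL, and readmeai --help shows the exact mechanism):

readmeai -r https://github.com/eli64s/readme-ai -i gradient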

Custom Settings

The readme-ai tool is designed with flexibility in mind, allowing users to configure various aspects of its operation through a series of models and settings. The configuration file covers aspects such as language model settings, git host providers, repository details, markdown templates, and more.

🔠 Configuration Models

GitService Enum

  • Purpose: Defines Git service details.
  • Attributes:
    • LOCAL, GITHUB, GITLAB, BITBUCKET: Enumerations for different Git services.
    • host: Service host URL.
    • api_url: API base URL for the service.
    • file_url: URL format for accessing files in the repository.
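As a rough illustration only (this is not the project's actual source; the host and URL formats below are assumptions), such an enum might look like:

from enum import Enum

class GitService(Enum):
    # Illustrative sketch: each member carries host, api_url, and file_url.
    # The URL formats are assumptions, not readme-ai's real values.
    LOCAL = (None, None, "{file_path}")
    GITHUB = ("github.com", "https://api.github.com/repos/", "https://github.com/{full_name}/blob/main/{file_path}")
    GITLAB = ("gitlab.com", "https://gitlab.com/api/v4/projects/", "https://gitlab.com/{full_name}/-/blob/master/{file_path}")
    BITBUCKET = ("bitbucket.org", "https://api.bitbucket.org/2.0/repositories/", "https://bitbucket.org/{full_name}/src/master/{file_path}")

    def __init__(self, host, api_url, file_url):
        self.host = host
        self.api_url = api_url
        self.file_url = file_url

With this shape, GitService.GITHUB.api_url would resolve to the GitHub REST API base URL.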

BadgeOptions Enum

  • Purpose: Provides options for README file badge icons.
  • Options: FLAT, FLAT_SQUARE, FOR_THE_BADGE, PLASTIC, SKILLS, SKILLS_LIGHT, SOCIAL.

ImageOptions Enum

  • Purpose: Lists CLI options for README file header images.
  • Options: CUSTOM, BLACK, BLUE, GRADIENT, PURPLE, YELLOW.

CliSettings

  • Purpose: Defines CLI options for the application.
  • Fields:
    • emojis: Enables or disables emoji usage.
    • offline: Indicates offline mode operation.

FileSettings

  • Purpose: Configuration for various file paths used in the application.
  • Fields: dependency_files, identifiers, ignore_files, language_names, language_setup, output, shields_icons, skill_icons.

GitSettings

  • Purpose: Manages repository settings and validations.
  • Fields:
    • repository: The repository URL or path.
    • source: The source of the Git repository.
    • name: The name of the repository.

LlmApiSettings

  • Purpose: Holds settings for OpenAI's LLM API.
  • Fields: content, endpoint, encoding, model, rate_limit, temperature, tokens, tokens_max.

MarkdownSettings

  • Purpose: Contains Markdown templates for different sections of a README.
  • Fields: Templates for aligning text, badges, headers, images, features, getting started, overview, tables of contents, etc.

PromptSettings

  • Purpose: Configures prompts for OpenAI's LLM API.
  • Fields: features, overview, slogan, summaries.

AppConfig

  • Purpose: Nested model encapsulating all application configurations.
  • Fields: cli, files, git, llm, md, prompts.

AppConfigModel

  • Purpose: Pydantic model for the entire application configuration.
  • Sub-models: AppConfig.

ConfigHelper

  • Purpose: Assists in loading additional configuration files.
  • Methods: load_helper_files to load configuration from different files.

Functions

_get_config_dict

  • Purpose: Retrieves configuration data from TOML files.
  • Parameters:
    • handler: Instance of FileHandler.
    • file_path: Path to the configuration file.

load_config

  • Purpose: Loads the main configuration file.
  • Parameters:
    • path: Path to the configuration file.
  • Returns: An instance of AppConfig.

load_config_helper

  • Purpose: Loads multiple configuration helper files.
  • Parameters:
    • conf: An instance of AppConfigModel.
  • Returns: An instance of ConfigHelper.

Usage

The configurations are loaded using the load_config function, which parses a TOML file into the AppConfigModel. This model is then used throughout the application to access various settings. Additional helper files can be loaded using ConfigHelper, which further enriches the application's configuration context.
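A minimal usage sketch, assuming the functions live in a readmeai settings module (the import path, default arguments, and the AppConfigModel field name are assumptions based on the descriptions above, not verified against the source):

# Illustrative only: import path and exact signatures are assumptions.
from readmeai.config.settings import AppConfigModel, load_config, load_config_helper

app_config = load_config()                   # parse the main TOML file into AppConfig
conf_model = AppConfigModel(app=app_config)  # hypothetical field name for the nested AppConfig
helper = load_config_helper(conf_model)      # load supplementary helper files into ConfigHelper

print(app_config.llm.model)        # e.g. the GPT model used for content generation
print(app_config.git.repository)   # e.g. the repository URL or local path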

Return


🛠 Project Roadmap

  • Publish readme-ai CLI as a Python package on PyPI.
  • Containerize the readme-ai CLI as a Docker image via Docker Hub.
  • Serve the readme-ai CLI as a web app, deployed on Streamlit Community Cloud.
  • Integrate a single interface for all LLM API providers (Anthropic, Cohere, Gemini, etc.)
  • Design a template system to give users a variety of README document flavors (ai, data, web, etc.)
  • Develop a robust documentation generation process that can extend to full project docs (i.e. Sphinx, MkDocs, etc.)
  • Add support for generating README files in any language (i.e. CN, ES, FR, JA, KO, RU).
  • Create GitHub Actions script to automatically update README file content on repository push.

📒 Changelog

Changelog


🤝 Contributing


📄 License

MIT


👏 Acknowledgments

Badges

Return


