README-AI

🚀 Auto-generate informative README.md files from the terminal, powered by OpenAI's GPT language model APIs 💫
📍 Overview
Objective
A developer tool that automatically generates detailed README.md files using GPT language model APIs. Simply provide a repository URL or local directory path to your source code, and README-AI handles the documentation for you!
Motivation
Streamlines documentation creation and maintenance, enhancing developer productivity. README-AI aims to enable all skill levels, across all domains, to better understand, use, and contribute to open-source projects.
[!IMPORTANT]
README-AI is currently under development with an opinionated configuration and setup. It is vital to review all text generated by the LLM APIs to ensure it accurately represents your project.
🤖 Demo
README-AI CLI: Standard usage of the readme-ai CLI tool with an OpenAI API key.
CLI without an API key: You can also run readme-ai without an API key by passing the --offline flag to the CLI.
[!TIP]
Offline mode is useful for quickly generating a boilerplate README without incurring API usage costs.
Placeholders are left for all LLM-generated content, which includes the project slogan used in the README header, as well as the Overview, Features, and Modules sections.
See readme-offline.md for an example file generated with the --offline flag.
Streamlit Web App: Generate a README file in your browser with Streamlit. No installation required!
📦 Features
❶ Badge Icons
Project Slogan and Badges
‣ A slogan to highlight your project is generated by prompting OpenAI's GPT engine.
‣ Codebase dependencies and metadata are visualized using Shields.io badges.
‣ Use the CLI option --badges to select the style of badges for your README!
‣ 6 options are currently supported: flat (default), flat-square, plastic, for-the-badge, social, and square. A few examples:
1. Shields.io flat badge style (no command needed, as it's the default style for readme-ai)
2. Shields.io for-the-badge style
3. Square iOS-style badges
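The badge generation described above can be pictured as building Shields.io static-badge URLs for each dependency. The sketch below is illustrative and assumes the public Shields.io URL scheme; the helper name and style handling are not readme-ai's actual API.

```python
# Sketch: building a Shields.io badge URL for a project dependency.
# The URL scheme is Shields.io's public static-badge format; the
# helper and the style set mirror the CLI options listed above.
from urllib.parse import quote

STYLES = {"flat", "flat-square", "plastic", "for-the-badge", "social", "square"}

def badge_url(label: str, color: str = "blue", style: str = "flat") -> str:
    """Return a Shields.io static badge URL for `label` in the given style."""
    if style not in STYLES:
        raise ValueError(f"unsupported badge style: {style}")
    return (
        f"https://img.shields.io/badge/{quote(label)}-{color}"
        f"?style={style}&logo={quote(label.lower())}"
    )

print(badge_url("Python", style="flat"))
```

Embedding the resulting URL in a markdown image tag (`![Python](…)`) renders the badge in the README header.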
❷ Code Documentation
Directory Tree and File Summaries
‣ Your project's directory structure is visualized using a custom tree function.
‣ Each file in the codebase is summarized by OpenAI's GPT model.
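The directory-tree visualization can be sketched with a minimal recursive walk; the function below is illustrative, not readme-ai's actual implementation.

```python
# Minimal sketch of a "custom tree" function: walk a directory and
# render its structure with ASCII tree-drawing characters, listing
# subdirectories before files at each level.
from pathlib import Path

def tree(root: Path, prefix: str = "") -> list[str]:
    """Return the directory structure of `root` as tree-drawing lines."""
    lines: list[str] = []
    entries = sorted(root.iterdir(), key=lambda p: (p.is_file(), p.name))
    for i, entry in enumerate(entries):
        last = i == len(entries) - 1
        connector = "└── " if last else "├── "
        lines.append(f"{prefix}{connector}{entry.name}")
        if entry.is_dir():
            extension = "    " if last else "│   "
            lines.extend(tree(entry, prefix + extension))
    return lines
```

Calling `"\n".join(tree(Path("my-project")))` yields the familiar `├──`/`└──` layout seen in generated READMEs.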
❸ Features Table
Prompted Text Generation
‣ An overview paragraph and features table are generated using detailed prompts, embedded with project metadata.
❹ Dynamic Quick Start Section
Installation, Running, and Testing
‣ Generates instructions for installing, running, and testing your project. Instructions are created by identifying the codebase's top language and referring to our language_setup.toml configuration file.
❺ Additional README Sections
❻ Templates (coming soon)
‣ Developing a CLI option that lets users select from a variety of README styles.
‣ Templates for use cases such as data, machine learning, web development, and more!
AI & ML README Template Concept
❼ Example README Files
| | Output File | Repository | Languages |
|---|---|---|---|
1️⃣ | readme-python.md | readme-ai | Python |
2️⃣ | readme-typescript.md | chatgpt-app-react-typescript | TypeScript, React |
3️⃣ | readme-javascript.md | (repository deleted) | JavaScript, React |
4️⃣ | readme-kotlin.md | file.io-android-client | Kotlin, Java, Android |
5️⃣ | readme-rust-c.md | rust-c-app | C, Rust |
6️⃣ | readme-go.md | go-docker-app | Go |
7️⃣ | readme-java.md | java-minimal-todo | Java |
8️⃣ | readme-fastapi-redis.md | async-ml-inference | Python, FastAPI, Redis |
9️⃣ | readme-mlops.md | mlops-course | Python, Jupyter |
🔟 | readme-pyflink.md | flink-flow | PyFlink |
🚀 Quick Start
Prerequisites
Ensure that you have the following dependencies installed on your system.
- Python 3.9, 3.10, 3.11, or 3.12
- A package manager or container runtime (pip, conda, docker, etc.)
- A paid OpenAI API account and API key
Code Repository
A remote repository URL or local directory path to your project is required to generate a README file. The following repository sources are currently supported.
- GitHub
- GitLab
- Bitbucket
- File System
OpenAI API
An OpenAI API account and API key are needed to use readme-ai. The following steps outline the process.
🔐 OpenAI API Account Setup
- Go to the OpenAI website.
- Click the "Sign up for free" button.
- Fill out the registration form with your information and agree to the terms of service.
- Once logged in, click on the "API" tab.
- Follow the instructions to create a new API key.
- Copy the API key and keep it in a secure place.
Additionally, it is essential to understand the potential risks and costs associated with using a GPT language model API.
[!WARNING]
Please review the following information before using readme-ai.
Review Sensitive Information!: Before running README-AI, ensure all content in your repository is free of sensitive information. README-AI DOES NOT filter sensitive data out of your codebase or the generated README file. It DOES NOT alter your codebase in any way; it is your responsibility to review your codebase before generating a README, and to review the generated file again before publishing it.
API Usage Costs: The OpenAI API is not free; you will be charged for each request README-AI makes. Costs can accumulate rapidly, so monitor your API usage and associated costs on the OpenAI API Usage Dashboard.
Paid Account Recommended: Setting up a paid OpenAI account is highly recommended to avoid potential issues. Without a payment method on file, your API usage will be restricted to OpenAI's base models, which may produce less accurate README content and may trigger errors. For more details, see the OpenAI Pricing Page.
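For budgeting, the cost of a run can be roughly estimated from request count and tokens per request. The sketch below is a back-of-the-envelope helper; the per-1K-token price is a placeholder assumption, so check OpenAI's pricing page for current rates.

```python
# Rough cost estimator for planning API usage. The default price per
# 1K tokens is a placeholder, not a quoted OpenAI rate.
def estimate_cost(requests: int, tokens_per_request: int,
                  usd_per_1k_tokens: float = 0.002) -> float:
    """Return an approximate USD cost for a batch of API requests."""
    return requests * tokens_per_request / 1000 * usd_per_1k_tokens

# e.g. summarizing 25 files at roughly 650 tokens each:
print(round(estimate_cost(25, 650), 4))
```

A repository with many files means one summarization request per file, which is why costs can add up quickly on large codebases.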
⚙️ Installation

Using pip

```sh
pip install readmeai
```

Using docker

```sh
docker pull zeroxeli/readme-ai:latest
```

Using conda

```sh
conda install -c conda-forge readmeai
```
Alternatively, clone the readme-ai repository and build from source.

```sh
git clone https://github.com/eli64s/readme-ai && \
cd readme-ai
```

Then use one of the methods below to install the project's dependencies (Bash, Conda, Pipenv, or Poetry).

Using bash

```sh
bash setup/setup.sh
```

Using pipenv

```sh
pipenv install && \
pipenv shell
```

Using poetry

```sh
poetry install && \
poetry shell
```
👩‍💻 Running README-AI

Before running the application, ensure that you have an OpenAI API key and that it is set as an environment variable.

On Linux or macOS:

```sh
export OPENAI_API_KEY=YOUR_API_KEY
```

On Windows:

```sh
set OPENAI_API_KEY=YOUR_API_KEY
```
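A quick way to confirm the variable is set before launching the CLI is a small check like the one below; the function name and error message are illustrative, not part of readme-ai.

```python
# Sketch: fail fast if OPENAI_API_KEY is missing or blank.
import os

def require_api_key(env=None) -> str:
    """Return the API key from `env` (defaults to os.environ) or raise."""
    env = os.environ if env is None else env
    key = env.get("OPENAI_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; export it or pass --api-key."
        )
    return key
```

Passing a dict instead of reading `os.environ` directly keeps the check easy to test.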
Use one of the methods below to run the application (pip, Docker, Conda, Streamlit, etc.).

Using pip

```sh
readmeai --repository https://github.com/eli64s/readme-ai
```

Using conda

```sh
readmeai -r https://github.com/eli64s/readme-ai
```

Using docker

```sh
docker run -it \
  -e OPENAI_API_KEY=$OPENAI_API_KEY \
  -v "$(pwd)":/app zeroxeli/readme-ai:latest \
  -r https://github.com/eli64s/readme-ai
```
Using streamlit
Try readme-ai in your browser using the Streamlit web app. No installation required!
[!NOTE]
Hosted on Streamlit Community Cloud. The app may be unavailable at times, as this is a free service. See the readme-ai-streamlit repository for more details.
For the complete list of CLI options and README customization settings, see the Configuration section.
Alternatively, you can run the readme-ai application directly from the source code.

Using pipenv

```sh
pipenv shell && \
python3 -m readmeai.cli.commands -o readme-ai.md -r https://github.com/eli64s/readme-ai
```

Using poetry

```sh
poetry shell && \
poetry run python3 -m readmeai.cli.commands -o readme-ai.md -r https://github.com/eli64s/readme-ai
```
🧪 Tests
Use pytest to run the default test suite.

```sh
make test
```

Use nox to run the test suite against multiple Python versions (3.9, 3.10, 3.11, and 3.12).

```sh
nox -f noxfile.py
```
[!NOTE]
Nox is a Python automation toolkit that makes it easier to manage multiple testing environments.
See the Nox documentation for more details.
🧩 Configuration
Run the readmeai command in your terminal with the following options to tailor your README file.
Command-Line Options
| Flag (Long/Short) | Default | Description | Type | Status |
|---|---|---|---|---|
| --align / -a | center | Set header text alignment (left, center). | String | Optional |
| --api-key / -k | OPENAI_API_KEY env var | Your GPT model API key. | String | Optional |
| --badges / -b | flat | Badge style options for your README file. | String | Optional |
| --emojis / -e | False | Add emojis to section header titles. | Boolean | Optional |
| --image / -i | default | Project logo image displayed in the README header. | String | Optional |
| --model / -m | gpt-3.5-turbo | Select the GPT model for content generation. | String | Optional |
| --offline | False | Generate a README without an API key. | Boolean | Optional |
| --output / -o | readme-ai.md | README output file name. | Path/String | Optional |
| --repository / -r | None | Repository URL or local path. | URL/String | Required |
| --temperature / -t | 1.0 | LLM API creativity level. | Float | Optional |
| --template | None | Choose a README template. | String | WIP |
| --language / -l | English (en) | Language for content. | String | WIP |
WIP = work in progress (feature currently under development).
For additional details about each option, run readmeai --help in your terminal.
Badge Icons

Select your preferred badge icon style for your README file using the --badges flag. The following options are currently supported: flat, flat-square, for-the-badge, plastic, skills, skills-light, and social.
Project Logo

Select an image to display in your README header section using the --image flag. The following options are currently supported: custom (provide your own image URL), black, blue, gradient, purple, and yellow.
Custom Settings
To customize the README file generation process in the readme-ai CLI tool, you can modify the project's configuration file. The configuration file is structured using Pydantic models, which are described below in detail.
🔠 Pydantic Models
ApiConfig
- Description: Pydantic model for OpenAI API configuration.
- Attributes:
  - endpoint: The endpoint URL for the OpenAI API.
  - encoding: Encoding settings for the API.
  - model: The OpenAI language model to use.
  - rate_limit: API rate limit.
  - tokens: Token limit per request.
  - tokens_max: Maximum token limit.
  - temperature: Temperature setting for the language model.
CliConfig
- Description: CLI options for the readme-ai application.
- Attributes:
  - emojis: Boolean to enable or disable emojis.
  - offline: Boolean to enable or disable offline mode.
GitConfig
- Description: Configuration for Git repository settings.
- Attributes:
  - repository: Repository URL or local path.
  - source: Source of the repository (GitHub, GitLab, etc.).
  - name: Optional name for the project.
MarkdownConfig
- Description: Pydantic model for the Markdown code blocks used to build and format the README file.
- Attributes: Various Markdown formatting attributes such as align, image, and badges.
PathsConfig
- Description: Pydantic model for configuration file paths.
- Attributes: Paths for various configuration files such as dependency_files, ignore_files, and language_names.
PromptsConfig
- Description: Pydantic model for OpenAI prompts.
- Attributes: Contains strings for features, overview, slogan, and summaries prompts.
AppConfig
- Description: Nested Pydantic model for the entire configuration.
- Sub-Models: Includes all the above configurations as sub-models.
AppConfigModel
- Description: Pydantic model wrapping the entire configuration.
- Attributes: Contains the AppConfig as a sub-model.
ConfigHelper
- Description: Helper class to load additional configuration files.
- Functionality: This class extends the base configuration with additional settings loaded from TOML files.
Additional Functions
- load_config: Function to load the main configuration from a TOML file.
- load_config_helper: Function to load multiple configuration helper TOML files.
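The model hierarchy above can be pictured with a small sketch. Standard-library dataclasses stand in for Pydantic here so the example stays dependency-free; the field names follow the ApiConfig/CliConfig descriptions, while the defaults and the load_config signature are illustrative, not readme-ai's shipped values.

```python
# Sketch of the nested configuration models described above, using
# dataclasses in place of Pydantic. Defaults are illustrative.
from dataclasses import dataclass, field

@dataclass
class ApiConfig:
    endpoint: str = "https://api.openai.com/v1/chat/completions"
    model: str = "gpt-3.5-turbo"
    rate_limit: int = 5
    tokens: int = 650
    tokens_max: int = 3800
    temperature: float = 1.0

@dataclass
class CliConfig:
    emojis: bool = False
    offline: bool = False

@dataclass
class AppConfig:
    api: ApiConfig = field(default_factory=ApiConfig)
    cli: CliConfig = field(default_factory=CliConfig)

def load_config(data: dict) -> AppConfig:
    """Build an AppConfig from a parsed TOML-style dict (load_config sketch)."""
    return AppConfig(
        api=ApiConfig(**data.get("api", {})),
        cli=CliConfig(**data.get("cli", {})),
    )
```

In the real tool the dict would come from parsing the TOML configuration file; unspecified fields fall back to the model defaults, which is what makes partial overrides in the config file work.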
🚧 Project Roadmap
- Publish project as a Python library via PyPI for easy installation.
- Make project available as a Docker image on Docker Hub.
- Integrate and deploy app with Streamlit to make tool more widely accessible.
- Refactor our large language model engine to enable more robust README generation.
- Explore LangChain 🦜️🔗 as an alternative to using the OpenAI API directly.
- Explore LlamaIndex 🦙 framework and Retrieval Augmented Generation (RAG) paradigm.
- Build a template system to create README files for specific use cases (data, mobile, web, etc.).
- Add support for generating README files in any language (i.e. CN, ES, FR, JA, KO, RU).
- Develop GitHub Actions script to automatically update the README file when new code is pushed.
📒 Changelog
🤝 Contributing
📄 License
👏 Acknowledgments
Badges