
CLI tool to convert code folders into a single text file for LLM prompts

Project description

code-to-prompt

Requires Python 3.10+

Turn a codebase into a single, clean, LLM-ready text prompt.

code-to-prompt recursively walks a code folder, filters out noise (build artifacts, dependencies, binaries), and writes a single text file containing relative file paths and fenced code blocks, ready to copy-paste into LLM chatbots such as ChatGPT, Claude, and Gemini.

Local filesystem processing. No AI or network calls.
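Conceptually, the pipeline is: walk, skip noise, read, fence. The sketch below is an illustrative Python approximation, not the tool's actual implementation — the real skip list, binary detection, and options are more complete:

```python
from pathlib import Path

# Illustrative subset of noise directories; the real tool's list is larger.
SKIP_DIRS = {".git", "node_modules", "__pycache__", "dist", "build", ".venv"}

def folder_to_prompt(root: str) -> str:
    """Walk a folder and emit relative paths with fenced code blocks."""
    root_path = Path(root)
    chunks = []
    # Sorting gives deterministic, reproducible output across runs.
    for path in sorted(root_path.rglob("*")):
        if not path.is_file():
            continue
        # Skip any file that lives inside an ignored directory.
        if any(part in SKIP_DIRS for part in path.parts):
            continue
        try:
            text = path.read_text(encoding="utf-8")
        except (UnicodeDecodeError, OSError):
            continue  # treat undecodable files as binaries and skip them
        rel = path.relative_to(root_path).as_posix()
        chunks.append(f"{rel}\n```\n{text}\n```\n")
    return "\n".join(chunks)
```

Note the relative-path step: emitting `src/main.py` instead of an absolute path keeps machine-specific directory names out of the prompt.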

How It Works

Visual overview

┌────────────────────────────────────────────┐
│              Codebase / Folder             │
│                                            │
│  src/                                      │
│   ├─ main.py                               │
│   ├─ utils.py                              │
│   ├─ __pycache__/        (ignored)         │
│   │    └─ utils.cpython-312.pyc            │
│   ├─ config.yaml                           │
│   └─ api/                                  │
│       └─ handlers.py                       │
│                                            │
│  .git/                   (ignored)         │
│  .gitignore              (ignored)         │
│  node_modules/           (ignored)         │
│                                            │
└────────────────────────────────────────────┘
                    │
                    ▼
┌────────────────────────────────────────────┐
│              code-to-prompt                │
│                                            │
│  • Walk folders                            │
│  • Skip noise                              │
│  • Read code files                         │
│                                            │
└────────────────────────────────────────────┘
                    │
                    ▼
┌────────────────────────────────────────────┐
│              Single text output            │
│                                            │
│  src/main.py                               │
│  ```                                       │
│  ...                                       │
│  ```                                       │
│                                            │
│  src/utils.py                              │
│  ```                                       │
│  ...                                       │
│  ```                                       │
│                                            │
│  src/config.yaml                           │
│  ```                                       │
│  ...                                       │
│  ```                                       │
│                                            │
│  src/api/handlers.py                       │
│  ```                                       │
│  ...                                       │
│  ```                                       │
│                                            │
└────────────────────────────────────────────┘
                    │
                    ▼
┌────────────────────────────────────────────┐
│              Paste into LLM                │
│        (ChatGPT / Claude / Gemini)         │
└────────────────────────────────────────────┘

Installation

Using pip:

pip install code-to-prompt-cli

Usage

code-to-prompt ./my-project                                 # outputs to .code-to-prompt/ in the current directory
code-to-prompt ./my-project -o output.txt                   # custom output filename
code-to-prompt ./my-project -s tests -s src/__init__.py     # skip files or folders
code-to-prompt ./my-project --tokens                        # estimate total output tokens (no output file is written)
code-to-prompt --help                                       # show all options
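The `--tokens` flag reports an estimate without writing a file. How code-to-prompt computes it is not documented here; a common rough heuristic (an assumption on our part, not necessarily the tool's method) is about four characters per token:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate via the ~4 characters/token rule of thumb.

    Illustrative assumption only: code-to-prompt's actual estimator may
    differ, e.g. by using a real tokenizer library.
    """
    return max(1, len(text) // 4)
```

Such an estimate is useful for checking whether a codebase fits within a model's context window before pasting.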

What This Is For

  • Share your codebase with an LLM for debugging, review, or discussion
  • Skip the manual copy-paste and get a clean prompt in one command

Features

  • Fenced code blocks optimized for LLM consumption
  • File paths resolved relative to the current working directory, so machine-specific paths never leak into prompts
  • Auto-skips noise: binaries, build artifacts, .git, node_modules, __pycache__, etc.
  • Optional token count estimation for LLM context awareness (--tokens)
  • Deterministic ordering: files are sorted by normalized relative paths, so output is stable and reproducible across runs
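The deterministic-ordering point can be sketched in a few lines of Python. Normalizing path separators before sorting (an assumed detail of "normalized relative paths", not the tool's exact code) makes the ordering identical on Windows, macOS, and Linux:

```python
def normalized_order(paths):
    """Sort path strings by their separator-normalized form.

    Converting "\\" to "/" before comparing means the same folder yields
    the same file order on every platform, so repeated runs produce
    byte-identical output. Illustrative sketch only.
    """
    return sorted(paths, key=lambda p: str(p).replace("\\", "/"))
```

Stable ordering matters for prompts: it lets you diff two generated files, or cache an LLM response keyed on the output.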

Guarantees

  • Read-only operation — your codebase is never modified
  • Overwrite-safe — output is written only to a dedicated .code-to-prompt/ directory
  • No accidental data loss — source files are never overwritten
  • Local-only execution — no network calls or AI usage
  • Path hygiene — machine-specific absolute paths are never leaked into prompts

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

code_to_prompt_cli-0.2.3.tar.gz (8.8 kB)

Uploaded Source

Built Distribution


code_to_prompt_cli-0.2.3-py3-none-any.whl (8.1 kB)

Uploaded Python 3

File details

Details for the file code_to_prompt_cli-0.2.3.tar.gz.

File metadata

  • Download URL: code_to_prompt_cli-0.2.3.tar.gz
  • Size: 8.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Hatch/1.16.3 cpython/3.12.10 HTTPX/0.28.1

File hashes

Hashes for code_to_prompt_cli-0.2.3.tar.gz
Algorithm Hash digest
SHA256 e6e22daa4ca6a160b68a330d243efbd7ebe2d10bb12939009704b896a80370a0
MD5 480f84e8ab7b46d4df883fb53adabde6
BLAKE2b-256 b2cf72742aa09f10ce07b6b2c03fbdacfb25192c6094aab196e1bacf56d10419


File details

Details for the file code_to_prompt_cli-0.2.3-py3-none-any.whl.

File metadata

  • Download URL: code_to_prompt_cli-0.2.3-py3-none-any.whl
  • Size: 8.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Hatch/1.16.3 cpython/3.12.10 HTTPX/0.28.1

File hashes

Hashes for code_to_prompt_cli-0.2.3-py3-none-any.whl
Algorithm Hash digest
SHA256 a4912aace4a5978596e26fdee11589e3cf54943fc38935f3688878dbab08e6ae
MD5 570bb34f2874d11d8d40346d9f8413bd
BLAKE2b-256 e2462b11b9f82ae889ec79ba385af9c628e6c351f9680f1c99fcfab287f1ae1c

