
A CLI tool to generate prompts with project structure and file contents

Project description

kopipasta


A CLI tool for taking full, transparent control of your LLM context. No black boxes.

kopipasta
  • An LLM told me that "kopi" means coffee in some languages... and a diffusion model then made this delicious soup.

The Philosophy: You Control the Context

Many AI coding assistants use Retrieval-Augmented Generation (RAG) to automatically find what they think is relevant context. This is a black box. When the LLM gives a bad answer, you can't debug it because you don't know what context it was actually given.

kopipasta is the opposite. I built it for myself on the principle of explicit context control. You are in the driver's seat. You decide exactly what files, functions, and snippets go into the prompt. This transparency is the key to getting reliable, debuggable results from an LLM.

It's a "smart copy" command for your project, not a magic wand.

How It Works

The workflow is dead simple:

  1. Gather: Run kopipasta and point it at the files, directories, and URLs that matter for your task.
  2. Select: The tool interactively helps you choose what to include. For large files, you can send just a snippet or even hand-pick individual functions.
  3. Define: Write your instructions to the LLM in an interactive prompt directly in your terminal.
  4. Paste: The final, comprehensive prompt is now on your clipboard, ready to be pasted into ChatGPT, Gemini, Claude, or your LLM of choice.
  5. Apply: Inside the file selector, press p, paste the LLM's markdown response, and the tool will automatically patch your local files.
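
The steps above can be sketched in code. The exact prompt layout kopipasta produces isn't documented here, so this is only an illustrative hand-rolled sketch of the "gather + define" idea, with hypothetical file names:

```python
# Illustrative sketch of assembling a context-rich prompt by hand.
# kopipasta's actual output format may differ; file names are hypothetical.
def build_prompt(files: dict[str, str], task: str) -> str:
    """Combine already-gathered file contents and a task into one prompt."""
    parts = ["Project files:"]
    for name, content in files.items():
        parts.append(f"\n--- {name} ---\n{content}")
    parts.append(f"\nTask:\n{task}")
    return "\n".join(parts)

prompt = build_prompt({"setup.py": "from setuptools import setup\n"},
                      "Update setup.py to read dependencies dynamically")
```

The point is that every byte of context is something you explicitly put there, which is what makes bad LLM answers debuggable.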

Installation

# Using pipx (recommended for CLI tools)
pipx install kopipasta

# Or using standard pip
pip install kopipasta

Usage

kopipasta has two main modes: creating prompts and applying patches.

Creating a Prompt

kopipasta [options] [files_or_directories_or_urls...]

Arguments:

  • [files_or_directories_or_urls...]: One or more paths to files, directories, or web URLs to use as the starting point for your context.

Options:

  • -t TASK, --task TASK: Provide the task description directly on the command line, skipping the editor.

Applying Patches

kopipasta can apply changes suggested by an LLM directly to your codebase, assuming you are in a Git repository.

  1. While running kopipasta in the interactive file selector, press the p key.
  2. Paste the entire markdown response from your LLM into the terminal prompt and submit.
  3. The tool will find code blocks with file paths (e.g., // FILE: src/main.py) and immediately write those changes to your local files.
  4. After applying, use standard Git commands like git diff to review the changes before staging and committing them.
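
kopipasta's own parser isn't shown here, but extracting file-tagged code blocks from a markdown response can be sketched with a regex. This is a rough illustration, not the tool's actual implementation:

```python
import re

# Rough sketch: extract (path, code) pairs from an LLM markdown response
# whose fenced code blocks begin with a "// FILE: <path>" marker.
# kopipasta's real parser may handle more formats than this.
FENCE = "`" * 3  # literal triple backtick

BLOCK_RE = re.compile(
    FENCE + r"[^\n]*\n// FILE: (?P<path>[^\n]+)\n(?P<code>.*?)" + FENCE,
    re.DOTALL,
)

def extract_patches(markdown: str) -> dict[str, str]:
    """Map each tagged file path to the code block that should replace it."""
    return {
        m.group("path").strip(): m.group("code")
        for m in BLOCK_RE.finditer(markdown)
    }
```

Each extracted path would then be written to disk, which is why having Git as a safety net before pressing p matters.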

Key Features

  • Total Context Control: Interactively select files, directories, snippets, or even individual functions. You see everything that goes into the prompt.
  • Interactive Code Patcher: Press p in the file selector to paste and apply LLM-suggested changes directly to your local files. Relies on your version control (like Git) for safety, enabling a fast workflow.
  • Transparent & Explicit: No hidden RAG. You know exactly what's in the prompt because you built it. This makes debugging LLM failures possible.
  • Web-Aware: Pulls in content directly from URLs—perfect for API documentation.
  • Safety First:
    • Automatically respects your .gitignore rules.
    • Detects if you're about to include secrets from a .env file and asks what to do.
  • Context-Aware: Keeps a running total of the prompt size (in characters and estimated tokens) so you don't overload the LLM's context window.
  • Developer-Friendly:
    • Provides a rich, interactive prompt for writing task descriptions in the terminal.
    • Copies the final prompt directly to your clipboard.
    • Provides syntax highlighting during chunk selection.
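
How kopipasta estimates tokens internally isn't specified here; a common rough heuristic for English text and code, which any such estimate likely resembles, is about four characters per token:

```python
# Rough heuristic, not kopipasta's actual counter: LLM tokenizers average
# roughly 4 characters per token for English prose and source code.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)
```

This is only a ballpark figure; the true count depends on the specific model's tokenizer.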

A Real-World Example

I had a bug where my setup.py didn't include all the dependencies from requirements.txt.

  1. I ran kopipasta -t "Update setup.py to read dependencies dynamically from requirements.txt" setup.py requirements.txt.
  2. The tool confirmed the inclusion of both files and copied the complete prompt to my clipboard.
  3. I pasted the prompt into my LLM chat window.
  4. I copied the LLM's response (which included a modified setup.py in a markdown code block).
  5. Inside kopipasta, I pressed p, pasted the response, and my local setup.py was updated.
  6. I ran git diff to review the changes, then tested and committed.
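
The LLM's actual fix isn't reproduced above; a change in the same spirit, parsing requirements.txt so setup.py no longer hard-codes dependencies, might look like this hypothetical sketch:

```python
# Hypothetical sketch of the kind of setup.py change the LLM might suggest:
# parse requirements.txt contents, skipping comments and blank lines.
def parse_requirements(text: str) -> list[str]:
    reqs = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            reqs.append(line)
    return reqs
```

In setup.py this would feed something like `install_requires=parse_requirements(Path("requirements.txt").read_text())`, keeping the two files in sync automatically.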

No manual file reading, no clumsy copy-pasting, just a clean, context-rich prompt that I had full control over, and a seamless way to apply the results.

Project details


Download files

Download the file for your platform.

Source Distribution

kopipasta-0.40.0.tar.gz (35.7 kB)

Uploaded Source

Built Distribution


kopipasta-0.40.0-py3-none-any.whl (32.5 kB)

Uploaded Python 3

File details

Details for the file kopipasta-0.40.0.tar.gz.

File metadata

  • Download URL: kopipasta-0.40.0.tar.gz
  • Upload date:
  • Size: 35.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes

Hashes for kopipasta-0.40.0.tar.gz:

  • SHA256: cbab76c2c20407c3d2c108c988558df4732214ac581c03e0dc09a786a53b788e
  • MD5: 0f10ca3f119f4cd3ad787fe6f1c085e3
  • BLAKE2b-256: c492515faea96ff61b7663987ab5e66577d741900294355ca04d81b98133fb58


File details

Details for the file kopipasta-0.40.0-py3-none-any.whl.

File metadata

  • Download URL: kopipasta-0.40.0-py3-none-any.whl
  • Upload date:
  • Size: 32.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes

Hashes for kopipasta-0.40.0-py3-none-any.whl:

  • SHA256: 846a45af4e4b70b3893cf400cde0ed8fd916a547aa92c3c302c2e778916ec5ed
  • MD5: 3df6573702670db5a7084ae9d188f93a
  • BLAKE2b-256: abb8d01dddb5aa382f600ecb88550d289a787ad50f018471f4738787162de832

