# kopipasta

A CLI tool to generate prompts with project structure and file contents.

Streamline your interaction with LLMs for coding tasks. kopipasta helps you provide comprehensive context (project structure, file contents, web content) and facilitates an interactive, patch-based workflow. Go beyond TAB TAB TAB and take control of your LLM context.

- An LLM told me that kopi means "coffee" in some languages... and a Diffusion model then made this delicious soup.
## Installation

You can install kopipasta using pipx (recommended) or pip:

```bash
# Using pipx (recommended)
pipx install kopipasta

# Or using pip
pip install kopipasta
```
## Usage

```bash
kopipasta [options] [files_or_directories_or_urls...]
```

Arguments:

- `[files_or_directories_or_urls...]`: Paths to files, directories, or web URLs to include as context.

Options:

- `-t TASK`, `--task TASK`: Provide the task description directly via the command line. If omitted (and not using `-I`), an editor will open for you to write the task.
- `-I`, `--interactive`: Start an interactive chat session with Google's Gemini model after preparing the context. Requires the `GOOGLE_API_KEY` environment variable.
Examples:

Generate a prompt and copy it to the clipboard (classic mode):

```bash
# Interactively select files from src/, include config.json, fetch web content,
# then open editor for task input. Copy final prompt to clipboard.
kopipasta src/ config.json https://example.com/api-docs

# Provide task directly, include specific files, copy final prompt.
kopipasta -t "Refactor setup.py to read deps from requirements.txt" setup.py requirements.txt
```

Start an interactive chat session:

```bash
# Interactively select files, provide task directly, then start chat.
kopipasta -I -t "Implement the apply_simple_patch function" kopipasta/main.py

# Interactively select files, open editor for initial task, then start chat.
kopipasta -I kopipasta/ tests/
```
## Workflow

kopipasta is designed to support the following workflow when working with LLMs (like Gemini, ChatGPT, Claude, etc.) for coding tasks:

1. Gather Context: Run `kopipasta` with the relevant files, directories, and URLs. Interactively select exactly what content (full files, snippets, or specific code chunks/patches) should be included.
2. Define Task: Provide your coding task instructions, either via the `-t` flag or through your default editor.
3. Interact (if using `-I`):
   - kopipasta prepares the context and your task as an initial prompt.
   - An interactive chat session starts (currently using Google Gemini via `google-genai`).
   - Discuss the task, clarify requirements, and ask the LLM to generate code.
   - The initial prompt includes instructions guiding the LLM to provide incremental changes and clear explanations.
4. Request Patches (`-I` mode):
   - During the chat, use the `/patch` command to ask the LLM to provide the proposed changes in a structured format.
   - kopipasta will prompt you to review the proposed patches (file, reasoning, code change).
5. Apply Patches (`-I` mode):
   - If you approve, kopipasta will attempt to automatically apply the patches to your local files. It validates that the original code exists and is unique before applying.
6. Test & Iterate: Test the changes locally. If further changes are needed, continue the chat, request new patches, or make manual edits.
7. Commit: Once satisfied, commit the changes.

For non-interactive mode, kopipasta generates the complete prompt (context + task) and copies it to your clipboard (steps 1 and 2). You can then paste it into your preferred LLM interface and proceed manually from step 3 onwards.
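The validation in the patch-application step (the original snippet must exist and be unique before it is replaced) can be sketched like this. This is a minimal illustration under assumed names, not kopipasta's actual implementation:

```python
from pathlib import Path

def apply_simple_patch(path: str, original: str, replacement: str) -> None:
    """Replace `original` with `replacement` in the file at `path`,
    but only if `original` occurs exactly once."""
    text = Path(path).read_text()
    count = text.count(original)
    if count == 0:
        raise ValueError(f"original snippet not found in {path}")
    if count > 1:
        raise ValueError(f"original snippet not unique in {path} ({count} matches)")
    Path(path).write_text(text.replace(original, replacement, 1))
```

Requiring uniqueness avoids silently patching the wrong occurrence when the same snippet appears more than once in a file.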
## Features

- Comprehensive Context Generation: Creates structured prompts including:
  - Project directory tree overview.
  - Selected file contents.
  - Content fetched from web URLs.
  - Your specific task instructions.
- Interactive File Selection:
  - Guides you through selecting files and directories.
  - Option to include full file content, a snippet (first lines/bytes), or select specific code chunks/patches for large or complex files.
  - Syntax highlighting during chunk selection for supported languages.
  - Ignores files based on common `.gitignore` patterns and detects binary files.
  - Displays estimated character/token counts during selection.
- Web Content Fetching: Includes content directly from URLs. Handles JSON/CSV content types.
- Editor Integration: Opens your preferred editor (`$EDITOR`) to input task instructions (if not using `-t`).
- Environment Variable Handling: Detects potential secrets from a `.env` file in included content and prompts you to mask, skip, or keep them.
- Clipboard Integration: Automatically copies the generated prompt to the clipboard (non-interactive mode).
- Interactive Chat Mode (`-I`, `--interactive`):
  - Starts a chat session directly after context generation.
  - Uses the `google-genai` library to interact with Google's Gemini models.
  - Requires the `GOOGLE_API_KEY` environment variable to be set.
  - Includes built-in instructions for the LLM to encourage clear, iterative responses.
- Patch Management (`-I` mode):
  - `/patch` command to request structured code changes from the LLM.
  - Prompts you to review proposed patches (reasoning, file, original/new code snippets).
  - Automatic patch application to local files upon confirmation.
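The secret-masking behavior described above can be approximated with a simple substitution pass. A minimal sketch, assuming the `.env` values have already been parsed into a dict; the function name and placeholder format are not kopipasta's actual ones:

```python
def mask_secrets(content: str, env_values: dict[str, str]) -> str:
    """Replace any occurrence of a .env value in `content` with a
    placeholder naming the variable it came from."""
    for name, value in env_values.items():
        if value and value in content:
            content = content.replace(value, f"<masked:{name}>")
    return content
```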
## Configuration

- Editor: Set the `EDITOR` environment variable to your preferred command-line editor (e.g., `vim`, `nvim`, `nano`, `emacs`, `code --wait`).
- API Key (for `-I` mode): Set the `GOOGLE_API_KEY` environment variable to your Google AI Studio API key to use the interactive chat feature.
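For example, in your shell profile (the editor choice and key value here are placeholders):

```shell
export EDITOR="nvim"                 # editor opened for task input
export GOOGLE_API_KEY="your-key"     # needed only for -I (interactive) mode
```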
## Real life example (Non-Interactive)

Context: I had a bug where setup.py didn't include all dependencies listed in requirements.txt.

1. Run:

   ```bash
   kopipasta -t "Update setup.py to read dependencies dynamically from requirements.txt" setup.py requirements.txt
   ```

2. Paste the generated prompt (copied to clipboard) into my preferred LLM chat interface.
3. Review the LLM's proposed code.
4. Copy the code and update setup.py manually.
5. Test the changes.
## Real life example (Interactive)

Context: I want to refactor a function in main.py.

1. Ensure the API key is set:

   ```bash
   export GOOGLE_API_KEY="YOUR_API_KEY_HERE"
   ```

2. Run:

   ```bash
   kopipasta -I -t "Refactor the handle_content function in main.py to be more modular" module/main.py
   ```

3. The tool gathers context, shows the file size, and confirms inclusion.
4. An interactive chat session starts with the context and task sent to Gemini.
5. Chat with the LLM:
   - User: "Proceed"
   - LLM: "Okay, I understand. My plan is to..."
   - User: "Looks good."
   - LLM: "Here's the first part of the refactoring..." (shows code)
6. Use the `/patch` command:
   - User: `/patch`
   - kopipasta asks the LLM for structured patches and displays the proposed patches: "Apply 1 patch to module/main.py? (y/N):"
7. Apply the patch:
   - User: `y`
   - kopipasta applies the change to module/main.py.
8. Test locally. If it works, commit. If not, continue chatting, request more patches, or debug.
## File details

Details for the file `kopipasta-0.26.0.tar.gz`.

File metadata:

- Download URL: kopipasta-0.26.0.tar.gz
- Upload date:
- Size: 24.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `cb82e38e87cc7e2c75fcf5a6648499672d3dbfe6954fb6f85b1efbb0a83d5965` |
| MD5 | `5e95434d864acbbe305379ebc2e10fef` |
| BLAKE2b-256 | `8e7e90036c67416864a604d7ae1f61f443ba24ca92bdca625695af31847ce8b5` |
## File details

Details for the file `kopipasta-0.26.0-py3-none-any.whl`.

File metadata:

- Download URL: kopipasta-0.26.0-py3-none-any.whl
- Upload date:
- Size: 22.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.7

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `75db5e90c767901dfeee516907370cab189e6773b9b7c76a8a41d50c0f740766` |
| MD5 | `9632d633b0f1d0d94dadf23d487b95d1` |
| BLAKE2b-256 | `cc56889b097b878c38148a25c6fd0825d30de61cb0fdd2b5247a15a1bba926ce` |
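The hashes above can be used to verify a downloaded artifact before installing it. A generic sketch using Python's standard library (the filename and digest in the commented check come from the listing above):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Example check against the published digest:
# assert sha256_of("kopipasta-0.26.0.tar.gz") == (
#     "cb82e38e87cc7e2c75fcf5a6648499672d3dbfe6954fb6f85b1efbb0a83d5965"
# )
```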