An interactive agent for codebase modification using LLMs
About
PatchLLM is an interactive command-line agent that helps you modify your codebase. It uses an LLM to plan and execute changes, allowing you to review and approve every step.
Key Features
- Interactive Planning: The agent proposes a step-by-step plan before writing any code. You stay in control.
- Dynamic Context: Build and modify the code context on the fly using powerful scope definitions (`@git:staged`, `@dir:src`, etc.).
- Mobile-First TUI: A clean, command-driven interface with autocompletion makes it easy to use on any device.
- Resilient Sessions: Automatically saves your progress so you can resume if you get disconnected.
Getting Started
1. Initialize a configuration file (optional):

   This creates a scopes.py file to define reusable file collections.

   ```bash
   patchllm --init
   ```
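The layout of the generated scopes.py is not shown on this page; as a hypothetical sketch, assuming scopes map names to lists of glob patterns, such a file might look like this (the scope names and patterns below are invented for illustration):

```python
# scopes.py -- hypothetical example; the actual file generated by
# `patchllm --init` may differ. Each entry maps a reusable scope name
# to the glob patterns it should collect.
SCOPES = {
    "api": ["src/api/**/*.py"],
    "docs": ["README.md", "docs/**/*.md"],
}
```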
2. Start the agent:

   Running patchllm with no arguments drops you into the interactive agentic TUI.

   ```bash
   patchllm
   ```
3. Follow the agent workflow: Inside the TUI, you direct the agent with simple slash commands.

   ```
   # 1. Set the goal
   >>> /task Add a health check endpoint to the API

   # 2. Build the context
   >>> /context @dir:src/api

   # 3. Ask the agent to generate a plan
   >>> /plan
   1. Add a new route `/health` to `src/api/routes.py`.
   2. Implement the health check logic to return a 200 OK status.

   # 4. Execute the first step and review the proposed changes
   >>> /run

   # 5. If the changes look good, approve them
   >>> /approve
   ```
Agent Commands (TUI)
| Command | Description |
|---|---|
| `/task <goal>` | Sets the high-level goal for the agent. |
| `/plan [management]` | Generates a plan, or opens an interactive TUI to edit/add/remove steps. |
| `/run [all]` | Executes the next step, or all remaining steps with `/run all`. |
| `/approve` | Interactively select and apply changes from the last run. |
| `/diff [all \| file]` | Shows the full diff for the proposed changes. |
| `/retry <feedback>` | Retries the last step with new feedback. |
| `/skip` | Skips the current step and moves to the next. |
| `/revert` | Reverts the changes from the last `/approve`. |
| `/context <scope>` | Replaces the context with files from a scope. |
| `/scopes` | Opens an interactive menu to manage your saved scopes. |
| `/ask <question>` | Asks a question about the plan or code context. |
| `/refine <feedback>` | Refines the plan based on new feedback or ideas. |
| `/show [state]` | Shows the current state (goal, plan, context, history, step). |
| `/settings` | Configures the model and API keys. |
| `/help` | Shows the detailed help message. |
| `/exit` | Exits the agent session. |
Headless Mode Flags
For scripting or single-shot edits, you can still use the original flags.
| Flag | Alias | Description |
|---|---|---|
| `--patch` | `-p` | Main action: query the LLM and apply file changes. |
| `--task` | `-t` | Provide a specific instruction to the LLM. |
| `--scope` | `-s` | Use a static scope from scopes.py or a dynamic one. |
| `--recipe` | `-r` | Use a predefined task from recipes.py. |
| `--interactive` | `-in` | Interactively build the context by selecting files. |
| `--init` | `-i` | Create a new scopes.py file. |
| `--list-scopes` | `-sl` | List all available scopes. |
| `--from-file` | `-ff` | Apply patches from a local file. |
| `--from-clipboard` | `-fc` | Apply patches from the system clipboard. |
| `--model` | `-m` | Specify a different model (default: gemini/gemini-1.5-flash). |
| `--voice` | `-v` | Enable voice interaction (requires voice dependencies). |
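Combining the flags above, a single-shot edit might look like the following (the task text and scope value are illustrative, not from this page):

```bash
# Query the model and apply the resulting patch, scoped to files in src/.
# Uses the documented -p, -t, and -s flags; the task and scope are examples.
patchllm -p -t "Add a health check endpoint to the API" -s @dir:src
```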
Setup
PatchLLM uses LiteLLM. Set up your API keys (e.g., OPENAI_API_KEY, GEMINI_API_KEY) in a .env file.
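Since requests are routed through LiteLLM, the key names follow LiteLLM's provider conventions; a minimal .env might look like this (only the provider you actually use needs a key):

```bash
# .env -- placeholder values; supply keys only for the providers you use.
GEMINI_API_KEY=your-gemini-key
OPENAI_API_KEY=your-openai-key
```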
The interactive TUI requires prompt_toolkit and InquirerPy. You can install all core dependencies with:

```bash
pip install -r requirements.txt
```
Optional features require extra dependencies:

```bash
# For URL support in scopes
pip install "patchllm[url]"

# For voice commands (in headless mode)
pip install "patchllm[voice]"
```
License
This project is licensed under the MIT License.
Download files
File details
Details for the file patchllm-1.0.0.tar.gz.

File metadata
- Download URL: patchllm-1.0.0.tar.gz
- Upload date:
- Size: 52.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.11

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | f8fee1a735d139a516fd1da689096f920cfa1da8cd4f4c69eb15885efa1e2b75 |
| MD5 | c55f758c149841652f873536db8aabf8 |
| BLAKE2b-256 | 912c0da8452a88c04b94347c7f4bc0ef070caa1c7464941b733c0d367b3c0ee0 |
File details
Details for the file patchllm-1.0.0-py3-none-any.whl.

File metadata
- Download URL: patchllm-1.0.0-py3-none-any.whl
- Upload date:
- Size: 64.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.11

File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 45409cfbfb02144ef4e7cf925ff088c3edbd572d5fd344d2d9bdeeb0e20821a9 |
| MD5 | 1227a8d0d7a069b1d1fa138aa442731d |
| BLAKE2b-256 | 3e56334358140ebaf97c3714907298a59a06f8456d4cc063f939397fbd65dd92 |