# pipe2prompt
A command-line tool that lets you pipe input into customizable AI prompts.
## Example Usage

```shell
# Generate a commit message from staged changes
git diff --staged | p2p create-commit-message
```

This takes the output of `git diff --staged` and pipes it into the `p2p` tool, which generates a commit message based on the changes.
## Example Output

```
refactor: reorder imports in cli.py and update flake8 select list

- Reorganize the import order in `pipe2prompt/cli.py` for better readability.
- Remove the `isort` ("I") code from the flake8 select list in `pyproject.toml`.
```
## Installation

To install the package with pip:

```shell
pip install pipe2prompt
```

To install from source:

```shell
git clone https://github.com/digsy89/pipe2prompt
cd pipe2prompt
pip install --user .
```

This will:

- Install the package locally with pip
- Set up shell completion
- Create an initial config file at `~/.p2p/config.toml`
## OpenAI API Key

This tool requires an OpenAI API key to function. Set the `OPENAI_API_KEY` environment variable:

```shell
export OPENAI_API_KEY=your-api-key-here
```
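As a sketch of how a CLI like this can validate the key before making any request (`read_api_key` is a hypothetical helper for illustration, not part of p2p's actual API):

```python
import os

def read_api_key(env=os.environ) -> str:
    """Return the OpenAI API key from an environment mapping.

    Hypothetical helper: raises early with a clear message instead of
    failing later inside an API call.
    """
    key = env.get("OPENAI_API_KEY", "").strip()
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set; export it in your shell")
    return key
```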
## Shell Completion

To manually set up completion:

```shell
p2p init
```

This will:

- Create completion scripts in your shell's completion directory
- Add source commands to your shell config file
- Enable tab completion for `p2p` commands and prompts

Supported shells:

- Bash (`~/.bash_completion.d/_p2p`)
- Zsh (`~/.zsh/completions/_p2p`)
- Fish (`~/.config/fish/completions/_p2p`)
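The per-shell locations above can be collected in a single table; the mapping below is only an illustration of the paths the README lists, not code from p2p itself:

```python
from pathlib import Path

# Completion script locations per shell, as listed above.
COMPLETION_PATHS = {
    "bash": Path("~/.bash_completion.d/_p2p").expanduser(),
    "zsh": Path("~/.zsh/completions/_p2p").expanduser(),
    "fish": Path("~/.config/fish/completions/_p2p").expanduser(),
}
```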
After installation, you may need to restart your shell or source your config file:

```shell
# For bash
source ~/.bashrc

# For zsh
source ~/.zshrc

# For fish
source ~/.config/fish/config.fish
```
## Usage

Basic usage:

```shell
# Run a prompt directly
p2p <prompt-name> "your input"

# Pipe input into a prompt
echo "your input" | p2p <prompt-name>

# List available prompts
p2p prompt list
p2p prompt list --long
```
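Both invocation styles reduce to one rule: use the trailing argument if one is given, otherwise read whatever was piped on stdin. A minimal sketch of that dispatch (hypothetical, not p2p's actual implementation):

```python
import io
import sys

def resolve_input(args: list[str], stdin=sys.stdin) -> str:
    """Return the prompt input: a trailing CLI argument if present,
    otherwise the piped stdin contents. Hypothetical sketch."""
    if args:
        return args[0]
    return stdin.read()
```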
## Prompt Configuration

Prompts are configured in `~/.p2p/config.toml`. Example configuration:

```toml
[explain]
content = "Explain this code: {pipe}"
base_model = "gpt-3.5-turbo"
description = "Explain code"

[fix]
content = "Fix this code and explain the issues: {pipe}"
base_model = "gpt-3.5-turbo"
description = "Fix code issues"
```
Each prompt requires:

- `content`: the prompt template; use `{pipe}` to reference piped input
- `base_model`: the OpenAI model to use
- `description`: description shown in help text

Each prompt also accepts an optional `enabled` boolean to enable or disable it.
You can find example configurations in the project’s [pipe2prompt/config.toml](pipe2prompt/config.toml) file for reference.
## License

MIT
## File details

Details for the file `pipe2prompt-0.1.1.tar.gz`.

File metadata:

- Download URL: pipe2prompt-0.1.1.tar.gz
- Upload date:
- Size: 7.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.11.11

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | cff2a0b6f3128359361a258732da390b5ca967ba1f9e207174c8fdb0c763c841 |
| MD5 | 35949b69ac29f334846ee9ec627182a2 |
| BLAKE2b-256 | 040c5b6edb861ac71d35395683d1d891784776d6ef9401b78ddec381bc061591 |
## File details

Details for the file `pipe2prompt-0.1.1-py3-none-any.whl`.

File metadata:

- Download URL: pipe2prompt-0.1.1-py3-none-any.whl
- Upload date:
- Size: 8.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.11.11

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | bd6e1cbf38ec9c03811cc7bc490461038be88940fa57955f9bff4c4539825f59 |
| MD5 | 636c7a207720b771a99ef08cbe51e63d |
| BLAKE2b-256 | f7ba2486b92a7b132e08cb038b9e4276718a30cb67fbc98afb92ca3e1d46b24e |