# bsy-clippy

Terminal client for OpenAI chat completions.
bsy-clippy is a lightweight Python client for the OpenAI Chat Completions API (and compatible deployments).
It supports both batch (stdin) mode for one-shot prompts and interactive mode for chatting directly in the terminal.
You can also load system prompts from a file to guide the LLM’s behavior.
## Features

- Speaks to the OpenAI Chat Completions API or any OpenAI-compatible deployment.
- Loads credentials from `.env` (`OPENAI_API_KEY`) via `python-dotenv` when the selected profile requires them.
- Reads defaults (profile, provider, base URL, model) from `bsy-clippy.yaml` and falls back to a packaged sample if none is found.
- Switches profiles by editing `api.profile` or passing `--profile` (e.g. `--profile openai`) on the CLI.
- Defaults to:
  - Provider: `ollama`
  - Base URL: `http://172.20.0.100:11434`
  - Model: `qwen3:1.7b`
  - Mode: `stream` (see `--mode` to switch)
- Bundled system prompt file that can be overridden with `--system-file`.
- Configurable parameters:
  - `--config` / `--profile` → select a profile file and profile name
  - `-b`/`--base-url`, `-i`/`--ip`, `-p`/`--port` → override connection details per run
  - `-M`/`--model`, `-t`/`--temperature`, `-m`/`--mode` → tuning controls
  - `-s`/`--system-file`, `-u`/`--user-prompt` → additional prompting knobs
  - `-r`/`--memory-lines`, `-c`/`--chat-after-stdin` → conversation persistence options
- Two modes of operation:
  - Batch mode → waits until the answer is complete, then prints only the final result.
  - Stream mode (default) → shows the response in real time; tokens appear as they are generated.
- Colored terminal output:
  - Yellow = streaming tokens (the model’s “thinking” in progress).
  - Default terminal color = final assembled answer.
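The behavioral difference between the two modes can be sketched with a toy token stream; the generator below merely simulates a server's chunked response and is not bsy-clippy's actual code:

```python
import sys

def fake_token_stream():
    # Stand-in for the chunked tokens an OpenAI-compatible
    # /chat/completions endpoint emits when streaming is enabled.
    yield from ["Hello", "!", " How", " can", " I", " help", "?"]

def run(mode):
    if mode == "stream":
        # Stream mode: write each token the moment it arrives.
        for tok in fake_token_stream():
            sys.stdout.write(tok)
            sys.stdout.flush()
        sys.stdout.write("\n")
    else:
        # Batch mode: assemble silently, print once at the end.
        print("".join(fake_token_stream()))

run("stream")
run("batch")
```

Both modes produce the same final text; only the timing of the output differs.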
## Installation

### pipx (recommended)

```shell
pipx install .
```

After updating the source, reinstall with `pipx reinstall bsy-clippy`.

### pip / virtual environments

```shell
pip install .
```
## Configuration

### API credentials (`.env`)

Create a `.env` file next to where you run `bsy-clippy` and add your key:

```shell
OPENAI_API_KEY=sk-...
```

The CLI loads this automatically via `python-dotenv`; environment variables from your shell work too. Only the `openai` profile requires this token; `ollama` profiles can leave it unset.
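For illustration, here is a stdlib-only sketch of what that loading step does (the real CLI uses `python-dotenv`; the key below is a placeholder, not a real credential):

```python
import os
import pathlib
import tempfile

def load_env_file(path):
    """Minimal sketch of python-dotenv's load_dotenv behavior:
    read KEY=VALUE lines into os.environ without overwriting
    variables that are already set. Returns the parsed pairs."""
    parsed = {}
    for line in pathlib.Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        parsed[key.strip()] = value.strip()
        os.environ.setdefault(key.strip(), value.strip())
    return parsed

# Demo with a throwaway .env file and a placeholder key.
with tempfile.TemporaryDirectory() as d:
    env_path = pathlib.Path(d) / ".env"
    env_path.write_text("# comment\nOPENAI_API_KEY=sk-placeholder\n")
    pairs = load_env_file(env_path)
    print(pairs["OPENAI_API_KEY"])
```

Because `setdefault` is used, a key already exported in your shell wins over the `.env` value, matching the note above that shell environment variables work too.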
### YAML defaults (`bsy-clippy.yaml`)

`bsy-clippy.yaml` selects which profile to use and what settings belong to it. The CLI looks for this file in the current working directory, then in `~/.config/bsy-clippy/`, and finally falls back to the bundled sample. Copy the sample to your project or config directory, edit it, and switch profiles as needed:

```yaml
api:
  profile: ollama
  profiles:
    ollama:
      provider: ollama
      base_url: http://172.20.0.100:11434
      model: qwen3:1.7b
    # openai:
    #   provider: openai
    #   base_url: https://api.openai.com/v1
    #   model: gpt-4o-mini
    #   api_key_env: OPENAI_API_KEY
```
### Switching profiles

- Keep `profile: ollama` (the default) to talk to an Ollama server; no API token is required. Adjust `base_url` if your host or port differs.
- To use OpenAI, either set `profile: openai` in the file or pass `--profile openai` on the command line. Make sure the `OPENAI_API_KEY` environment variable (or the name set in `api_key_env`) is populated before running the command.
- You can define additional profiles (for example, staging clusters) under `api.profiles` and select them with `--profile <name>`.
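For example, a hypothetical `staging` profile (the name, host, and model below are purely illustrative) would sit alongside the defaults like this:

```yaml
api:
  profile: staging          # make it the default, or select it with --profile staging
  profiles:
    staging:
      provider: ollama
      base_url: http://staging.example.internal:11434
      model: qwen3:1.7b
```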
### Quick check

```shell
pip install -r requirements.txt   # or pipx install .
cp src/bsy_clippy/data/bsy-clippy.yaml ./bsy-clippy.yaml
python bsy-clippy.py --help       # confirms dependencies are in place
```

After copying the sample config you can edit it in place and re-run `bsy-clippy` to target a different profile.

If you installed via pipx or pip, copy the bundled sample with:

```shell
python -c "import importlib.resources as r, pathlib; pathlib.Path('bsy-clippy.yaml').write_text(r.files('bsy_clippy').joinpath('data/bsy-clippy.yaml').read_text())"
```
## Usage

### System prompt file

By default, `bsy-clippy` loads a bundled prompt (`Be very brief. Be very short.`). You can change this with `--system-file` or disable it via `--no-default-system`.

Example `bsy-clippy.txt`:

```text
You are a helpful assistant specialized in cybersecurity.
Always explain your reasoning clearly, and avoid unnecessary markdown formatting.
```

These lines are sent to the LLM before every user prompt.
### User prompt parameter

Use `--user-prompt "Classify the following log:"` when piping data so the model receives:

1. the system prompt (if any)
2. the user prompt text
3. data from stdin or interactive input
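A sketch of how such a request payload is plausibly assembled in the Chat Completions message format (the exact internals of bsy-clippy may differ; the log line is made up):

```python
def build_messages(system_prompt, user_prompt, piped_data):
    """Assemble Chat Completions messages: an optional system turn,
    then one user turn combining the prompt text and piped data."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    # The --user-prompt text precedes whatever arrived on stdin.
    content = "\n".join(part for part in (user_prompt, piped_data) if part)
    messages.append({"role": "user", "content": content})
    return messages

msgs = build_messages(
    "Be very brief. Be very short.",
    "Classify the following log:",
    "Jan 01 sshd[123]: Failed password for root",
)
print(len(msgs))  # 2
```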
### Interactive memory

Set `--memory-lines 6` (or `-r 6`) to keep the last six conversation lines (user + assistant) while chatting. Only the final assistant reply (not the thinking traces) is stored and sent back on the next turn.
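This kind of rolling window is naturally modeled with a bounded deque; the sketch below illustrates the idea (it is not the tool's actual implementation):

```python
from collections import deque

# Bounded conversation memory: once full, appending a new line
# silently evicts the oldest one, mirroring --memory-lines 6.
memory = deque(maxlen=6)

def remember(role, text):
    memory.append((role, text))

for i in range(5):
    remember("user", f"question {i}")
    remember("assistant", f"answer {i}")

# Only the six most recent (role, text) pairs survive.
print(list(memory))
```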
### Chat after stdin

Use `-c` / `--chat-after-stdin` to process piped data first and then remain in interactive mode with the response (and any configured memory) available:

```shell
cat sample.txt | bsy-clippy -u "Summarize this report" -r 6 -c
```

After the initial answer prints, you can continue the conversation while the tool remembers the piped data and the model’s reply.
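A plausible sketch of the dispatch logic behind this flag (hypothetical, not the tool's actual source): piped stdin triggers batch processing first, and `-c` decides whether an interactive session follows.

```python
import sys

def choose_entry_mode(stdin_is_tty, chat_after_stdin):
    """Hypothetical dispatch: pick the run mode from whether stdin
    is a terminal and whether -c/--chat-after-stdin was given."""
    if stdin_is_tty:
        return "interactive"
    return "stdin-then-chat" if chat_after_stdin else "stdin-only"

# In a real CLI the first argument would come from sys.stdin.isatty().
print(choose_entry_mode(sys.stdin.isatty(), chat_after_stdin=True))
```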
### Interactive mode (default = stream)

Run without piping input:

```shell
bsy-clippy
```

A streaming session looks like:

```text
You: Hello!
LLM (thinking): <think>
Reasoning step by step...
</think>
Hello! How can I assist you today? 😊
```

Prefer a single print at the end? Switch to batch mode:

```shell
bsy-clippy --mode batch
```

Batch output:

```text
You: Hello!
Hello! How can I assist you today? 😊
```
### Batch mode (stdin)

Pipe input directly:

```shell
echo "Tell me a joke" | bsy-clippy
```

Output:

```text
Why don’t scientists trust atoms? Because they make up everything!
```
### Forcing modes

```shell
bsy-clippy --mode batch
bsy-clippy --mode stream
```
### Adjusting temperature

```shell
bsy-clippy --temperature 0.2   # lower = more focused, deterministic output
bsy-clippy --temperature 1.2   # higher = more varied, creative output
```
### Custom server and model

```shell
bsy-clippy --profile ollama --base-url http://127.0.0.1:11434 --model llama2
```
## Requirements

See `requirements.txt`.