bsy-clippy
Terminal client for interacting with an Ollama server.
bsy-clippy is a lightweight Python client for interacting with an Ollama server.
It supports both batch (stdin) mode for one-shot prompts and interactive mode for chatting directly in the terminal.
You can also load system prompts from a file to guide the LLM’s behavior.
Features
- Connects to the Ollama API over HTTP (/api/generate).
- Defaults to:
  - IP: 172.20.0.100
  - Port: 11434
  - Model: qwen3:1.7b
  - Mode: batch (wait for full output)
- Bundled system prompt file that can be overridden with --system-file
- Configurable parameters:
  - -i/--ip → Ollama server IP
  - -p/--port → Ollama server port
  - -M/--model → model name
  - -m/--mode → output mode (stream or batch)
  - -t/--temperature → sampling temperature (default: 0.7)
  - -s/--system-file → path to a text file with system instructions
  - -u/--user-prompt → extra user instructions prepended before the data payload
  - -r/--memory-lines → number of conversation lines to remember in interactive mode
  - -c/--chat-after-stdin → process stdin once, then drop into interactive chat
- Two modes of operation:
- Batch mode (default) → waits until the answer is complete, then prints only the final result.
- Stream mode → shows response in real-time, tokens appear as they are generated.
- Colored terminal output:
- Yellow = streaming tokens (the model’s “thinking” in progress).
- Default terminal color = final assembled answer.
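As a rough illustration of how such a client talks to the /api/generate endpoint, here is a minimal sketch using only the standard library. This is not the actual bsy-clippy source; the function names and the hard-coded defaults (taken from the list above) are illustrative.

```python
import json
import urllib.request

# Default server from the feature list above (illustrative constant).
OLLAMA_URL = "http://172.20.0.100:11434/api/generate"

def build_payload(model, prompt, temperature=0.7, stream=False, system=None):
    """Assemble the JSON body that Ollama's /api/generate endpoint expects."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": stream,
        "options": {"temperature": temperature},
    }
    if system:
        payload["system"] = system
    return payload

def generate_batch(prompt, model="qwen3:1.7b", url=OLLAMA_URL):
    """Send one prompt and wait for the complete response (batch mode)."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

In batch mode the server returns a single JSON object whose `response` field holds the full answer; stream mode instead returns one JSON object per token (see the streaming example further below).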
Installation
pipx (recommended)
pipx install .
After updating the source, reinstall with pipx reinstall bsy-clippy.
pip / virtual environments
pip install .
Usage
System prompt file
By default, bsy-clippy loads a bundled prompt (Be very brief. Be very short.).
You can change this with --system-file or disable it via --no-default-system.
Example bsy-clippy.txt:
You are a helpful assistant specialized in cybersecurity.
Always explain your reasoning clearly, and avoid unnecessary markdown formatting.
These lines will be sent to the LLM before every user prompt.
User prompt parameter
Use --user-prompt "Classify the following log:" when piping data so the model receives:
- system prompt (if any)
- user prompt text
- data from stdin or interactive input
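The stacking order above can be sketched as a small helper. This is a simplified illustration, not the client's actual code (in practice the system prompt may be sent via a dedicated API field rather than concatenated):

```python
def assemble_prompt(data, user_prompt=None, system_prompt=None):
    """Stack the pieces in the documented order: system, user prompt, data."""
    parts = []
    if system_prompt:
        parts.append(system_prompt)
    if user_prompt:
        parts.append(user_prompt)
    parts.append(data)
    return "\n".join(parts)

# e.g. assemble_prompt(log_line, "Classify the following log:", "Be very brief.")
```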
Interactive memory
Set --memory-lines 6 (or -r 6) to keep the last six conversation lines (user + assistant) while chatting.
Only the final assistant reply (not the thinking traces) is stored and sent back on the next turn.
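Conceptually, this rolling memory behaves like a bounded queue that silently discards the oldest lines. A minimal sketch (illustrative, not the actual implementation):

```python
from collections import deque

class ChatMemory:
    """Keep only the last N conversation lines (user + assistant)."""

    def __init__(self, max_lines):
        # deque with maxlen drops the oldest entry once the limit is reached
        self.lines = deque(maxlen=max_lines)

    def add(self, role, text):
        self.lines.append(f"{role}: {text}")

    def context(self):
        """Join the remembered lines to prepend to the next request."""
        return "\n".join(self.lines)
```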
Chat after stdin
Use -c / --chat-after-stdin to process piped data first and then remain in interactive mode with the response (and any configured memory) available:
cat sample.txt | bsy-clippy -u "Summarize this report" -r 6 -c
After the initial answer prints, you can continue the conversation while the tool remembers the piped data and the model’s reply.
Interactive mode (default = batch)
Run without piping input:
bsy-clippy
Example session in batch mode:
You: Hello!
Hello! How can I assist you today? 😊
To force streaming mode:
bsy-clippy --mode stream
A streaming session looks like:
You: Hello!
LLM (thinking): <think>
Reasoning step by step...
</think>
Hello! How can I assist you today? 😊
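Under the hood, stream mode consumes Ollama's newline-delimited JSON response, where each line carries a `response` fragment and a `done` flag. A hedged sketch of the parsing loop (the generator name and the fake input are illustrative; real lines come from the HTTP response body):

```python
import json

def stream_tokens(ndjson_lines):
    """Yield text chunks from a streaming NDJSON response, stopping at done."""
    for raw in ndjson_lines:
        chunk = json.loads(raw)
        if chunk.get("response"):
            yield chunk["response"]  # print these in yellow as they arrive
        if chunk.get("done"):
            break

# Simulated stream for demonstration:
fake = [
    '{"response": "Hello", "done": false}',
    '{"response": "!", "done": true}',
]
```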
Batch mode (stdin)
Pipe input directly:
echo "Tell me a joke" | bsy-clippy
Output:
Why don’t scientists trust atoms? Because they make up everything!
Forcing modes
bsy-clippy --mode batch
bsy-clippy --mode stream
Adjusting temperature
bsy-clippy --temperature 0.2
bsy-clippy --temperature 1.2
Custom server and model
bsy-clippy --ip 127.0.0.1 --port 11434 --model llama2
Requirements
See requirements.txt.
File details
Details for the file bsy_clippy-0.1.0.tar.gz.
File metadata
- Download URL: bsy_clippy-0.1.0.tar.gz
- Upload date:
- Size: 15.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | fcf7188e0d718bb755a50035686609024679cdc3058a3f2491fae4c5833b1334 |
| MD5 | bc889db16af281cd6b67c7dda9f80e5b |
| BLAKE2b-256 | 46d76a2c0d06cef251ef3da18828eb056b51ba31e163c72e2b53ec197e76f8c9 |
File details
Details for the file bsy_clippy-0.1.0-py3-none-any.whl.
File metadata
- Download URL: bsy_clippy-0.1.0-py3-none-any.whl
- Upload date:
- Size: 14.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 5980bfed535f18331eb7573269bb020fdd88b5b0b0b9832a2b804a2b1bdc22c4 |
| MD5 | a64d3c3e67b7e62708428de4b4fdaf54 |
| BLAKE2b-256 | 18e4e6c520207475d099024ad8fb196a1e8c50874900186d56f10964dc414fff |