
Terminal client for OpenAI chat completions

Project description

bsy-clippy

bsy-clippy is a lightweight Python client for the OpenAI Chat Completions API (and compatible deployments).

It supports both batch (stdin) mode for one-shot prompts and interactive mode for chatting directly in the terminal.
You can also load system prompts from a file to guide the LLM’s behavior.


Features

  • Speaks to the OpenAI Chat Completions API (or any compatible base URL).
  • Loads credentials from .env (OPENAI_API_KEY) using python-dotenv.
  • Reads defaults (profile, base URL, IP/port overrides, model) from bsy-clippy.yaml.
  • Toggle endpoints by editing api.profile in bsy-clippy.yaml or passing --profile on the CLI.
  • Defaults to:
    • Base URL: http://172.20.0.100:11434/v1 (profile ollama)
    • Model: qwen3:1.7b
    • Mode: stream (see --mode to switch)
    • Bundled system prompt file that can be overridden with --system-file
  • Configurable parameters:
    • -b / --base-url → explicit API endpoint
    • -i / --ip and -p / --port → override host/port when targeting compatible servers
    • -M / --model → model name
    • -m / --mode → output mode (stream or batch)
    • -t / --temperature → sampling temperature (default: 0.7)
    • -s / --system-file → path to a text file with system instructions
    • -u / --user-prompt → extra user instructions prepended before the data payload
    • -r / --memory-lines → number of conversation lines to remember in interactive mode
    • -c / --chat-after-stdin → process stdin once, then drop into interactive chat
  • Two modes of operation:
    • Batch mode → waits until the answer is complete, then prints only the final result.
    • Stream mode (default) → shows the response in real time; tokens appear as they are generated.
  • Colored terminal output:
    • Yellow = streaming tokens (the model’s “thinking” in progress).
    • Default terminal color = final assembled answer.
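The two output modes above differ only in when text reaches the terminal. A minimal sketch of the distinction, using a stand-in token generator (the `fake_tokens` helper and the ANSI color handling here are illustrative assumptions, not the tool's actual internals):

```python
import sys

YELLOW = "\033[33m"
RESET = "\033[0m"

def fake_tokens():
    # Stand-in for the chunks an OpenAI-compatible streaming response yields.
    yield from ["Hello", "!", " How", " can", " I", " help", "?"]

def run_stream(tokens):
    """Stream mode: print each token in yellow the moment it arrives."""
    parts = []
    for tok in tokens:
        sys.stdout.write(YELLOW + tok + RESET)
        sys.stdout.flush()
        parts.append(tok)
    sys.stdout.write("\n")
    return "".join(parts)

def run_batch(tokens):
    """Batch mode: collect everything first, then print only the final answer."""
    answer = "".join(tokens)
    print(answer)
    return answer
```

Both paths end with the same assembled answer; stream mode just surfaces it incrementally.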

Installation

pipx (recommended)

pipx install .

After updating the source, reinstall with pipx reinstall bsy-clippy.

pip / virtual environments

pip install .

Configuration

API credentials (.env)

Create a .env file next to where you run bsy-clippy and add your key:

OPENAI_API_KEY=sk-...

The CLI loads this automatically via python-dotenv; environment variables from your shell work too.
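Under the hood, python-dotenv simply reads KEY=VALUE pairs from the file into the process environment. A rough stdlib-only approximation of that behavior (the real library also handles quoting, `export` prefixes, and interpolation):

```python
import os

def load_env_file(path=".env"):
    """Minimal stand-in for dotenv's load_dotenv(): parse KEY=VALUE lines."""
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                # Like load_dotenv's default, don't clobber existing variables.
                os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # No .env file is fine; shell environment variables still apply.
```

This is why variables already set in your shell take precedence over the file.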

YAML defaults (bsy-clippy.yaml)

bsy-clippy.yaml selects which profile to use and what settings belong to it. The packaged example ships with an Ollama profile enabled and an OpenAI profile commented out for reference:

api:
  profile: ollama
  profiles:
    ollama:
      base_url: http://172.20.0.100:11434/v1
      model: qwen3:1.7b
    # openai:
    #   base_url: https://api.openai.com/v1
    #   model: gpt-4o-mini

Change profile (or pass --profile openai) to switch endpoints, or add more entries under profiles for additional deployments.
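Conceptually, profile resolution is just a lookup into the parsed YAML, with the CLI flag winning over the file. A sketch using a plain dict in place of the parsed config (`resolve_profile` is an illustrative helper, not the tool's actual code):

```python
# Dict standing in for the parsed contents of bsy-clippy.yaml.
CONFIG = {
    "api": {
        "profile": "ollama",
        "profiles": {
            "ollama": {"base_url": "http://172.20.0.100:11434/v1",
                       "model": "qwen3:1.7b"},
            "openai": {"base_url": "https://api.openai.com/v1",
                       "model": "gpt-4o-mini"},
        },
    }
}

def resolve_profile(config, cli_profile=None):
    """Pick the profile: --profile on the CLI overrides api.profile in YAML."""
    api = config["api"]
    name = cli_profile or api["profile"]
    return name, api["profiles"][name]
```

Adding a deployment is then just another entry under `profiles`.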

Usage

System prompt file

By default, bsy-clippy loads a bundled prompt (Be very brief. Be very short.).
You can change this with --system-file or disable it via --no-default-system.

Example bsy-clippy.txt:

You are a helpful assistant specialized in cybersecurity.
Always explain your reasoning clearly, and avoid unnecessary markdown formatting.

These lines will be sent to the LLM before every user prompt.

User prompt parameter

Use --user-prompt "Classify the following log:" when piping data so the model receives:

  1. the system prompt (if any)
  2. the user prompt text
  3. the data from stdin or interactive input
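The resulting Chat Completions payload can be pictured as a messages list built in exactly that order. A sketch of the idea (`build_messages` is an illustrative helper, not the tool's actual code):

```python
def build_messages(system_prompt, user_prompt, stdin_data):
    """Assemble the messages list: system first, then one combined user turn."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    # The --user-prompt text is prepended before the piped data.
    user_content = "\n".join(p for p in (user_prompt, stdin_data) if p)
    messages.append({"role": "user", "content": user_content})
    return messages
```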

Interactive memory

Set --memory-lines 6 (or -r 6) to keep the last six conversation lines (user + assistant) while chatting.
Only the final assistant reply (not the thinking traces) is stored and sent back on the next turn.
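One way to picture this rolling memory is a bounded deque of (role, text) lines, where old lines fall off the front once the limit is reached. This is a sketch of the concept, not the actual implementation:

```python
from collections import deque

class RollingMemory:
    """Keep only the last N conversation lines, as with --memory-lines N."""

    def __init__(self, max_lines):
        self.lines = deque(maxlen=max_lines)

    def add(self, role, text):
        # Only the final assistant reply should be added, not thinking traces.
        self.lines.append((role, text))

    def as_messages(self):
        """Return the remembered lines in Chat Completions message format."""
        return [{"role": r, "content": t} for r, t in self.lines]
```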

Chat after stdin

Use -c / --chat-after-stdin to process piped data first and then remain in interactive mode with the response (and any configured memory) available:

cat sample.txt | bsy-clippy -u "Summarize this report" -r 6 -c

After the initial answer prints, you can continue the conversation while the tool remembers the piped data and the model’s reply.


Interactive mode (default = stream)

Run without piping input:

bsy-clippy

A streaming session looks like this:

You: Hello!
LLM (thinking): <think>
Reasoning step by step...
</think>
Hello! How can I assist you today? 😊
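With models that emit <think> blocks (such as qwen3), only the text after the closing tag is the final answer. A regex-based sketch of separating the two (a hedged illustration; the actual tool may do this differently):

```python
import re

def split_thinking(raw):
    """Split a reply into (thinking, answer), dropping the <think> wrapper."""
    match = re.search(r"<think>(.*?)</think>", raw, re.DOTALL)
    if not match:
        return "", raw.strip()
    thinking = match.group(1).strip()
    answer = raw[match.end():].strip()
    return thinking, answer
```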

Prefer a single print at the end? Switch to batch mode:

bsy-clippy --mode batch

Batch output:

You: Hello!
Hello! How can I assist you today? 😊

Batch mode (stdin)

Pipe input directly:

echo "Tell me a joke" | bsy-clippy

Output:

Why don’t scientists trust atoms? Because they make up everything!

Forcing modes

bsy-clippy --mode batch
bsy-clippy --mode stream

Adjusting temperature

bsy-clippy --temperature 0.2
bsy-clippy --temperature 1.2

Custom server and model

bsy-clippy --base-url http://127.0.0.1:11434/v1 --model llama2
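For reference, what the client ultimately sends is a plain Chat Completions POST against `<base-url>/chat/completions`. A sketch of the equivalent URL and JSON body for the command above (constructed but not sent here; the helper is an assumption based on the OpenAI API shape, not the tool's internals):

```python
import json

def chat_request(base_url, model, messages, temperature=0.7, stream=True):
    """Build the URL and JSON body for a Chat Completions call."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "stream": stream,
    }
    return url, json.dumps(body)

url, body = chat_request(
    "http://127.0.0.1:11434/v1",
    "llama2",
    [{"role": "user", "content": "Hello!"}],
)
```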

Requirements

See requirements.txt.

Download files

Download the file for your platform.

Source Distribution

bsy_clippy-0.2.0.tar.gz (18.2 kB view details)


Built Distribution


bsy_clippy-0.2.0-py3-none-any.whl (16.7 kB view details)


File details

Details for the file bsy_clippy-0.2.0.tar.gz.

File metadata

  • Download URL: bsy_clippy-0.2.0.tar.gz
  • Upload date:
  • Size: 18.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for bsy_clippy-0.2.0.tar.gz:

  • SHA256: aab4da909e2d9f6836be7ea84aaddb5d96165502f2b8586343d90bbc777aa684
  • MD5: e999c5fe76cda753b073e29f822efcd0
  • BLAKE2b-256: 5616941363f822944dfcb20997da94bb1cc830ef45260e9ece43f7eec947fd60


File details

Details for the file bsy_clippy-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: bsy_clippy-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 16.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for bsy_clippy-0.2.0-py3-none-any.whl:

  • SHA256: 4b72ae04ff5140e88c36e2017982111b0548bc2d6ee9ac0b867b569273ae9017
  • MD5: 60939af5879e5884dba5108605e94c3b
  • BLAKE2b-256: c03a8df314ad58c545f149d6cf5dddf425ba1c09832c6bd3c3e8546b9b1b3182

