# qork

A simple CLI for interacting with OpenAI models.
A simple, beautiful CLI for asking LLMs questions from your terminal. Fast defaults, clean output, and per-shell conversational context.
## Highlights

- Default backend: OpenAI Responses API (non-streaming), zero config beyond your API key
- Optional Chat Completions path with streaming
- Per-shell state: remembers the last conversation id only for the current shell session
- Plaintext or pretty Rich output
## Install

```bash
pip install qork
```
## Prerequisites

- Environment: `OPENAI_API_KEY` must be set
- Models: you can pass `-m/--model` at call time; otherwise defaults apply (see below)
## Quick start

- Default (Responses API):

  ```bash
  qork "Say hello in one short sentence."
  ```

- Responses API with explicit model:

  ```bash
  qork -m gpt-5-mini "Give me a five-word poem."
  ```

- Chat Completions (streaming by default):

  ```bash
  qork --chat "List 3 colors."
  ```

- Chat non-streaming:

  ```bash
  qork --chat --no-stream "Summarize: qork is..."
  ```

- Plaintext output (easier to copy/paste):

  ```bash
  qork -pt "Plain output please."
  qork --chat -pt "Also works for chat."
  ```

- Debug info (tokens/cost where available):

  ```bash
  qork -d "How many seconds in a day?"
  qork --chat -d --no-stream "Explain Big-O of binary search."
  ```
## Backends and defaults

- Responses API (default)
  - Selected unless you pass `--chat`
  - Non-streaming
  - Default model if not specified: `gpt-5-mini`
- Chat Completions (`--chat`)
  - Streaming by default; use `--no-stream` to disable
  - Model comes from `-m/--model` or from the `QORK_MODEL` env var, else falls back to the library default
## Per-shell session behavior (simple and automatic)

qork keeps only a single `previous_response_id` per shell session to thread your Responses API calls.

- A shell session is identified by TTY when available, otherwise by parent PID
- State file lives at: `~/.qork/sessions/<session_key>.json`
- File format:

  ```json
  {
    "previous_response_id": "resp_abc123",
    "updated_at": "2025-09-04T12:34:56Z"
  }
  ```

- New shell → new session file → no prior context
- Each `qork -r` (default path) uses the stored id (if any) and overwrites it with the latest id
- To reset: simply open a new shell, or delete the file in `~/.qork/sessions/`

Note: Session state is only used for the Responses API path. The Chat Completions path does not maintain threads.
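The bookkeeping described above can be sketched roughly as follows. This is a hypothetical reconstruction based solely on the behavior documented here, not qork's actual internals; the function names and chunking details are assumptions:

```python
import json
import os
from pathlib import Path
from typing import Optional

# Location documented above; not necessarily the constant name qork uses.
SESSIONS_DIR = Path.home() / ".qork" / "sessions"

def session_key() -> str:
    """Identify the shell session by TTY when available, else by parent PID."""
    try:
        # e.g. "/dev/ttys003" -> "dev_ttys003"
        return os.ttyname(0).replace("/", "_").lstrip("_")
    except OSError:
        return f"ppid_{os.getppid()}"

def load_previous_response_id() -> Optional[str]:
    """Read the stored id for this shell, or None for a fresh session."""
    path = SESSIONS_DIR / f"{session_key()}.json"
    if not path.exists():
        return None
    return json.loads(path.read_text()).get("previous_response_id")

def store_response_id(response_id: str, updated_at: str) -> None:
    """Overwrite the session file with the latest response id."""
    SESSIONS_DIR.mkdir(parents=True, exist_ok=True)
    path = SESSIONS_DIR / f"{session_key()}.json"
    path.write_text(json.dumps(
        {"previous_response_id": response_id, "updated_at": updated_at}
    ))
```

Because the key is derived from the TTY (or parent PID), a new terminal naturally gets a new file and therefore no prior context, which matches the reset behavior above.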
## CLI flags

| Flag | Description |
|---|---|
| `-m, --model` | Set model name |
| `-r, --responses` | Use Responses API (default) |
| `--chat` | Use Chat Completions backend |
| `-ns, --no-stream` | Disable streaming (chat only) |
| `-pt, --plaintext` | Plain stdout (no rich panels/markdown) |
| `-d, --debug` | Show token usage/cost when available |
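As an illustration, the flag set above could be modeled with `argparse`. This is a hypothetical reconstruction for clarity only; qork's real parser and defaults may differ:

```python
import argparse

# Hypothetical parser mirroring the flag table; not qork's actual code.
parser = argparse.ArgumentParser(prog="qork")
parser.add_argument("prompt", help="Question to send to the model")
parser.add_argument("-m", "--model", default=None)
parser.add_argument("-r", "--responses", action="store_true", default=True)
parser.add_argument("--chat", action="store_true")
parser.add_argument("-ns", "--no-stream", dest="no_stream", action="store_true")
parser.add_argument("-pt", "--plaintext", action="store_true")
parser.add_argument("-d", "--debug", action="store_true")

# Example invocation: qork --chat -pt "List 3 colors."
args = parser.parse_args(["--chat", "-pt", "List 3 colors."])
```

Each parsed flag then maps onto the corresponding keyword argument of the Python API described below.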
## Python API

Call from notebooks and scripts using the same behavior as the CLI.

```python
from qork.ask import ask

# Default pathway (Responses API)
text = ask("One short sentence.", responses=True, plaintext=True, return_text=True)

# Chat Completions (non-streaming)
text = ask("Explain merge sort in one paragraph.", stream=False, responses=False, return_text=True)

# Chat Completions (streaming)
text = ask("Print three facts.", stream=True, responses=False, return_text=True)
```

Parameters you'll likely use:

- `prompt: str` (required)
- `model: Optional[str]`
- `responses: bool` (True for Responses API; False for Chat)
- `stream: Optional[bool]` (Chat only; default True)
- `plaintext: bool` (stdout formatting)
- `debug: bool` (token/cost info)
- `return_text: bool` (return text value in addition to printing)
## Examples

- Continue a short thread in the same shell (Responses API is default):

  ```bash
  qork "Start a thread in one sentence."
  qork "Continue in one sentence."
  ```

- Switch to Chat Completions with streaming:

  ```bash
  qork --chat "Stream five words only."
  ```
## Tests (end-to-end)

These tests hit live APIs (no mocks). Set your key first.

```bash
export OPENAI_API_KEY=sk-...
pytest -q tests/test_e2e_cli.py::test_cli_responses_session_persistence
pytest -q tests/test_e2e_cli.py::test_cli_chat_non_stream_plaintext
pytest -q tests/test_e2e_cli.py::test_cli_chat_stream_plaintext
pytest -q tests/test_e2e_python_api.py
```

You can select a model for tests with `QORK_E2E_MODEL` or rely on defaults.
## Troubleshooting

- "API key not set": ensure `OPENAI_API_KEY` is exported in your shell
- No session carry-over: you likely opened a new shell (that's expected); check `~/.qork/sessions/`
- Switch backends: `--chat` for Chat Completions; Responses is the default

Designed for fast, accurate answers from the terminal with minimal ceremony.
## Project details
### File details

Details for the file `qork-0.0.10.tar.gz`.

#### File metadata

- Download URL: qork-0.0.10.tar.gz
- Upload date:
- Size: 13.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.11
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `f530bca82ed4fd34dbd2b4857ee8d3cd19154cb6fa814ecb7f131915248350b0` |
| MD5 | `5b0c1a4b9879675dbeb5954e0c0ffee5` |
| BLAKE2b-256 | `6b4786762c9f0557760192fb1d8250e6eec984dbc029953e2065e138c360331f` |
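To confirm that a downloaded artifact matches a published digest, you can hash it locally. Below is a generic sketch using Python's standard `hashlib`; the file path in the comment is a placeholder for wherever you saved the artifact:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash the file in chunks so large artifacts aren't loaded into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Example (placeholder path): compare against the SHA256 from the table above.
# expected = "f530bca82ed4fd34dbd2b4857ee8d3cd19154cb6fa814ecb7f131915248350b0"
# assert sha256_of(Path("qork-0.0.10.tar.gz")) == expected
```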
### File details

Details for the file `qork-0.0.10-py3-none-any.whl`.

#### File metadata

- Download URL: qork-0.0.10-py3-none-any.whl
- Upload date:
- Size: 11.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.11
#### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `3230143586cf182704364c9be8dbf826c39e98c7880160d68d51c4b624efad04` |
| MD5 | `ec6138e1f7862c6f143a537c1447726f` |
| BLAKE2b-256 | `467eca13516296bb4147c35102454c40b605688ee14d0c02c5a02ec0528e3f81` |