Tiny local coding CLI with a small tool surface
Project description
oy-cli
Small local AI coding CLI for your shell. It can inspect files, search content, fetch public docs, and run commands in the current workspace.
oy-cli is intentionally Open Responses-first: provider integrations are expected to support the Open Responses / OpenAI Responses API shape, and oy is optimized around that interface rather than chat-completions compatibility layers.
Quick start
uv tool install oy-cli
oy "add docstrings to public functions"
oy chat
oy audit "focus on authentication"
Common tasks
oy "inspect the main module and suggest improvements"
OY_ROOT=./my-project oy "fix the failing tests"
echo "update the changelog" | OY_NON_INTERACTIVE=1 oy
oy chat
oy audit [focus]
oy ralph "prompt"
oy model [filter]
oy --help
In chat, /ask <question> is research-only: no bash, no file changes, but public webfetch is still allowed. It is no-write rather than no-network.
Design goals
- keep the codebase small and auditable
- expose a narrow built-in tool set
- keep provider support behind thin shims
- target providers that implement the Open Responses / OpenAI Responses API surface
- start fresh by default for one-shot runs
- make approvals and checkpoints explicit when they matter
Prompt text and tool descriptions live in oy_cli/session_text.toml. Core modules are oy_cli/runtime.py, oy_cli/agent.py, oy_cli/cli.py, oy_cli/tools.py, and oy_cli/providers.py. Contributor workflow lives in CONTRIBUTING.md.
Configuration
Environment variables
| Variable | Purpose |
|---|---|
| OY_MODEL | Override the model for this session (model or shim:model) |
| OY_SHIM | Force a shim when the model name is bare |
| OY_NON_INTERACTIVE | Set to 1 to disable approval and prompt pauses |
| OY_UNATTENDED_LIMIT | Agent deadline window, such as 1h, 30m, or 3600s |
| OY_RALPH_LIMIT | Ralph deadline window, such as 3h, 90m, or 3600s |
| OY_ROOT | Run against a different workspace |
| OY_SYSTEM_FILE | Append extra system instructions |
| OY_CONFIG | Override config path (default: ~/.config/oy/config.json) |
| OY_DEBUG | Enable debug logging |
| OY_YOLO | Start with all tool approvals enabled |
| OY_MAX_CONTEXT_TOKENS | Override transcript and tool context budget |
| OY_MAX_BASH_CMD_BYTES | Override max accepted bash command size |
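The deadline windows accepted by OY_UNATTENDED_LIMIT and OY_RALPH_LIMIT follow a simple value-plus-unit grammar (1h, 30m, 3600s). A minimal sketch of a parser for that grammar — a hypothetical helper for illustration, not oy's actual implementation:

```python
import re

def parse_window(text: str) -> int:
    """Parse a duration like '1h', '30m', or '3600s' into seconds."""
    match = re.fullmatch(r"(\d+)([hms])", text.strip())
    if not match:
        raise ValueError(f"bad duration: {text!r}")
    value, unit = int(match.group(1)), match.group(2)
    return value * {"h": 3600, "m": 60, "s": 1}[unit]
```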
Config file
{"shim": "openai", "model": "glm-5"}
Only model and shim are persisted. Selection order is OY_MODEL, then saved config, then the first-run picker. OY_SHIM only changes backend choice when the model name is bare or no model has been saved yet.
From local testing, glm-5 and kimi-k2.5 are good defaults.
Installation
uv tool install oy-cli # preferred
pip install oy-cli # alternative
Requirements
- Python 3.13+
- Provider credentials for a backend that supports the Open Responses / OpenAI Responses API shape (for example OpenAI credentials, Codex auth, Copilot auth, OpenCode auth, or AWS credentials for Bedrock Mantle)
Development
Use uv for local development. Contributor workflow lives in CONTRIBUTING.md.
uv sync
uv run ruff check .
uv run pytest -q
uv run pytest tests/test_providers.py -q
uv run oy --help
uv build
Authentication
OpenAI or other Open Responses-compatible endpoint:
export OPENAI_API_KEY=...
export OPENAI_BASE_URL=https://your-endpoint.example/v1 # optional
Copilot and Codex credentials are discovered automatically when available.
Bedrock Mantle:
export OY_SHIM=bedrock-mantle
export AWS_PROFILE=my-profile
export AWS_REGION=ap-southeast-2
oy loads models from GET /models and targets the Open Responses / OpenAI Responses API at POST /responses. Provider support in oy is intentionally centered on that API shape. Providers that do not support /responses fail with a clear error instead of falling back to legacy chat-completions behavior.
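The two endpoints above can be exercised directly with the standard library. A hedged sketch — the /models and /responses paths come from the text, while the model name and input payload here are illustrative:

```python
import json
import os
import urllib.request

base = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
api_key = os.environ.get("OPENAI_API_KEY", "")
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}

# List available models (the equivalent of oy's GET /models call).
models_req = urllib.request.Request(f"{base}/models", headers=headers)

# Create a response (the POST /responses shape oy targets).
body = json.dumps({"model": "glm-5", "input": "say hello"}).encode()
resp_req = urllib.request.Request(
    f"{base}/responses", data=body, headers=headers, method="POST"
)
# Uncomment to actually send the request:
# with urllib.request.urlopen(resp_req) as r:
#     print(json.load(r))
```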
Local model workflow
Run any OpenAI-compatible server on localhost. By default oy probes:
- local-8080 at http://127.0.0.1:8080/v1 (typical llama-server port)
- local-11434 at http://127.0.0.1:11434/v1 (typical Ollama port)
Examples:
OY_MODEL=local-8080:qwen3.5 oy chat
# or save it once:
oy model local-11434:qwen3.5
oy chat
You can also target any localhost port with the local-<port> shim form.
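The probe described above amounts to checking whether anything is listening on those two ports. A small sketch of that check, written as a standalone helper rather than oy's own probe code:

```python
import socket

def port_open(host: str, port: int, timeout: float = 0.25) -> bool:
    """Return True if something is listening on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# The two ports oy probes by default, keyed by shim name.
DEFAULT_SHIMS = {
    "local-8080": 8080,    # typical llama-server port
    "local-11434": 11434,  # typical Ollama port
}

for shim, port in DEFAULT_SHIMS.items():
    if port_open("127.0.0.1", port):
        print(f"{shim}: server found at http://127.0.0.1:{port}/v1")
```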
Troubleshooting
- Missing credentials — start a local OpenAI-compatible server on 127.0.0.1:8080 or 127.0.0.1:11434, set OPENAI_API_KEY, sign in with codex, authenticate gh for Copilot, run opencode auth, or configure AWS credentials / SSO for Bedrock Mantle.
- stdin is not a TTY — piping input disables ask; set OY_NON_INTERACTIVE=1 to make that explicit.
- AWS SSO session is stale — run aws sso login --use-device-code --no-browser.
Security
oy can run shell commands and modify files with your permissions. bash also inherits your environment, so git, cloud, and SSH credentials visible to your shell are visible to the command.
Recommended:
- run in a repo or workspace you trust
- mount only the directories you need in containers
- avoid exposing long-lived secrets in the environment
- use /ask when you want no-write research mode
- review generated changes before shipping
Protections include workspace-bound file tools, public-only webfetch, and default credential flows for supported providers. For provider authors, the intended compatibility target is Open Responses compliance rather than ad hoc OpenAI-compatible subsets. oy still acts with your user permissions, so treat generated shell commands and file edits as local code execution.
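Because bash inherits your environment, one way to reduce exposure is to launch oy with a trimmed environment. This is a hypothetical wrapper pattern, not an oy feature; the allowlist here is an assumption you would tailor to your own setup:

```python
import os
import subprocess

# Keep only the variables the session actually needs; everything else
# (cloud keys, SSH agent sockets, long-lived tokens) is dropped before
# the child process starts.
ALLOW = {"PATH", "HOME", "TERM", "OPENAI_API_KEY", "OY_MODEL"}
trimmed = {k: v for k, v in os.environ.items() if k in ALLOW}

# Uncomment to launch oy with the trimmed environment:
# subprocess.run(["oy", "chat"], env=trimmed)
```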
License
Apache License 2.0
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file oy_cli-0.4.6.tar.gz.
File metadata
- Download URL: oy_cli-0.4.6.tar.gz
- Upload date:
- Size: 82.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 02478b1fe6b4b15a9531519a0de21d9724942bf4d04a10fba2e983f134f00b9b |
| MD5 | 4a2f9932de3c1bb384ac95aa5a183fe6 |
| BLAKE2b-256 | 421753eec21b76ba80ad34c5b4af596d69b79826027e51474ad5efb7bfc750f3 |
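The published SHA256 digest lets you verify a downloaded sdist before installing. A short check using the standard library (the filename assumes you downloaded the file to the current directory):

```python
import hashlib
from pathlib import Path

EXPECTED_SHA256 = "02478b1fe6b4b15a9531519a0de21d9724942bf4d04a10fba2e983f134f00b9b"

def sha256_of(path: Path) -> str:
    """Stream the file in chunks and return its hex SHA256 digest."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# print(sha256_of(Path("oy_cli-0.4.6.tar.gz")) == EXPECTED_SHA256)
```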
Provenance
The following attestation bundles were made for oy_cli-0.4.6.tar.gz:
Publisher: release.yml on wagov-dtt/oy-cli

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: oy_cli-0.4.6.tar.gz
- Subject digest: 02478b1fe6b4b15a9531519a0de21d9724942bf4d04a10fba2e983f134f00b9b
- Sigstore transparency entry: 1276990097
- Sigstore integration time:
- Permalink: wagov-dtt/oy-cli@cd07a7ad6761e70a2274c543053f918489984426
- Branch / Tag: refs/tags/v0.4.6
- Owner: https://github.com/wagov-dtt
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@cd07a7ad6761e70a2274c543053f918489984426
- Trigger Event: release
File details
Details for the file oy_cli-0.4.6-py3-none-any.whl.
File metadata
- Download URL: oy_cli-0.4.6-py3-none-any.whl
- Upload date:
- Size: 67.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 47e1379aae9393f8532c71d6524d3ea5bdd6a26ed3de412a61f0b83bddbcca20 |
| MD5 | eee0cee2ec2fe8ebadb473946bdce9ce |
| BLAKE2b-256 | b65a732dddf3b0d7356b339a941b64e32a5bfca846032d9539c55d7a77bedce6 |
Provenance
The following attestation bundles were made for oy_cli-0.4.6-py3-none-any.whl:
Publisher: release.yml on wagov-dtt/oy-cli

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: oy_cli-0.4.6-py3-none-any.whl
- Subject digest: 47e1379aae9393f8532c71d6524d3ea5bdd6a26ed3de412a61f0b83bddbcca20
- Sigstore transparency entry: 1276990178
- Sigstore integration time:
- Permalink: wagov-dtt/oy-cli@cd07a7ad6761e70a2274c543053f918489984426
- Branch / Tag: refs/tags/v0.4.6
- Owner: https://github.com/wagov-dtt
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@cd07a7ad6761e70a2274c543053f918489984426
- Trigger Event: release