oy-cli

Tiny local coding CLI with a small tool surface

AI coding assistant for your shell. Reads files, searches content, and runs commands.

uv tool install oy-cli
oy "add docstrings to public functions"

Examples

# Basic usage
oy "inspect the main module and suggest improvements"

# Work in a specific directory
OY_ROOT=./my-project oy "fix the failing tests"

# Non-interactive mode (CI/pipelines)
echo "update the changelog" | OY_NON_INTERACTIVE=1 oy

# Security audit
oy audit
oy audit "focus on authentication"

Commands

oy "prompt"              # Run with a prompt (default)
oy chat                   # Interactive multi-turn session
oy audit                  # Security audit against OWASP ASVS/MASVS
oy ralph "prompt"         # Re-run a prompt in yolo mode every minute until OY_RALPH_LIMIT (default: 3h)
oy model                  # Show current model, pick model from available endpoints
oy --help                 # Show all commands

Why This Exists

oy is small, auditable, and built around a narrow tool surface.

Design goals: small auditable codebase, minimal tool surface, OpenAI-completions-focused CLI loop, multiple backends behind shims, new session each run, and explicit checkpoints when needed.

Session Text and Prompts

All text that is sent as part of model sessions lives in oy_cli/session_text.toml.

That includes:

  • base system prompt text
  • interactive/non-interactive prompt suffixes
  • audit prompt text
  • research-only /ask suffix
  • transcript compaction text (Current todo list, omitted-history note, TOON packed-history note)
  • built-in tool descriptions exposed to the model

Code that reads and composes this content now lives mainly in oy_cli/runtime.py, with transcript/agent flow in oy_cli/agent.py and CLI entrypoints in oy_cli/cli.py.

Configuration

Environment variables:

Variable            Purpose
OY_MODEL            Override the model for this session (bare name or shim:model)
OY_SHIM             Force a specific shim: openai, codex, copilot, opencode, opencode-go, or bedrock-mantle
OY_NON_INTERACTIVE  Set to 1 to disable approval/checkpoint pauses
OY_UNATTENDED_LIMIT Agent turn deadline window, e.g. 1h, 30m, or 3600s
OY_RALPH_LIMIT      Ralph deadline window, e.g. 3h, 90m, or 3600s
OY_ROOT             Run against a different workspace
OY_SYSTEM_FILE      Append extra system instructions
OY_CONFIG           Override the config path (default: ~/.config/oy/config.json)
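A deadline window such as 1h, 30m, or 3600s can be parsed into seconds along these lines. This is a sketch inferred from the examples above, not oy's actual parser, which may accept more formats:

```python
import re

def parse_window(value: str) -> int:
    """Parse a deadline window like '1h', '30m', or '3600s' into seconds.

    Sketch only: the accepted formats are inferred from the documented
    examples; oy's real parser may be stricter or richer.
    """
    match = re.fullmatch(r"(\d+)([hms])", value.strip())
    if not match:
        raise ValueError(f"unrecognized window: {value!r}")
    amount, unit = int(match.group(1)), match.group(2)
    return amount * {"h": 3600, "m": 60, "s": 1}[unit]
```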

Config file (~/.config/oy/config.json):

{"shim": "openai", "model": "glm-5"}

The shim field pins which backend to use, regardless of which other backends you are signed in to. Use oy model <filter> to pick interactively; it merges models from all signed-in shims into a single list using shim:model prefixes.

On first run, if no model is configured, oy prompts you to pick one from the available backends. Set OY_MODEL, OY_SHIM, or save a config with oy model to pin behavior.

Model notes: In testing, glm-5 offers a good balance of intelligence, cost, and tool-use ability. kimi-k2.5 is another option. The Artificial Analysis comparison of open-source models is a useful reference.

Requirements

  • Python 3.13+
  • bash
  • OpenAI API key or compatible endpoint credentials, Codex local auth, Copilot auth, OpenCode auth, or AWS CLI configured for Bedrock Mantle

Installation

uv tool install oy-cli  # Preferred
pip install oy-cli       # Alternative

Development

For local development, linting, tests, and builds, use uv. Do not run bare pytest, ruff, or pip install -e . commands in this repo.

uv sync
uv run ruff format .
uv run ruff check .
uv run python -m pytest tests/ -v
uv run oy --help
uv build

See CONTRIBUTING.md for the contributor workflow.

Authentication

OpenAI:

export OPENAI_API_KEY=sk-...

For OpenAI-compatible endpoints:

export OPENAI_BASE_URL=https://your-endpoint.example/v1
export OPENAI_API_KEY=...

Copilot and Codex (OpenAI) credentials are introspected and used automatically; when they are available, oy model shows their models in the list.

AWS Bedrock Mantle: oy uses the Bedrock Mantle OpenAI-compatible endpoint (https://bedrock-mantle.<region>.api.aws/v1) and signs requests directly with SigV4 service bedrock-mantle.

export OY_SHIM=bedrock-mantle
export AWS_PROFILE=my-profile
export AWS_REGION=ap-southeast-2

oy loads models from GET /models on the Mantle endpoint and sends chat requests to POST /chat/completions on the same endpoint.

Troubleshooting

"Missing API credentials" -> Set OPENAI_API_KEY, sign in with codex, authenticate gh for Copilot, run opencode auth, or for Bedrock Mantle configure AWS credentials / SSO and set AWS_REGION.

"stdin is not a TTY" -> Piping input disables ask. Set OY_NON_INTERACTIVE=1 to make explicit.

"AWS SSO session is stale" -> Run aws sso login --use-device-code --no-browser.

Security

oy can run shell commands and modify files with your permissions. Treat it like any other local automation tool.

Recommended:

  • run in a repo or workspace you trust
  • mount only needed directories in containers
  • avoid exposing long-lived secrets in the environment
  • review generated changes before shipping

Protections: workspace-bound file access for built-in file tools and default SDK credential flows for supported providers.
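Workspace-bound file access typically amounts to a path-containment check like the one below. This is an illustrative sketch, not oy's actual implementation:

```python
from pathlib import Path

def within_workspace(root: Path, candidate: str) -> Path:
    """Resolve a path and refuse anything that escapes the workspace root.

    Illustrative sketch of workspace-bound file access; not oy's real check.
    """
    root = root.resolve()
    target = (root / candidate).resolve()  # collapses '..' before checking
    if not target.is_relative_to(root):
        raise PermissionError(f"{candidate!r} escapes workspace {root}")
    return target
```

Resolving before comparing matters: a naive prefix check on the raw string would let "../" segments slip through.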


License

Apache License 2.0

Download files


Source Distribution

oy_cli-0.4.3.tar.gz (76.2 kB)


Built Distribution


oy_cli-0.4.3-py3-none-any.whl (66.9 kB)


File details

Details for the file oy_cli-0.4.3.tar.gz.

File metadata

  • Size: 76.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for oy_cli-0.4.3.tar.gz

Algorithm    Hash digest
SHA256       0942a70ee664ffe7ee05569dd5a0a2d02bf1802bbaabe2d3ffedfafd120fbc57
MD5          02bf6e975c201782f6fc019e88b77c5a
BLAKE2b-256  596ced1de0799b303f9ffdf75febda2426f4f717552128411c6c3d7153099700


Provenance

The following attestation bundles were made for oy_cli-0.4.3.tar.gz:

Publisher: release.yml on wagov-dtt/oy-cli

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file oy_cli-0.4.3-py3-none-any.whl.

File metadata

  • Size: 66.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for oy_cli-0.4.3-py3-none-any.whl

Algorithm    Hash digest
SHA256       6e2d76a814dbf67b6b2156a1fd7c20bad4cf7dba79fe3b13e4b345fc93a1be55
MD5          cc94592099ac5d5a899e13709471fa18
BLAKE2b-256  e726a03abeee7e14bd2b186b28329f49cdb0723bf45e4314b5b53bfb6cadcf64


Provenance

The following attestation bundles were made for oy_cli-0.4.3-py3-none-any.whl:

Publisher: release.yml on wagov-dtt/oy-cli

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
