
Evals-first prompt optimization. Label examples, get better prompts.


coaxer

Label examples. Derive the prompt. Consume it as a string.

Full docs: https://thekevinscott.github.io/coaxer/ · the documentation source ships with the package in docs/.

Motivation

Writing prompts by hand is slow, and the prose grows brittle as cases accumulate. Coaxer flips the workflow: label examples of the behavior you want and derive the prompt from those labels. When the prompt drifts, add more labels instead of rewriting it by hand.

Labels are the source of truth. The prompt is a build artifact.

Install

uv add coaxer        # Python
npm install coaxer   # TypeScript

Quick start

coax labels/repo-classification --out prompts/repo-classification

from coaxer import CoaxedPrompt

p = CoaxedPrompt("prompts/repo-classification")
filled = p(readme=new_readme, stars=1200)  # new_readme: the text you want classified

The TypeScript version is in docs/guide/getting-started.md.

Getting Started

A label folder contains one directory per record: a record.json plus sibling files for large text or binary inputs. coax compiles the folder into a prompt artifact that you load with CoaxedPrompt. The default optimizer is none (schema-derived, no network calls).
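
A minimal sketch of what a label folder might look like, assuming two records; file names other than record.json are illustrative:

labels/repo-classification/
  0001/
    record.json   # the record itself
    readme.md     # sibling file for a large text input (name illustrative)
  0002/
    record.json
    readme.md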

Full walkthrough: docs/guide/getting-started.md.

CoaxedPrompt

CoaxedPrompt(path, **bound) is a str subclass: str(p) is the raw template, and p(**vars) renders it. Missing variables raise MissingVariableError. Keyword arguments passed at construction bind default values; call-time variables override them. p.fields lists the input variables the template expects.
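
A short sketch of the call pattern, reusing the quick-start artifact (the bound stars default is illustrative):

from coaxer import CoaxedPrompt

p = CoaxedPrompt("prompts/repo-classification", stars=0)  # stars bound as a default
print(p.fields)                            # input variables the template expects
print(str(p))                              # raw, unrendered template
filled = p(readme=new_readme, stars=1200)  # call-time value overrides the bound default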

For structured-output APIs, p.response_format is a Pydantic model derived from the compiled output schema.
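
A sketch of feeding it to a structured-output endpoint, assuming the OpenAI Python SDK's parse helper; the client setup and model name are illustrative:

from openai import OpenAI

client = OpenAI()
completion = client.beta.chat.completions.parse(
    model="gpt-4o",
    messages=[{"role": "user", "content": filled}],
    response_format=p.response_format,  # Pydantic model from the compiled output schema
)
result = completion.choices[0].message.parsed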

Reference: docs/api/coaxed-prompt.md.

CLI

coax <labels-dir> --out <prompts-dir> [--optimizer {none,gepa}] [--output-name NAME]
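
For example, compiling the quick-start labels with the gepa optimizer and an explicit artifact name (the name is illustrative):

coax labels/repo-classification --out prompts/repo-classification --optimizer gepa --output-name repo-classification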

Reference: docs/api/cli.md.

AgentLM

DSPy BaseLM backed by the Claude Agent SDK. **kwargs forward to ClaudeAgentOptions (tools, allowed_tools, max_turns, …).

from coaxer import AgentLM
lm = AgentLM(tools=[])
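
Since it is a DSPy BaseLM, it can be plugged in wherever DSPy expects an lm; a minimal sketch, assuming a DSPy version that exposes dspy.configure:

import dspy
dspy.configure(lm=lm)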

Reference: docs/api/agent-lm.md.

OpenAILM

DSPy BaseLM for any OpenAI-compatible chat endpoint (Ollama, vLLM, OpenAI, LM Studio, …).

from coaxer import OpenAILM
lm = OpenAILM(model="gpt-4o", base_url="https://api.openai.com/v1", api_key="sk-...")
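
The same constructor works against a local Ollama server; the model name, URL, and placeholder API key below are illustrative:

lm = OpenAILM(model="llama3.1", base_url="http://localhost:11434/v1", api_key="ollama")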

Reference: docs/api/openai-lm.md.

Migrations

Downstream-consumer upgrade instructions for breaking changes live in MIGRATIONS.md (also published at docs/migrations.md). The full release log is in CHANGELOG.md.

Development

uv sync --extra dev
uv run just test-unit   # Unit tests
uv run just ci          # Full CI (lint + format + typecheck + tests)
