
Evals-first prompt optimization. Label examples, get better prompts.


coaxer

Label examples. Derive the prompt. Consume it as a string.


Motivation

Writing prompts by hand is slow, and the prose grows brittle as cases accumulate. Coaxer flips the workflow: label examples of the behavior you want and derive the prompt from those labels. When the prompt drifts, add more labels instead of rewriting prose.

Labels are the source of truth. The prompt is a build artifact.

Install

uv add coaxer

Label

One directory per record. record.json holds scalar fields; large text and binary inputs live as sibling files.

labels/repo-classification/
  _schema.json              # optional: field descriptions + types + enums
  0001/
    record.json             # {id, inputs: {readme, stars, ...}, output}
    readme.md               # large text referenced from record.json
  0002/
    ...
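For the tree above, a record.json might contain the following (values are illustrative, and the mechanism for referencing sibling files such as readme.md is not shown here, so only inline scalar fields appear):

```json
{
  "id": "0001",
  "inputs": {"stars": 1200},
  "output": "true"
}
```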

_schema.json is optional. Without it, field names and types are inferred from the records.

{
  "inputs": {
    "readme": {"desc": "Project README markdown"},
    "stars": {"desc": "GitHub star count", "type": "int"}
  },
  "output": {
    "desc": "Curated collection vs organic project",
    "type": "enum",
    "values": ["true", "false"]
  }
}

Distill

coax labels/repo-classification --out prompts/repo-classification

Writes four files to the output folder:

File           Purpose
prompt.jinja   Human-readable Jinja template with {{ field }} slots.
meta.json      Compile metadata: compiled_at, example_count, label_hash, schema.
dspy.json      DSPy program state (only when --optimizer gepa).
history.jsonl  Append-only compile log.

The optimizer is opt-in: --optimizer gepa runs DSPy 3's GEPA pass and requires an LLM credential. The default (--optimizer none) emits a schema-derived template and is reproducible without network access.
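With the default optimizer, the emitted template is derived from the schema. A hypothetical prompt.jinja for the repo-classification example might look like this (the exact wording coax generates will differ):

```jinja
Classify the repository as a curated collection ("true") or an organic project ("false").

README:
{{ readme }}

Stars: {{ stars }}

Answer with one of: true, false.
```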

Consume

from coaxer import CoaxedPrompt

p = CoaxedPrompt("prompts/repo-classification", role="classifier")  # bind defaults
filled = p(readme=new_readme, stars=1200)                         # render at call time

  • CoaxedPrompt(path, **bound) — str subclass; __new__ reads prompt.jinja.
  • str(p) — raw template.
  • p(**vars) — Jinja2 StrictUndefined render; missing variables raise.
  • Call-time variables override bound defaults.

Because CoaxedPrompt is a str, it drops in anywhere a string is accepted (logging, OpenAI SDK messages, Anthropic SDK, DSPy signatures built externally, etc.).
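The str-subclass pattern is easy to sketch. This toy version is not coaxer's actual implementation: it uses {field} placeholders and str.format instead of Jinja2, but it mimics the described behavior (raw template via str(), render on call, call-time variables overriding bound defaults, missing variables raising):

```python
class ToyPrompt(str):
    """Illustrative str subclass: str(p) is the raw template, p(**vars) renders it."""

    def __new__(cls, template, **bound):
        obj = super().__new__(cls, template)
        obj._bound = bound  # defaults bound at construction time
        return obj

    def __call__(self, **vars):
        merged = {**self._bound, **vars}  # call-time variables win
        return self.format(**merged)     # missing fields raise KeyError

p = ToyPrompt("Classify {readme} ({stars} stars)", stars=0)
print(str(p))                                   # raw template
print(p(readme="A curated list", stars=1200))   # rendered output
```

Because ToyPrompt is a str, isinstance checks and string APIs work on it unchanged, which is the same property that lets CoaxedPrompt drop into SDK message payloads.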

Compile LLMs

AgentLM routes compile calls through the Anthropic Agent SDK (Claude Code). OpenAILM hits any OpenAI-compatible endpoint (Ollama, vLLM, OpenAI).

from coaxer import AgentLM, OpenAILM

lm = AgentLM()                                # Claude via Agent SDK
lm = OpenAILM(model="llama3")                 # Ollama
lm = OpenAILM(model="gpt-4o", base_url="https://api.openai.com/v1", api_key="sk-...")

Both pass keyword arguments through to their underlying client.

Caching

Pass a Cachetta instance to persist LM responses to disk:

from cachetta import Cachetta
from coaxer import AgentLM

cache = Cachetta(path=lambda prompt, **kw: f"cache/{prompt}.pkl", duration="7d")
lm = AgentLM(cache=cache)

Install with the cache extra: uv add "coaxer[cache]".

Development

uv sync --extra dev
uv run just test-unit   # Unit tests
uv run just ci          # Full CI (lint + format + typecheck + tests)
