
Python SDK for openHarness — drive the `oh` terminal coding agent from Python.

Project description

openharness — Python SDK for openHarness

Drive the oh terminal coding agent from Python. Stream tokens and tool calls, control permissions and models, all with a small async API.

Prerequisite

Install the oh CLI first (via npm):

npm install -g @zhijiewang/openharness

The Python SDK finds oh on PATH. To point at a specific binary, set OH_BINARY=/absolute/path/to/oh.
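The lookup order can be sketched with a small stdlib helper. The function name `resolve_oh_binary` is ours for illustration; the SDK's internal resolver may differ, but the described behavior (OH_BINARY wins, then PATH) looks like this:

```python
import os
import shutil


def resolve_oh_binary() -> str:
    """Resolve the `oh` executable: OH_BINARY wins if set,
    otherwise fall back to a PATH lookup."""
    override = os.environ.get("OH_BINARY")
    if override:
        return override
    found = shutil.which("oh")
    if found is None:
        raise FileNotFoundError(
            "oh not found on PATH; install @zhijiewang/openharness or set OH_BINARY"
        )
    return found
```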

Install

pip install openharness-sdk

The PyPI distribution is openharness-sdk (the shorter openharness name is taken by an unrelated project). The import path remains `from openharness import ...`.

Requires Python ≥3.10. Small dependency surface: the mcp SDK and uvicorn (pulled in to host Python-defined tools as MCP servers). Both are optional in practice: query() without tools= never touches them at runtime, but they're installed eagerly for simplicity.

Quick start

import asyncio
from openharness import query, TextDelta, ToolStart, ToolEnd

async def main() -> None:
    async for event in query(
        "Summarize the README.md in this directory.",
        model="ollama/llama3",
        permission_mode="trust",
        max_turns=5,
    ):
        if isinstance(event, TextDelta):
            print(event.content, end="", flush=True)
        elif isinstance(event, ToolStart):
            print(f"\n[tool: {event.tool}]", flush=True)
        elif isinstance(event, ToolEnd):
            print(f"[{event.tool}: {'error' if event.error else 'ok'}]", flush=True)

asyncio.run(main())

Multi-turn sessions

For conversations that span multiple prompts (notebooks, chatbots, agents), use OpenHarnessClient:

import asyncio
from openharness import OpenHarnessClient, TextDelta

async def main() -> None:
    async with OpenHarnessClient(model="ollama/llama3", permission_mode="trust") as client:
        async for event in await client.send("What is 1+1?"):
            if isinstance(event, TextDelta):
                print(event.content, end="")
        print()
        async for event in await client.send("And times 3?"):  # remembers the prior turn
            if isinstance(event, TextDelta):
                print(event.content, end="")

asyncio.run(main())

The client keeps a single oh session subprocess warm across calls, preserving conversation state in-process. Concurrent send() calls on one client are serialized via an asyncio.Lock. Call close() (or exit the async context) to terminate the subprocess.
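The serialization guarantee can be illustrated with a stdlib-only sketch. `SerializedSender` is a stand-in, not SDK API; the real client holds the lock around a subprocess round-trip rather than a sleep:

```python
import asyncio


class SerializedSender:
    """Minimal sketch of the serialization described above: one
    asyncio.Lock ensures overlapping send() calls take turns."""

    def __init__(self) -> None:
        self._lock = asyncio.Lock()
        self.order: list[str] = []

    async def send(self, prompt: str) -> str:
        async with self._lock:       # concurrent callers queue here
            self.order.append(f"start:{prompt}")
            await asyncio.sleep(0)   # stand-in for the subprocess round-trip
            self.order.append(f"end:{prompt}")
            return prompt


async def demo() -> list[str]:
    s = SerializedSender()
    await asyncio.gather(s.send("a"), s.send("b"))
    return s.order  # each turn runs start-to-end before the next begins
```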

Custom Python tools

Expose your own Python functions to the agent. Decorate with @tool, then pass the callables via tools=[...] on either query() or OpenHarnessClient:

import asyncio
from openharness import OpenHarnessClient, ToolEnd, tool


@tool
async def get_weather(city: str) -> str:
    """Fetch the current weather for a city."""
    return f"Sunny in {city}, 22°C"


async def main() -> None:
    async with OpenHarnessClient(
        model="ollama/llama3",
        tools=[get_weather],
    ) as client:
        async for event in await client.send("What's the weather in Paris?"):
            if isinstance(event, ToolEnd):
                print(event.tool, event.output)


asyncio.run(main())

Under the hood the SDK spins up an in-process MCP HTTP server on a random 127.0.0.1 port, writes an ephemeral .oh/config.yaml pointing at it, and runs oh with that temp dir as its cwd. Any existing user config at the caller-supplied cwd= is preserved.
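A minimal sketch of the ephemeral-config step, assuming a hypothetical YAML schema (the actual key names in .oh/config.yaml are not documented here; only the mechanism, a throwaway config pointing at a loopback MCP server, is from the text above):

```python
import tempfile
from pathlib import Path


def write_ephemeral_config(mcp_port: int) -> Path:
    """Create a temp working dir with a .oh/config.yaml pointing the
    CLI at an in-process MCP server on 127.0.0.1:<mcp_port>."""
    workdir = Path(tempfile.mkdtemp(prefix="oh-sdk-"))
    config = workdir / ".oh" / "config.yaml"
    config.parent.mkdir(parents=True)
    config.write_text(
        "# generated per-run; hypothetical schema\n"
        "mcp_servers:\n"
        "  sdk_tools:\n"
        f"    url: http://127.0.0.1:{mcp_port}\n"
    )
    return workdir
```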

Use @tool(name="custom-name", description="…") to override the auto-inferred metadata. Sync and async functions both work.

Custom permission gate

Pass can_use_tool=<callback> on either query() or OpenHarnessClient to make every permission check round-trip through Python. Useful for Jupyter notebooks, CI policy gates, or any scenario where you want to decide per-tool whether the agent may run it.

import asyncio
from openharness import OpenHarnessClient

async def gate(ctx):
    # ctx contains "toolName", "toolInputJson", and other context fields.
    if ctx["toolName"] == "Bash":
        return {"decision": "deny", "reason": "Bash is not allowed in this notebook"}
    return "allow"

async def main() -> None:
    async with OpenHarnessClient(model="ollama/llama3", can_use_tool=gate) as client:
        async for event in await client.send("List the current directory"):
            print(event)

asyncio.run(main())

Callbacks may return:

  • a bare decision string: "allow", "deny", or "ask" (fall through to the CLI's interactive prompt);
  • a dict: {"decision": "allow", "reason": "trusted"}.

Sync and async callbacks both work. Exceptions and timeouts default to deny (fail-closed), so a misbehaving gate can never silently allow. Requires @zhijiewang/openharness v2.16.0+.
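The dispatch rules above can be sketched as a small normalizer. Both `run_gate` and the 5-second timeout are assumptions for illustration, not SDK API; only the accepted return shapes and the fail-closed behavior come from the docs:

```python
import asyncio
import inspect

VALID = {"allow", "deny", "ask"}


async def run_gate(gate, ctx: dict, timeout: float = 5.0) -> dict:
    """Accept sync or async gates, bare strings or dicts; collapse
    exceptions, timeouts, and junk results to deny (fail-closed)."""
    try:
        result = gate(ctx)
        if inspect.isawaitable(result):
            result = await asyncio.wait_for(result, timeout)
    except Exception as exc:
        return {"decision": "deny", "reason": f"gate failed: {exc}"}
    if isinstance(result, str) and result in VALID:
        return {"decision": result}
    if isinstance(result, dict) and result.get("decision") in VALID:
        return result
    return {"decision": "deny", "reason": "unrecognized gate result"}
```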

API

query(prompt, **options) -> AsyncIterator[Event]

Run a single prompt and stream events as they arrive. Options:

  • model (str | None; default: from config): model string (e.g. "ollama/llama3", "claude-sonnet-4-6").
  • permission_mode (str; default "trust"): one of "ask", "trust", "deny", "acceptEdits", "plan", "auto", "bypassPermissions".
  • allowed_tools (Sequence[str] | None; default None): whitelist of tool names.
  • disallowed_tools (Sequence[str] | None; default None): blacklist of tool names.
  • max_turns (int; default 20): maximum number of model turns.
  • system_prompt (str | None; default None): override the default system prompt.
  • cwd (str | None; default: current dir): working directory for the spawned CLI.
  • env (dict[str, str] | None; default None): env vars merged on top of os.environ.
  • tools (Sequence[Callable] | None; default None): Python callables (optionally @tool-decorated) to expose to the agent via an in-process MCP server.
  • can_use_tool (callable, sync or async; default None): permission callback returning "allow", "deny", or "ask". When set, every permission check routes through this function. See "Custom permission gate" above. Requires CLI v2.16.0+.

Event types

All events are frozen dataclasses. Use isinstance to discriminate.

  • TextDelta(content: str) — streaming text from the assistant
  • ToolStart(tool: str) — the assistant is about to call a tool
  • ToolEnd(tool: str, output: str, error: bool) — tool invocation finished
  • ErrorEvent(message: str) — recoverable error during the turn
  • CostUpdate(input_tokens: int, output_tokens: int, cost: float, model: str) — cost + usage
  • TurnComplete(reason: str) — a sub-agent turn ended
  • TurnStart(turn_number: int) — a top-level agent turn began (CLI v2.16.0+)
  • TurnStop(turn_number: int, reason: str) — a top-level agent turn ended; mirrors Claude Code's Stop hook (CLI v2.16.0+)
  • HookDecision(event: str, tool: str | None, decision: str, reason: str | None) — a hook produced a permission decision (CLI v2.16.0+)
  • UnknownEvent(raw: dict) — forward-compatibility shim for future event types
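A quick discrimination sketch using stand-in dataclasses that mirror the shapes listed above (in real code, import the actual classes from openharness rather than redefining them):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class TextDelta:
    content: str


@dataclass(frozen=True)
class ToolEnd:
    tool: str
    output: str
    error: bool


@dataclass(frozen=True)
class UnknownEvent:
    raw: dict


def describe(event) -> str:
    """isinstance-based discrimination, as the docs recommend."""
    if isinstance(event, TextDelta):
        return f"text({len(event.content)} chars)"
    if isinstance(event, ToolEnd):
        return f"tool {event.tool}: {'error' if event.error else 'ok'}"
    return f"unhandled: {type(event).__name__}"
```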

Exceptions

  • OhBinaryNotFoundError — raised when oh can't be located on PATH or via OH_BINARY.
  • OpenHarnessError — raised when the subprocess exits non-zero. Has .stderr and .exit_code attributes.

Cancellation

Standard asyncio cancellation works. The spawned subprocess is sent SIGTERM (SIGBREAK on Windows) and given up to 5 seconds to exit cleanly before being kill()ed.

task = asyncio.create_task(collect_events())  # collect_events() drains query(...)
await asyncio.sleep(1)
task.cancel()  # triggers the SIGTERM-then-kill() escalation described above
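The escalation itself can be sketched with asyncio's subprocess API. `terminate_gracefully` is our helper name, not part of the SDK (note that Process.terminate() sends SIGTERM on POSIX; the SDK uses SIGBREAK on Windows):

```python
import asyncio
import sys


async def terminate_gracefully(proc: asyncio.subprocess.Process,
                               grace: float = 5.0) -> int:
    """terminate(), wait up to `grace` seconds, then kill()."""
    proc.terminate()
    try:
        return await asyncio.wait_for(proc.wait(), grace)
    except asyncio.TimeoutError:
        proc.kill()
        return await proc.wait()


async def demo() -> int:
    # A long-running stand-in for the oh subprocess.
    proc = await asyncio.create_subprocess_exec(
        sys.executable, "-c", "import time; time.sleep(30)"
    )
    return await terminate_gracefully(proc)
```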

Relationship to @zhijiewang/openharness

This Python package is a thin subprocess wrapper around the oh CLI shipped by the npm package @zhijiewang/openharness. It does not re-implement the agent loop. This means:

  • You always get the latest CLI features by upgrading the npm package.
  • All providers (Anthropic, OpenAI, Ollama, OpenRouter, llama.cpp, LM Studio) work as-is.
  • All tools and MCP servers configured in .oh/config.yaml apply.
  • The Python SDK follows its own independent SemVer track (0.x series at launch).

License

MIT. See LICENSE.

Download files

Download the file for your platform.

Source Distribution

openharness_sdk-0.4.0.tar.gz (19.4 kB)

Uploaded Source

Built Distribution


openharness_sdk-0.4.0-py3-none-any.whl (24.2 kB)

Uploaded Python 3

File details

Details for the file openharness_sdk-0.4.0.tar.gz.

File metadata

  • Download URL: openharness_sdk-0.4.0.tar.gz
  • Upload date:
  • Size: 19.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for openharness_sdk-0.4.0.tar.gz
Algorithm Hash digest
SHA256 cefd42ec242b52c692499419d1a5f7a5da01b7a620c53fa9810b0e39f4e39a35
MD5 c286d33949e21735e4e99dacf8d11cb3
BLAKE2b-256 71958575310e5bdf0b84da53a70f664245c4ff29c19ef7420583b31ae852d9d2


Provenance

The following attestation bundles were made for openharness_sdk-0.4.0.tar.gz:

Publisher: publish-python.yml on zhijiewong/openharness

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file openharness_sdk-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: openharness_sdk-0.4.0-py3-none-any.whl
  • Upload date:
  • Size: 24.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for openharness_sdk-0.4.0-py3-none-any.whl
Algorithm Hash digest
SHA256 803101843739cc43fb6bca690480513dfeb722a774e483fcd58374cb758b1b18
MD5 5191536c13d400c30afe7e68725210bf
BLAKE2b-256 d3e9444cd9e8f62b30fb98bbd0d9a141b674d03b7fe5cd19eb4f34a9360f97b9


Provenance

The following attestation bundles were made for openharness_sdk-0.4.0-py3-none-any.whl:

Publisher: publish-python.yml on zhijiewong/openharness

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
