Project description

openharness — Python SDK for openHarness

Drive the oh terminal coding agent from Python. Stream tokens and tool calls, control permissions and models, all with a small async API.

Prerequisite

Install the oh CLI first (via npm):

npm install -g @zhijiewang/openharness

The Python SDK finds oh on PATH. To point at a specific binary, set OH_BINARY=/absolute/path/to/oh.
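If oh lives outside PATH, the override can also be set from Python before the first call. A minimal sketch; the path below is a placeholder, not a real install location:

```python
import os

# Point the SDK at a specific oh binary instead of searching PATH.
# "/usr/local/bin/oh" is a placeholder; substitute your actual path.
os.environ["OH_BINARY"] = "/usr/local/bin/oh"
```

Set this before the first query() or OpenHarnessClient call so the subprocess launcher sees it.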

Install

pip install openharness-sdk

The PyPI distribution is openharness-sdk (the shorter openharness name is taken by an unrelated project). The import path remains from openharness import ....

Requires Python ≥3.10. Small dependency surface: the mcp SDK and uvicorn, pulled in to host Python-defined tools as MCP servers. Neither is touched at runtime unless you pass tools= (a plain query() never imports them), but both are installed eagerly for simplicity.

Quick start

import asyncio
from openharness import query, TextDelta, ToolStart, ToolEnd

async def main() -> None:
    async for event in query(
        "Summarize the README.md in this directory.",
        model="ollama/llama3",
        permission_mode="trust",
        max_turns=5,
    ):
        if isinstance(event, TextDelta):
            print(event.content, end="", flush=True)
        elif isinstance(event, ToolStart):
            print(f"\n[tool: {event.tool}]", flush=True)
        elif isinstance(event, ToolEnd):
            print(f"[{event.tool}: {'error' if event.error else 'ok'}]", flush=True)

asyncio.run(main())

Multi-turn sessions

For conversations that span multiple prompts (notebooks, chatbots, agents), use OpenHarnessClient:

import asyncio
from openharness import OpenHarnessClient, TextDelta

async def main() -> None:
    async with OpenHarnessClient(model="ollama/llama3", permission_mode="trust") as client:
        async for event in await client.send("What is 1+1?"):
            if isinstance(event, TextDelta):
                print(event.content, end="")
        print()
        async for event in await client.send("And times 3?"):  # remembers the prior turn
            if isinstance(event, TextDelta):
                print(event.content, end="")

asyncio.run(main())

The client keeps a single oh session subprocess warm across calls, preserving conversation state in-process. Concurrent send() calls on one client are serialized via an asyncio.Lock. Call close() (or exit the async context) to terminate the subprocess.
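The serialization guarantee can be illustrated with a toy stand-in. This is not the SDK's code, just the asyncio.Lock pattern the paragraph describes:

```python
import asyncio

class ToyClient:
    """Stand-in for OpenHarnessClient's send() serialization (illustrative only)."""

    def __init__(self) -> None:
        self._lock = asyncio.Lock()
        self.log: list[str] = []

    async def send(self, prompt: str) -> None:
        async with self._lock:       # concurrent callers queue here
            self.log.append(f"start {prompt}")
            await asyncio.sleep(0)   # simulate streaming work mid-turn
            self.log.append(f"end {prompt}")

async def main() -> list[str]:
    client = ToyClient()
    # Even issued concurrently, the two turns never interleave.
    await asyncio.gather(client.send("a"), client.send("b"))
    return client.log

log = asyncio.run(main())
```

Each "start" is immediately followed by its own "end": the lock holds the second turn back until the first finishes streaming.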

Custom Python tools

Expose your own Python functions to the agent. Decorate with @tool, then pass the callables via tools=[...] on either query() or OpenHarnessClient:

import asyncio
from openharness import OpenHarnessClient, ToolEnd, tool


@tool
async def get_weather(city: str) -> str:
    """Fetch the current weather for a city."""
    return f"Sunny in {city}, 22°C"


async def main() -> None:
    async with OpenHarnessClient(
        model="ollama/llama3",
        tools=[get_weather],
    ) as client:
        async for event in await client.send("What's the weather in Paris?"):
            if isinstance(event, ToolEnd):
                print(event.tool, event.output)


asyncio.run(main())

Under the hood the SDK spins up an in-process MCP HTTP server on a random 127.0.0.1 port, writes an ephemeral .oh/config.yaml pointing at it, and runs oh with that temp dir as its cwd. Any existing user config at the caller-supplied cwd= is preserved.
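The general mechanism (grab a free loopback port, write an ephemeral config in a throwaway directory) can be sketched with the standard library. The config key below is invented for illustration and does not reflect oh's real schema:

```python
import socket
import tempfile
from pathlib import Path

# Ask the OS for a free 127.0.0.1 port by binding port 0.
with socket.socket() as s:
    s.bind(("127.0.0.1", 0))
    port = s.getsockname()[1]

# Write an ephemeral config in a temp dir, as the SDK does for its
# .oh/config.yaml ("mcp_server_url" is a made-up key).
with tempfile.TemporaryDirectory() as tmp:
    cfg = Path(tmp) / ".oh" / "config.yaml"
    cfg.parent.mkdir()
    cfg.write_text(f"mcp_server_url: http://127.0.0.1:{port}\n")
    # ...the SDK would now spawn oh with cwd=tmp...
    text = cfg.read_text()
```

Because the config lives under the temp dir, nothing at the caller-supplied cwd= is touched.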

Use @tool(name="custom-name", description="…") to override the auto-inferred metadata. Sync and async functions both work.
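A minimal sketch of how metadata inference like this typically works (name from `__name__`, description from the docstring). The real decorator's internals may differ:

```python
import inspect

def toy_tool(fn=None, *, name=None, description=None):
    """@tool-style decorator stand-in: attach inferred or explicit metadata."""
    def wrap(f):
        f.tool_name = name or f.__name__
        f.tool_description = description or inspect.getdoc(f) or ""
        return f
    # Support both bare @toy_tool and @toy_tool(name=..., description=...)
    return wrap if fn is None else wrap(fn)

@toy_tool
def get_weather(city: str) -> str:
    """Fetch the current weather for a city."""
    return f"Sunny in {city}, 22°C"

@toy_tool(name="lookup", description="Custom description.")
def other(x: str) -> str:
    return x
```

Here get_weather.tool_name is inferred as "get_weather" with the docstring as its description, while other carries the overridden "lookup" name.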

API

query(prompt, **options) -> AsyncIterator[Event]

Run a single prompt and stream events as they arrive. Options:

  • model (str | None, default: from config): Model string (e.g. "ollama/llama3", "claude-sonnet-4-6").
  • permission_mode (str, default: "trust"): One of "ask", "trust", "deny", "acceptEdits", "plan", "auto", "bypassPermissions".
  • allowed_tools (Sequence[str] | None, default: None): Whitelist of tool names.
  • disallowed_tools (Sequence[str] | None, default: None): Blacklist of tool names.
  • max_turns (int, default: 20): Maximum number of model turns.
  • system_prompt (str | None, default: None): Override the default system prompt.
  • cwd (str | None, default: current dir): Working directory for the spawned CLI.
  • env (dict[str, str] | None, default: None): Env vars merged on top of os.environ.
  • tools (Sequence[Callable] | None, default: None): Python callables (optionally @tool-decorated) to expose to the agent via an in-process MCP server.

Event types

All events are frozen dataclasses. Use isinstance to discriminate.

  • TextDelta(content: str) — streaming text from the assistant
  • ToolStart(tool: str) — the assistant is about to call a tool
  • ToolEnd(tool: str, output: str, error: bool) — tool invocation finished
  • ErrorEvent(message: str) — recoverable error during the turn
  • CostUpdate(input_tokens: int, output_tokens: int, cost: float, model: str) — cost + usage
  • TurnComplete(reason: str) — one model turn ended; reason is "completed", "max_turns", "error", etc.
  • UnknownEvent(raw: dict) — forward-compatibility shim for future event types
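Since events are frozen dataclasses discriminated by isinstance, a handler reduces to a simple dispatch. The dataclasses below are local stand-ins mirroring the shapes listed above, so the sketch runs without the SDK installed:

```python
from dataclasses import dataclass

# Local stand-ins with the same shapes as the SDK's event types.
@dataclass(frozen=True)
class TextDelta:
    content: str

@dataclass(frozen=True)
class ToolEnd:
    tool: str
    output: str
    error: bool

def handle(event) -> str:
    if isinstance(event, TextDelta):
        return event.content
    if isinstance(event, ToolEnd):
        return f"[{event.tool}: {'error' if event.error else 'ok'}]"
    return ""  # UnknownEvent and friends: safe to ignore

parts = [handle(e) for e in (TextDelta("hi"), ToolEnd("bash", "done", False))]
```

Falling through to an empty string for unrecognized events keeps the handler forward-compatible with future event types.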

Exceptions

  • OhBinaryNotFoundError — raised when oh can't be located on PATH or via OH_BINARY.
  • OpenHarnessError — raised when the subprocess exits non-zero. Has .stderr and .exit_code attributes.
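Handling them might look like the following. The exception classes here are local stand-ins carrying the attributes named above, since this sketch must run without the CLI installed:

```python
class OhBinaryNotFoundError(RuntimeError):
    """Stand-in mirroring the SDK's binary-lookup error."""

class OpenHarnessError(RuntimeError):
    """Stand-in: carries the subprocess's stderr and exit code."""
    def __init__(self, stderr: str, exit_code: int) -> None:
        super().__init__(f"oh exited with code {exit_code}")
        self.stderr = stderr
        self.exit_code = exit_code

def run_agent() -> None:
    # Pretend the spawned subprocess failed.
    raise OpenHarnessError(stderr="model not found", exit_code=2)

try:
    run_agent()
except OhBinaryNotFoundError:
    message = "install the CLI: npm install -g @zhijiewang/openharness"
except OpenHarnessError as exc:
    message = f"exit {exc.exit_code}: {exc.stderr}"
```

Catching OhBinaryNotFoundError first lets you surface a precise "install the CLI" hint before falling back to the generic subprocess failure.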

Cancellation

Standard asyncio cancellation works. The spawned subprocess is sent SIGTERM (SIGBREAK on Windows) and given up to 5 seconds to exit cleanly before being kill()ed.

task = asyncio.create_task(collect_events())  # collect_events() drains query(...) events
await asyncio.sleep(1)
task.cancel()  # subprocess gets SIGTERM; kill() follows after the 5 s grace period
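The terminate-then-kill escalation the SDK applies can be sketched with a bare asyncio subprocess. This spawns sleep, so it assumes a POSIX environment; the helper name stop() is illustrative:

```python
import asyncio

async def stop(proc: asyncio.subprocess.Process, grace: float = 5.0) -> int:
    """SIGTERM first; kill() if the process ignores it for `grace` seconds."""
    proc.terminate()
    try:
        await asyncio.wait_for(proc.wait(), timeout=grace)
    except asyncio.TimeoutError:
        proc.kill()
        await proc.wait()
    return proc.returncode

async def main() -> int:
    proc = await asyncio.create_subprocess_exec("sleep", "60")
    return await stop(proc, grace=5.0)

code = asyncio.run(main())  # negative on POSIX: process died to a signal
```

A negative return code on POSIX means the process was ended by a signal rather than exiting on its own.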

Relationship to @zhijiewang/openharness

This Python package is a thin subprocess wrapper around the oh CLI shipped by the npm package @zhijiewang/openharness. It does not re-implement the agent loop. This means:

  • You always get the latest CLI features by upgrading the npm package.
  • All providers (Anthropic, OpenAI, Ollama, OpenRouter, llama.cpp, LM Studio) work as-is.
  • All tools and MCP servers configured in .oh/config.yaml apply.
  • The Python SDK follows its own independent SemVer track (0.x series at launch).

License

MIT. See LICENSE.

Project details


Download files

Download the file for your platform.

Source Distribution

openharness_sdk-0.3.0.tar.gz (15.0 kB)


Built Distribution


openharness_sdk-0.3.0-py3-none-any.whl (19.0 kB)


File details

Details for the file openharness_sdk-0.3.0.tar.gz.

File metadata

  • Download URL: openharness_sdk-0.3.0.tar.gz
  • Upload date:
  • Size: 15.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for openharness_sdk-0.3.0.tar.gz
Algorithm Hash digest
SHA256 6062afa729f2bfd53dd7ce0c663327d36054064667fb21254dc28044ec4b6c5c
MD5 b49a2835bfa4311a6e270dd727fceed1
BLAKE2b-256 4061dc2cd8549da9a9c04271ec58753798607858183f4c122d339947b3de65b8


Provenance

The following attestation bundles were made for openharness_sdk-0.3.0.tar.gz:

Publisher: publish-python.yml on zhijiewong/openharness

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file openharness_sdk-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: openharness_sdk-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 19.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for openharness_sdk-0.3.0-py3-none-any.whl
Algorithm Hash digest
SHA256 770d2a588a0915572e382ffe5188a395cd23fee07fd03e8a8230353d9cb4f07b
MD5 32a179798e4db9d83381e9f014842aba
BLAKE2b-256 2443ac0885dbe61dcb6b8babafd217843e686151d74ac5cc5b446e27c5d56c94


Provenance

The following attestation bundles were made for openharness_sdk-0.3.0-py3-none-any.whl:

Publisher: publish-python.yml on zhijiewong/openharness

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
