
Multi-Agent Orchestration Framework for Python

Project description


Multi-agent orchestration framework for Python — with a built-in coding CLI.



Compose multi-agent AI systems with async event streaming, agent hierarchies, and built-in support for MCP and A2A protocols.

Orx CLI

A pre-built coding agent in your terminal — similar to Claude Code or Cursor — powered by any LLM. One install command and you're up and running.

pip install orxhestra[cli,openai]
orx
+-- orx - terminal coding agent ------------------------------------+
|  model: gpt-5.4   workspace: ~/my-project   /help for commands    |
+-------------------------------------------------------------------+

orx> add error handling to the API routes

  > read_file(src/api/routes.py)
  > grep(pattern="raise", path=src/api/)
  > write_todos(3 tasks)

  Tasks
  * Add try/except to all route handlers  [in progress]
  - Add custom error response model
  - Write tests for error cases

  > edit_file(src/api/routes.py)
  > shell_exec(pytest tests/test_api.py)
  4 passed

  Done - added structured error handling to all 4 route handlers
  with a custom ErrorResponse model. All tests pass.

Features

  • Any LLM — OpenAI, Anthropic, Google via --model gpt-5.4 / claude-sonnet-4-6 / gemini-2.0-flash
  • Streaming — real-time token rendering with Markdown formatting
  • Tool approval — prompts before destructive operations (write, edit, shell)
  • Task planning — structured todo lists visible in the terminal
  • Sub-agent delegation — spawn isolated agents for complex subtasks
  • AGENTS.md memory — persistent project context across sessions
  • Local context injection — auto-detects language, git state, package manager, project tree
  • Context summarization — auto-compacts long conversations, /compact command
  • Orx YAML — run any orx.yaml agent team: orx my-agents.yaml
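
Context compaction as listed above generally works by folding old turns into a summary so recent turns keep fitting in the model's window. The sketch below is framework-agnostic and illustrative only — `summarize` stands in for an LLM summarization call, and none of these names are orxhestra APIs:

```python
# Framework-agnostic sketch of context compaction: once the history grows,
# everything but the most recent turns is replaced with a single summary
# message. summarize() is a stand-in for a real LLM summarization call.

def summarize(messages):
    # Placeholder: a real implementation would ask the model to summarize.
    return f"[summary of {len(messages)} earlier messages]"

def compact(history, keep_recent=4):
    """Replace all but the last `keep_recent` messages with one summary."""
    if len(history) <= keep_recent:
        return history
    old, recent = history[:-keep_recent], history[-keep_recent:]
    return [{"role": "system", "content": summarize(old)}] + recent

history = [{"role": "user", "content": f"message {i}"} for i in range(10)]
compacted = compact(history)
print(len(compacted))  # 5: one summary message plus four recent turns
```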

Usage

orx                               # interactive REPL (default model)
orx --model claude-sonnet-4-6     # use a specific model
orx -c "fix the failing tests"    # single-shot command
orx my-agents.yaml                # run a custom orx file
orx --auto-approve                # skip approval prompts

Commands

Command          Description
/model <name>    Switch model mid-session
/clear           Reset conversation
/compact         Summarize old messages to free context
/todos           Show current task list
/help            Show all commands
/exit            Exit

Quickstart (SDK)

pip install orxhestra
# or
uv add orxhestra

import asyncio

from orxhestra import LlmAgent, Runner, InMemorySessionService

agent = LlmAgent(
    name="assistant",
    model="gpt-5.4",
    instructions="You are a helpful assistant.",
)

runner = Runner(agent=agent, session_service=InMemorySessionService())

async def main():
    # runner.run is a coroutine, so it must be awaited inside an async function.
    response = await runner.run(user_id="user1", session_id="s1", new_message="Hello!")
    for event in response:
        print(event.content)

asyncio.run(main())

Tip: For full documentation, guides, and API reference, visit orxhestra.com.
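The quickstart's `for event in response` loop reflects the event-streaming model the SDK is built around: a run yields events as they happen rather than one final answer. The generic sketch below shows that pattern with plain `asyncio`; `Event` and `run_agent` are illustrative stand-ins, not orxhestra APIs:

```python
# Generic sketch of async event streaming: an agent run is an async
# generator that yields events as they occur, so a caller can render
# tokens in real time. Event and run_agent are illustrative stand-ins.
import asyncio
from dataclasses import dataclass

@dataclass
class Event:
    type: str      # e.g. "token", "tool_call", "done"
    content: str

async def run_agent(prompt: str):
    # Pretend each word of the reply arrives as a streamed token event.
    for token in f"echo: {prompt}".split():
        await asyncio.sleep(0)  # yield control, as real network I/O would
        yield Event("token", token)
    yield Event("done", "")

async def main():
    tokens = []
    async for event in run_agent("Hello!"):
        if event.type == "token":
            tokens.append(event.content)
    return " ".join(tokens)

print(asyncio.run(main()))  # echo: Hello!
```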

Features

  • Agent ensemble - LLM, ReAct, Sequential, Parallel, and Loop agents
  • Event streaming - Async event-driven architecture with real-time streaming
  • Composer - Conduct entire agent orchestras declaratively with YAML
  • Tools - Function tools, filesystem tools, agent-as-tool, shell, and long-running tool support
  • Planners - Choreograph task execution with PlanReAct and TaskPlanner strategies
  • Skills - Reusable, composable agent repertoires
  • MCP - Model Context Protocol integration for tool servers
  • A2A - Agent-to-Agent protocol for cross-service harmonization
  • Memory - Pluggable memory stores for persistent agent context
  • Tracing - Built-in support for Langfuse, LangSmith, and custom callbacks
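The "pluggable memory stores" bullet describes a common pattern: session storage hidden behind a small interface so backends can be swapped. The sketch below is illustrative — only the `InMemorySessionService` name appears in the quickstart, and its real interface may differ; the `SessionService` protocol here is an assumption:

```python
# Minimal sketch of a pluggable session store behind a small interface.
# SessionService and its methods are hypothetical; orxhestra's actual
# InMemorySessionService interface may differ.
from typing import Protocol

class SessionService(Protocol):
    def append(self, session_id: str, message: dict) -> None: ...
    def history(self, session_id: str) -> list[dict]: ...

class InMemorySessionService:
    """Dict-backed store; a real deployment might swap in Redis or SQL."""
    def __init__(self):
        self._sessions: dict[str, list[dict]] = {}

    def append(self, session_id: str, message: dict) -> None:
        self._sessions.setdefault(session_id, []).append(message)

    def history(self, session_id: str) -> list[dict]:
        return list(self._sessions.get(session_id, []))

store = InMemorySessionService()
store.append("s1", {"role": "user", "content": "Hello!"})
print(len(store.history("s1")))  # 1
```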

Agents at a glance

Agent            Description
LlmAgent         Chat model agent with tools, instructions, and structured output
ReActAgent       Reasoning + acting loop with automatic tool use
SequentialAgent  Runs sub-agents in order
ParallelAgent    Runs sub-agents concurrently
LoopAgent        Repeats a sub-agent until exit condition
A2AAgent         Connects to remote agents via A2A protocol
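The three composite agents in the table correspond to familiar control-flow patterns. The sketch below shows those patterns with agents modeled as plain async callables — it is framework-agnostic and none of these functions are orxhestra's API:

```python
# Framework-agnostic sketch of the composite patterns: sequential pipes
# each agent's output into the next, parallel runs agents concurrently,
# and loop repeats one agent until an exit condition or max_iterations.
import asyncio

async def sequential(agents, text):
    for agent in agents:
        text = await agent(text)   # output of one agent feeds the next
    return text

async def parallel(agents, text):
    # All agents receive the same input and run concurrently.
    return await asyncio.gather(*(agent(text) for agent in agents))

async def loop(agent, text, done, max_iterations=10):
    for _ in range(max_iterations):
        text = await agent(text)
        if done(text):             # exit condition, like exit_loop
            break
    return text

async def upper(t): return t.upper()
async def exclaim(t): return t + "!"

result = asyncio.run(sequential([upper, exclaim], "ship it"))
print(result)  # SHIP IT!
```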

Composer

Define entire agent orchestras in a single YAML file — no Python wiring needed. Compose LLM agents, loops, pipelines, tools, and review cycles declaratively. The example below builds a coding agent that plans, implements with filesystem + shell access, and self-reviews in a loop:

defaults:
  model:
    provider: openai
    name: gpt-5.4

tools:
  exit:
    builtin: "exit_loop"
  filesystem:
    builtin: "filesystem"
  shell:
    builtin: "shell"

agents:
  planner:
    type: llm
    description: "Plans the implementation steps for the coder agent."
    instructions: |
      Output a numbered list of concrete steps the coder
      should execute. Each step must be an actionable file
      operation or shell command.

  coder:
    type: llm
    description: "Implements code changes with filesystem and shell access."
    instructions: |
      Follow the plan from the previous step exactly.
      Use filesystem tools to create files and shell to
      run commands. Never ask the user to do anything.
    tools:
      - filesystem
      - shell

  reviewer:
    type: llm
    description: "Reviews changes and approves or requests fixes."
    instructions: |
      Check files exist and look correct. If done, call
      exit_loop. Otherwise describe what needs fixing.
    tools:
      - exit

  dev_loop:
    type: loop
    agents: [coder, reviewer]
    max_iterations: 10

  coordinator:
    type: sequential
    agents: [planner, dev_loop]

main_agent: coordinator

runner:
  app_name: coding-agent
  session_service: memory

Run it:

orx orx.yaml
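The control flow this YAML declares — a planner that runs once, then a coder/reviewer loop that repeats until the reviewer calls exit_loop or max_iterations is hit — can be traced in plain Python. The agents below are toy stand-ins so the flow is visible; orxhestra drives real LLM agents instead:

```python
# Toy trace of the coordinator declared in the YAML above: planner runs
# once, then coder and reviewer alternate inside a bounded loop. All
# three functions are illustrative stand-ins for LLM agents.
def planner(task):
    return [f"step {i}: part of {task!r}" for i in range(1, 3)]

def coder(plan, attempt):
    return f"implemented {len(plan)} steps (attempt {attempt})"

def reviewer(work, attempt):
    # Approve on the second pass, mimicking one round of requested fixes.
    return attempt >= 2

def coordinator(task, max_iterations=10):
    plan = planner(task)                 # sequential: planner first
    for attempt in range(1, max_iterations + 1):
        work = coder(plan, attempt)      # loop body: coder...
        if reviewer(work, attempt):      # ...then reviewer "calls exit_loop"
            return work
    return work

print(coordinator("add error handling"))  # implemented 2 steps (attempt 2)
```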

Docker

docker run -e OPENAI_API_KEY=$OPENAI_API_KEY \
  -v ./orx.yaml:/app/orx.yaml \
  nicolaimtlassen/orxhestra

Acknowledgments

This project is built on the shoulders of several outstanding open-source projects and research efforts.

Special thanks to the open-source AI community for pushing the boundaries of what's possible with agent frameworks.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

orxhestra-0.0.5.tar.gz (109.0 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

orxhestra-0.0.5-py3-none-any.whl (124.1 kB)

Uploaded Python 3

File details

Details for the file orxhestra-0.0.5.tar.gz.

File metadata

  • Download URL: orxhestra-0.0.5.tar.gz
  • Upload date:
  • Size: 109.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for orxhestra-0.0.5.tar.gz

Algorithm    Hash digest
SHA256       f7ecebff25e892b7fee7e7f36d2f305fe795dd85b2bcc10e3cf0932730c93db2
MD5          49657b49a5cf8c6b523691d51cf0ee93
BLAKE2b-256  49d3a7409840f712a7ffd0770608955251b35863ca1aa941d8dd30f83a7772c5

See more details on using hashes here.

Provenance

The following attestation bundles were made for orxhestra-0.0.5.tar.gz:

Publisher: publish.yml on NicolaiLassen/orxhestra

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file orxhestra-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: orxhestra-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 124.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for orxhestra-0.0.5-py3-none-any.whl

Algorithm    Hash digest
SHA256       faca3757420b7495e268651e0e79c933b63bf45fdeae6daa3eaba461aa5a1d3a
MD5          d066b21d109f4720c2dab1bca4b23ec3
BLAKE2b-256  341810b7b2ce308f0f142d31002bd23e861e5a4cd3fa2f282fdb1cc6e3a21eb4

See more details on using hashes here.

Provenance

The following attestation bundles were made for orxhestra-0.0.5-py3-none-any.whl:

Publisher: publish.yml on NicolaiLassen/orxhestra

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
