Python SDK for the Hybro Gateway API — discover and communicate with cloud A2A agents.

Hybro Hub

Your local & remote AI agents — private, powerful, unified.

Hybro Hub is a lightweight daemon that connects your local AI agents to hybro.ai — so you can use local and cloud agents side by side in one portal, with full control over where your data goes.

pip install hybro-hub

The Problem

AI agents today force a choice:

  • Cloud platforms (ChatGPT, Devin, Cursor Cloud) are powerful but require sending your data to third-party servers.
  • Local runtimes (Ollama, LM Studio) keep data private but are isolated — no access to specialized cloud agents, no shared UI.

You shouldn't have to choose between privacy and power.

The Solution

Hybro Hub bridges local and cloud. Open hybro.ai, see your local Ollama model next to cloud agents like a legal reviewer or code analyst. Chat with any of them. Your local agents process on your machine — your data never leaves. Cloud agents are there when you need more capability.

One portal. Your choice, per conversation.


Get Started in 5 Minutes

1. Install

pip install hybro-hub

2. Get your API key

Go to hybro.ai/d/discovery-api-keys → API Keys → Generate New Key. Copy the key (starts with hybro_).

3. Start the hub

hybro-hub start --api-key hybro_your_key_here

The hub starts as a background daemon and returns you to the prompt immediately. Logs are written to ~/.hybro/hub.log. The API key is saved to ~/.hybro/config.yaml — subsequent starts don't need it.

hybro-hub status   # check local daemon state and cloud connection
hybro-hub stop     # stop the daemon gracefully

4. Launch a local agent

Start a local LLM as an A2A agent (requires Ollama installed):

hybro-hub agent start ollama --model llama3.2

You'll see:

🔗 Connected to hybro.ai
📡 Found 1 local agent:
   • My Ollama Chat (llama3.2) — localhost:10010
Agents synced to hybro.ai. Open hybro.ai to start chatting.

5. Open hybro.ai

Refresh hybro.ai. Your local agent appears alongside cloud agents:

  ☁️  Legal Contract Reviewer          (cloud)
  ☁️  Code Review Pro                  (cloud)
  🏠  My Ollama Chat (llama3.2)     (local · online)

Add it to a room, send a message. The response streams back with a 🏠 Local badge — your data never left your machine.


How It Works

┌─────────────────────────────────────────────────┐
│  Your Machine                                   │
│                                                 │
│   Hybro Hub (background daemon)                 │
│   ├── Your local agents (Ollama, custom, etc.)  │
│   ├── Privacy router                            │
│   └── Relay client ──outbound HTTPS only──┐     │
│                                           │     │
└───────────────────────────────────────────┼─────┘
                                            │
                                            ▼
┌─────────────────────────────────────────────────┐
│  hybro.ai Cloud                                 │
│                                                 │
│   ├── Web portal (your browser)                 │
│   ├── Cloud agents (marketplace)                │
│   ├── Relay service (routes to your hub)        │
│   └── Message history & rooms                   │
└─────────────────────────────────────────────────┘

Key properties:

  • Outbound-only — the hub initiates all connections. No inbound ports, no firewall changes, works behind NAT.
  • Portal-first — you always use hybro.ai. No localhost URLs, no mode switching. Local agents just appear as more agents in the same portal.
  • A2A protocol — local and cloud agents speak the same Agent-to-Agent protocol. Any A2A-compatible agent works.
  • Graceful degradation — if the hub is offline, cloud agents still work. Local agents show as "offline" and messages queue until the hub reconnects.

Privacy by Architecture

Hybro Hub doesn't just promise privacy — the architecture enforces it.

Your data stays local when you use local agents. Messages to local agents route through the relay to your hub, get processed entirely on your machine, and only the response travels back. The cloud relay sees message metadata (routing info), not your content.

Privacy indicators in the UI

Every message in hybro.ai shows where it was processed:

  • 🏠 Local (green) — processed on your machine, data did not leave
  • ☁️ Cloud (blue) — processed by a cloud agent via hybro.ai

Sensitivity detection

The hub scans outbound messages for sensitive content before they reach cloud agents:

  • PII detection — emails, phone numbers, SSNs, credit cards, API keys
  • Custom keywords — configure terms like "medical", "financial", "confidential"
  • Custom patterns — add regex rules for project-specific data (e.g., PROJ-\d{4})

The current release (Phase 2) logs detections; active blocking and anonymization are on the roadmap.
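The hub's detectors are internal, but the idea can be sketched with the standard library. Everything below is illustrative: the rule names, the specific regexes, and the `detect_sensitive` function are assumptions mirroring the config example above, not the hub's actual code.

```python
import re

# Illustrative rules mirroring the config example in this README.
SENSITIVE_KEYWORDS = {"medical", "financial", "confidential"}
SENSITIVE_PATTERNS = [
    re.compile(r"\bPROJ-\d{4}\b"),               # custom project IDs
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email addresses
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # US SSNs
]

def detect_sensitive(text: str) -> list[str]:
    """Return a label for every rule an outbound message trips."""
    hits = [kw for kw in SENSITIVE_KEYWORDS if kw in text.lower()]
    hits += [p.pattern for p in SENSITIVE_PATTERNS if p.search(text)]
    return hits

print(detect_sensitive("Send the confidential report for PROJ-1234 to bob@example.com"))
```

A message that trips no rule returns an empty list, which in the current phase would simply mean nothing is logged before the message reaches a cloud agent.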


CLI Reference

hybro-hub start

Start the hub daemon. Connects to hybro.ai, discovers local agents, and syncs them to the cloud. The process detaches immediately and runs in the background.

hybro-hub start --api-key hybro_...

The API key is saved to ~/.hybro/config.yaml after first use — subsequent starts don't need it. Only one instance can run per machine; a second start will exit with an error if the daemon is already running.

Options:

| Option | Description |
| --- | --- |
| --api-key | Hybro API key (also saved to ~/.hybro/config.yaml) |
| --foreground, -f | Run in the foreground instead of daemonizing (useful for debugging) |

Daemon logs are written to ~/.hybro/hub.log (rotating, max 10 MB × 3 files).

hybro-hub stop

Gracefully stop the background daemon. Sends SIGTERM and waits up to 10 seconds before sending SIGKILL. Removes the PID lock file on success so that hybro-hub status correctly shows "Stopped".

hybro-hub stop

hybro-hub status

Show the state of the local daemon and its connection to the cloud relay.

hybro-hub status

Example output when running:

  Local daemon:  Running (PID 12345)
  Log file:      /Users/you/.hybro/hub.log
  Cloud relay:   Online (hub abc123...)
  Agents:        3 total (3 active, 0 inactive)

Example output when stopped:

  Local daemon:  Stopped
  Cloud relay:   Online (hub abc123...)
  Agents:        4 total (3 active, 1 inactive)

The cloud relay section queries hybro.ai directly, so it reflects the last known state even when the local daemon is not running.

hybro-hub agents

List all discovered local agents and their health status.

hybro-hub agents

hybro-hub agent start

Launch a local A2A agent from a bundled adapter. Supported adapters: ollama, openclaw, n8n.

Ollama — local LLM (requires Ollama):

hybro-hub agent start ollama
hybro-hub agent start ollama --model mistral:7b --port 10020 --system-prompt "You are a helpful assistant"

OpenClaw — AI coding agent (requires OpenClaw):

hybro-hub agent start openclaw
hybro-hub agent start openclaw --thinking medium --agent-id main

n8n — workflow automation (requires a running n8n instance):

hybro-hub agent start n8n --webhook-url http://localhost:5678/webhook/my-agent

Common options:

| Option | Default | Description |
| --- | --- | --- |
| --port | 10010 | Port for the A2A agent server |
| --name | auto | Agent display name |
| --timeout | varies | Request timeout in seconds |

Adapter-specific options:

| Option | Adapter | Description |
| --- | --- | --- |
| --model | ollama | Ollama model (default: llama3.2) |
| --system-prompt | ollama | Custom system prompt |
| --thinking | openclaw | Thinking level: off/minimal/low/medium/high/xhigh |
| --agent-id | openclaw | OpenClaw agent ID |
| --openclaw-path | openclaw | Path to the openclaw binary |
| --webhook-url | n8n | Webhook URL (required) |

Requires the a2a-adapter package: pip install a2a-adapter


Configuration

The hub reads from ~/.hybro/config.yaml. A full example:

# Cloud connection
cloud:
  api_key: "hybro_..."
  gateway_url: "https://api.hybro.ai"

# Agent discovery
agents:
  auto_discover: true # Probe localhost ports for A2A agents
  auto_discover_exclude_ports: # Skip non-agent ports
    - 22 # SSH
    - 3306 # MySQL
    - 5432 # PostgreSQL
  local: # Always-registered agents
    - name: "My Custom Agent"
      url: "http://localhost:9001"

# Privacy
privacy:
  default_routing: "local_first"
  sensitive_keywords: ["medical", "financial", "confidential"]
  sensitive_patterns: ["PROJ-\\d{4}"]

# Heartbeat (seconds)
heartbeat_interval: 30

You can also set the API key and gateway URL via environment variables:

export HYBRO_API_KEY="hybro_..."
export HYBRO_GATEWAY_URL="https://api.hybro.ai"

Bring Your Own Agent

Any agent that speaks the A2A protocol works with Hybro Hub.

Auto-discovery

With auto_discover: true (the default), the hub automatically finds A2A agents running on localhost by probing listening TCP ports for agent cards at /.well-known/agent.json or /.well-known/agent-card.json. Just start your agent — the hub will find it.

Manual registration

Add agents to ~/.hybro/config.yaml:

agents:
  local:
    - name: "My Research Agent"
      url: "http://localhost:8001"
    - name: "Team Agent"
      url: "http://192.168.1.50:8080" # LAN agents work too

Building an A2A agent

Use the a2a-python SDK to build a compatible agent:

from a2a.server.apps import A2AStarletteApplication
from a2a.server.request_handlers import DefaultRequestHandler

# my_card is your AgentCard; my_executor is your AgentExecutor implementation
app = A2AStarletteApplication(
    agent_card=my_card,
    http_handler=DefaultRequestHandler(agent_executor=my_executor),
)

The hub discovers it automatically and syncs it to hybro.ai.


Hybro SDK (Python Client)

The repo also ships hybro_hub — a Python client for calling cloud agents programmatically via the Hybro Gateway API. Use this when you want to integrate cloud agents into your own code, outside of the hub.

Quickstart

import asyncio
from hybro_hub import HybroGateway

async def main():
    async with HybroGateway(api_key="hybro_...") as gw:
        agents = await gw.discover("legal contract review")
        async for event in gw.stream(agents[0].agent_id, "Review this NDA"):
            print(event.data)

asyncio.run(main())

Methods

| Method | Description |
| --- | --- |
| discover(query, *, limit=None) | Search for agents by natural language. Returns list[AgentInfo]. |
| send(agent_id, text, *, context_id=None) | Send a message, get the full response. Returns dict. |
| stream(agent_id, text, *, context_id=None) | Stream a response via SSE. Yields StreamEvent. |
| get_card(agent_id) | Fetch an agent's A2A card. Returns dict. |

Error handling

from hybro_hub import AuthError, RateLimitError, AgentNotFoundError

try:
    result = await gw.send(agent_id, "Hello")
except AuthError:
    print("Invalid API key")
except AgentNotFoundError:
    print("Agent not found")
except RateLimitError as e:
    print(f"Rate limited — retry after {e.retry_after}s")

| Exception | Status | Cause |
| --- | --- | --- |
| AuthError | 401 | Invalid API key |
| AccessDeniedError | 403 | No access to agent |
| AgentNotFoundError | 404 | Agent not found / inactive |
| RateLimitError | 429 | Rate limit exceeded |
| AgentCommunicationError | 502 | Upstream agent error |
| GatewayError | any | Base class |

Requirements

  • Python 3.11+
  • Ollama (optional, for the built-in Ollama adapter)
  • A hybro.ai account with an API key

Development

git clone https://github.com/hybro-ai/hybro-hub.git
cd hybro-hub
pip install -e ".[dev]"
pytest

License

Apache License 2.0 — see LICENSE for details.
