Human-AI Collaboration Protocol (HCP)
What is the Human-AI Collaboration Protocol (HCP)?
Just as MCP (Model Context Protocol) standardizes how AI connects to Systems (Databases, Tools), HCP standardizes how AI connects to Humans.
1. The Problem: Asymmetric Collaboration
Today’s human-AI collaboration imposes a fundamental asymmetry: the human asks and the AI answers; the human commands and the AI executes. This architecture worked when AI capabilities were limited, but it creates a collaboration ceiling as agents become more capable.
The bottleneck is twofold. Architecturally, half of the collaboration action space is missing by design. Communicatively, the interaction is forced through a low-bandwidth, ambiguous channel: unstructured chat.
A. The Input Ambiguity
When an Agent requires a specific parameter to proceed—such as a confidence integer or a boolean confirmation—it is currently forced to ask via open-ended text.
- Agent: "Please set the confidence level (0-100)."
- User: "Just make it safe."
- Result: The Agent must rely on error-prone parsing or hallucination to interpret the input.
B. The Missing Reverse Gear
Current interfaces assume agents only respond to human requests. As agents become more capable, they need the reverse: to delegate research tasks back to humans (“I need domain expertise on X”), to request strategic decisions (“Should we optimize for speed or accuracy?”), or to offer suggestions (“I found three approaches; I suggest trying them one by one”).
Today, all these cases are forced through chat, which is both ambiguous (the agent cannot guarantee how the human will interpret or format their response) and stateless (the agent has no way to specify whether this is blocking or background, or to track resolution status).
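To make the gap concrete, here is a hedged sketch of what a structured reverse-direction request could look like. The field names (refId, blocking, direction) are illustrative assumptions for this sketch, not HCP's defined wire format:

```python
# Hypothetical HCP-style request payload (field names are illustrative).
# Instead of the free-text "Please set the confidence level (0-100)",
# the Agent declares the exact shape of the answer it needs.
request = {
    "refId": "req-001",        # lets the Agent track resolution status
    "direction": "pull",       # Agent is requesting data from the human
    "topology": "CONTINUOUS",  # a magnitude, not free text
    "params": {"label": "Confidence", "min": 0, "max": 100, "unit": "%"},
    "blocking": True,          # Agent waits for this before proceeding
}

def resolve(payload, raw_value):
    """Clamp and type-check the human's answer against the declared shape."""
    lo, hi = payload["params"]["min"], payload["params"]["max"]
    return max(lo, min(hi, float(raw_value)))

print(resolve(request, "80"))  # a float guaranteed to lie in [0, 100]
```

Because the client enforces the declared range and type, the Agent never has to parse "Just make it safe" when it needs a number.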
2. The Solution: Symmetric Data Topology
HCP does not define UI.
It treats Input and Output as the same mathematical object: The DataShape. The Agent defines the container (The Shape); the Client adapts its rendering to fill it.
By the same symmetry, once the Agent defines the Shape, the Direction of the exchange determines whether it is a "Read" or a "Write" operation.
This guarantees that the Agent works across any substrate—Web, VR, Mobile, or Brain-Computer Interface—without code changes.
The Four Fundamental Shapes
| Shape | The Physics | Web Rendering (Example) | VR Rendering (Example) |
|---|---|---|---|
| Discrete | Selection from a finite set (Entropy). | Dropdown / Radio / Badge | Floating Orbs |
| Continuous | Magnitude within a range (Scalar). | Slider / Gauge | A Dial / Throttle |
| Symbolic | Raw encoded data (MIME-typed). | Text Field / File Upload | Dictation / Hologram |
| Composite | Recursive structure. | Form / Dashboard | Control Panel |
Because HCP defines Topology (Physics) rather than Widgets (Pixels), the Agent becomes substrate-independent. The exact same Python code that requests a "Vector Field" can manifest as:
- A File Uploader on a 2D Web Browser.
- A Holographic Cube in an Unreal Engine VR simulation.
- A Neural Input in a future Brain-Computer Interface.
The Agent acts as the Director; the Client acts as the Renderer.
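The four shapes can be sketched as a tiny set of types. This is a sketch only; the class names and fields here are illustrative assumptions, not the HCP SDK's actual API:

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative types for the four fundamental shapes (not the SDK's classes).

@dataclass
class Discrete:
    options: List[str]             # selection from a finite set

@dataclass
class Continuous:
    min: float
    max: float
    unit: str = ""                 # magnitude within a range

@dataclass
class Symbolic:
    mime_type: str = "text/plain"  # raw encoded data

@dataclass
class Composite:
    children: List[object] = field(default_factory=list)  # recursive structure

# A "3D Vector" is just a Composite of three Continuous axes:
vector3 = Composite(children=[Continuous(0.0, 1.0) for _ in range(3)])
```

Note how the Composite case makes the system recursive: a form is a Composite of fields, a dashboard is a Composite of forms, and so on.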
3. Installation
```shell
pip install git+https://github.com/Stanfording/HCP.git
```
4. Usage: The Adapter Pattern
Integration Guide
HCP is designed as a Stateless Mixin. You do not rewrite your Agent; you simply inherit HCPMixin to give it the ability to speak "Topology" instead of just text.
The Architecture: "The Forward Pass"
We borrow the design pattern from PyTorch.
- think(): Your Agent generates the Structure (Shape + Physics).
- hcp_forward(): The SDK validates and wraps that structure into the Protocol.
Example: A Dynamic Research Agent
In this example, the LLM determines the Shape and the Vector. The Python code is just a pass-through adapter.
```python
from hcp import HCPMixin

class AgenticResearch(HCPMixin):

    # 1. The Thinking Layer
    def think(self, context):
        """
        The LLM outputs a structured decision (JSON).
        It decides: "I need a Slider" (Shape) and "It is urgent" (Vector).
        """
        # ... (LLM Prompting Logic) ...
        # Simulating LLM Output:
        return {
            "decision": "request_input",
            "topology": "CONTINUOUS",  # <--- LLM chose the Shape
            "params": {
                "label": "Risk Tolerance",
                "min": 0, "max": 100,
                "unit": "%"
            },
            "physics": {
                "rationale": "I need to calibrate risk.",
                "criticality": 0.8,
                # LLM decided this is a "Productivity" task
                "vector": {"productivity": 1.0, "information": 0.5, "affect": 0.1}
            }
        }

    # 2. The HCP Layer (The Dynamic Adapter)
    def hcp_forward(self, thought):
        # Case A: Agent wants to Request Data (Pull)
        if thought["decision"] == "request_input":
            # The SDK method is selected dynamically based on LLM choice
            if thought["topology"] == "CONTINUOUS":
                return self.hcp_request_continuous(
                    # Pass the LLM's reasoning directly to the Protocol
                    rationale=thought["physics"]["rationale"],
                    vector=thought["physics"]["vector"],
                    criticality=thought["physics"]["criticality"],
                    # Pass the LLM's shape parameters
                    **thought["params"]
                )
            elif thought["topology"] == "DISCRETE":
                return self.hcp_request_discrete(
                    rationale=thought["physics"]["rationale"],
                    vector=thought["physics"]["vector"],
                    **thought["params"]
                )
            # ...

        # Case B: Agent wants to Offer Data (Push)
        if thought["decision"] == "offer_data":
            # ... similar logic for hcp_offer_symbolic ...
            pass

    # 3. The Loop
    def run(self, input_text):
        thought = self.think(input_text)
        payload = self.hcp_forward(thought)
        return payload
```
The Client Response (The Loop)
When the Client responds, hcp_resolve performs two tasks:
- Type Guarding: Decodes the payload into a usable Python value.
- Data Logging: Emits the "Taste Signal" via hcp_on_log.
```python
# Inside your Agent Class...
def on_response(self, payload):
    # 1. Resolve & Log
    # This automatically decodes the JSON and calls hcp_on_log()
    data, usedProposal = self.hcp_resolve(payload)

    # 2. Feed back into the Brain
    # The Agent doesn't need complex logic here.
    # Just feed the result (Data OR Text) back into the context.
    if usedProposal:
        # Alignment: User provided structured data (80.0)
        self.memory.add(role="user", content=f"User set value to {data}")
    else:
        # Friction: User typed "Just make it safe"
        self.memory.add(role="user", content=data)

    # 3. Continue the Loop
    self.step()
```
Optional: Connecting the Database
You can override hcp_on_log to save the training data.
```python
import json

def hcp_on_log(self, log_entry):
    """
    This hook runs automatically every time hcp_resolve() is called.
    log_entry contains: { refId, rationale, usedProposal, value, timestamp }
    """
    # Example: Save to your JSONL training file
    with open("taste_dataset.jsonl", "a") as f:
        f.write(json.dumps(log_entry) + "\n")
```
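Once logs accumulate, the same JSONL file can be read back to measure alignment, e.g. the fraction of requests where the user engaged with the structured UI rather than falling back to chat. A minimal sketch, assuming the usedProposal field from the log schema above:

```python
import json

def acceptance_rate(lines):
    """Fraction of logged interactions where the user used the proposed UI.

    `lines` is any iterable of JSONL strings, e.g. open("taste_dataset.jsonl").
    """
    entries = [json.loads(line) for line in lines if line.strip()]
    if not entries:
        return 0.0
    return sum(1 for e in entries if e["usedProposal"]) / len(entries)

sample = [
    '{"refId": "r1", "usedProposal": true,  "value": 80.0}',
    '{"refId": "r2", "usedProposal": false, "value": "Just make it safe"}',
]
print(acceptance_rate(sample))  # 0.5
```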
5. The Architecture: Why Use HCP?
A. The Recursive Guarantee (Elastic Fallback)
The true power of HCP lies in the Composite Shape. Because shapes are recursive (a Composite can contain Composites), we can model complex real-world structures, like a Physics Simulation or a Medical Diagnosis, without worrying about the Client's capabilities.
This enables Elastic Fallback, arguably the most important feature for long-term robustness.
The "3D Gizmo" Problem
Imagine an Agent requesting a 3D Vector (X, Y, Z) to control a robotic arm.
- The High-Fidelity Client (e.g., Apple Vision Pro): Recognizes the semantic label "3D Vector." It renders a holographic manipulation gizmo floating in space. The user grabs it and moves it.
- The Low-Fidelity Client (e.g., a Text Terminal): Does not have a "holographic gizmo." In a standard protocol, the app would crash.
- The HCP Client: Looks at the Composite Shape and performs Molecular Decomposition, breaking the complex molecule down into its atomic parts until it finds a shape it can render:
  - GUI Client (Web): Renders three sleek 1D Sliders.
  - XR Client (Vision Pro): Renders three floating holographic dials.
  - TUI Client (Terminal): Renders three ASCII sliders controlled by Arrow Keys.

```
X-Axis: [<-- ███░░░░░░░ -->] 30%
Y-Axis: [<-- ██████░░░░ -->] 60%
Z-Axis: [<-- ████░░░░░░ -->] 40%
```

The Result: Indestructible Interaction
The Agent gets its data (X, Y, Z) regardless of the substrate.
Crucially, Type Safety is preserved. Even in the Terminal, the user cannot type "hello"; the interface restricts them to a Float within the 0.0 - 1.0 range.
Through the Composite DataShape, we can scale from the simplest interaction (asking for a boolean) to the most complex (orchestrating a fluid dynamics simulation in the Metaverse) without changing a single line of the Agent's reasoning logic.
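The fallback itself is a short recursion. The sketch below is illustrative: the dict-shaped payloads and the can_render predicate are assumptions for this example, not the HCP SDK's actual client API:

```python
def decompose(shape, can_render):
    """Recursively break a Composite into shapes the client can render.

    `shape` is a dict like {"topology": "COMPOSITE", "children": [...]};
    `can_render` is a capability predicate supplied by the client.
    (Both are illustrative, not the normative HCP client interface.)
    """
    if can_render(shape):
        return [shape]                  # high-fidelity client: render as-is
    if shape["topology"] == "COMPOSITE":
        atoms = []
        for child in shape["children"]:
            atoms.extend(decompose(child, can_render))
        return atoms                    # low-fidelity: render the atomic parts
    return [shape]                      # atomic shapes are the base case

# A 3D vector as a Composite of three Continuous axes:
vector3 = {
    "topology": "COMPOSITE",
    "children": [{"topology": "CONTINUOUS", "min": 0.0, "max": 1.0}
                 for _ in range(3)],
}

# A terminal client that can only render 1D sliders:
tui_can_render = lambda s: s["topology"] == "CONTINUOUS"
print(len(decompose(vector3, tui_can_render)))  # 3 (one slider per axis)
```

A Vision Pro client whose predicate accepts the whole Composite would get back the single gizmo-sized shape instead; the Agent's code is identical in both cases.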
HCP is not just versatile; it is future-proof.
B. The "Co-Up Memory"
Just as Cursor/Copilot learns code completion by observing whether you hit Tab (Accept) or keep typing (Reject), HCP records whether users engage with the UI or fall back to Chat.
However, HCP data is richer than code completion:
- Accept -> Agent's rationale explains why.
- Reject -> User's new prompt explains why.
With a history of these logs, every interaction is a high-fidelity training signal.
- No need to collect user's engagement depth: vibe coder vs. technical coder.
- No need to collect user's personality: Steve Jobs vs. hacker.
- No need to collect user's favorite colors.
- No need to collect user's go-to moves.
A simple accept/reject + rationale tells it all.
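As a sketch of how one log entry becomes a labeled training pair (the function and output fields here are illustrative, assuming the log schema shown earlier):

```python
def to_training_example(log_entry, followup_prompt=None):
    """Turn one accept/reject log entry into a labeled training pair.

    On accept, the Agent's own rationale explains the positive label;
    on reject, the user's next chat message explains the negative one.
    (Hypothetical helper; the pairing scheme is illustrative.)
    """
    if log_entry["usedProposal"]:
        return {"signal": "accept",
                "explanation": log_entry["rationale"],
                "value": log_entry["value"]}
    return {"signal": "reject",
            "explanation": followup_prompt,
            "value": log_entry["value"]}

ex = to_training_example(
    {"refId": "r1", "rationale": "I need to calibrate risk.",
     "usedProposal": True, "value": 80.0})
print(ex["signal"])  # accept
```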
6. Advanced Scenarios: The "Crazy" Tests
Because HCP decouples Parameters (Topology) from Rendering (Client), it can drive complex environments like Physics Engines or XR Simulations without the AI Agent needing to know graphics programming.
Test Case: The 3D Water Simulation
Concept: State Mutation (The "Living System").
Sometimes, an Agent needs the user to intervene in a running simulation. The Agent provides the Initial Conditions (Entropy), and the User provides the External Force.
- Scenario: The Agent initializes a particle system with random momentum. It asks the user to "sculpt" the flow using their hands.
- Mechanism: a REQUEST whose value contains the full particle state.
The Agent Code
```python
import random
from hcp import HCPMixin

class PhysicsDirector(HCPMixin):
    def initialize_simulation(self):
        # 1. The Agent creates the "Cup" (10,000 Water Molecules)
        # It assigns random XYZ positions and Force Vectors to simulate entropy.
        particles = []
        for _ in range(10000):
            particles.append({
                "pos": [random.uniform(0, 10) for _ in range(3)],
                "vel": [random.uniform(-1, 1) for _ in range(3)]
            })

        return self.hcp_request_symbolic(
            rationale="User must shape the fluid flow manually.",
            label="Sculpt the Water",
            # The Client uses this to boot up the Physics Engine (Unity/Unreal/WebGL)
            mimeType="application/x-particle-system+json",
            # THE HYBRID TRICK: we send the LIVE state.
            value={
                "physics_constants": {"viscosity": 0.95, "gravity": -9.8},
                "particles": particles
            }
        )
```
The Client Experience
- Render: The Client receives the state. It spawns 10,000 particles.
- Simulation: The Client's Physics Engine takes over. The particles immediately start moving and colliding based on the Agent's initial velocity vectors.
- Interaction: The user sticks their hand (or mouse) into the simulation. The hand acts as a Collider, displacing water and altering velocity vectors in real-time.
- Return: When the user hits "Submit," the Client freezes the simulation and returns the New Snapshot of the particle array.
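The return step can be sketched as a small client-side helper. This is a hypothetical function; the payload fields simply mirror the request above and are not the normative wire format:

```python
def submit_snapshot(request, particles):
    """Client-side: freeze the simulation and return the new particle
    snapshot, tagged with the original request's refId so the Agent can
    match it to its pending REQUEST. (Illustrative helper, not SDK API.)"""
    return {
        "refId": request.get("refId"),
        "mimeType": request.get("mimeType",
                                "application/x-particle-system+json"),
        "value": {"particles": particles},
    }

snapshot = submit_snapshot(
    {"refId": "sim-1"},
    [{"pos": [1.0, 2.0, 3.0], "vel": [0.0, -0.5, 0.0]}],
)
print(snapshot["refId"])  # sim-1
```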
The Agent acts as the Director (Setting the Scene); the Client acts as the Renderer.