# Krabby Compute - Parkour

Parkour policy inference client and utilities for the Krabby quadruped robot.

## Overview
This package provides the parkour policy inference client that:
- Connects to HAL server via ZMQ
- Polls hardware observations
- Runs parkour policy inference
- Sends joint commands back to HAL server
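The poll → infer → command cycle above can be sketched as a rate-limited loop. This is a minimal illustration, not the package's actual implementation; `poll_observation`, `infer`, and `send_command` are hypothetical stand-ins for the HAL client and policy model calls.

```python
import time


def inference_loop(poll_observation, infer, send_command,
                   running, control_rate=100.0):
    """Poll -> infer -> command loop paced at control_rate Hz.

    poll_observation/infer/send_command are hypothetical stand-ins
    for the HAL client and policy model calls.
    """
    period = 1.0 / control_rate
    while running():
        start = time.monotonic()
        obs = poll_observation()        # hardware observation from HAL
        if obs is not None:
            action = infer(obs)         # policy forward pass
            send_command(action)        # joint commands back to HAL
        # Sleep off the remainder of the control period to hold the rate.
        elapsed = time.monotonic() - start
        if elapsed < period:
            time.sleep(period - elapsed)
```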
## Components

### Inference Client (`compute.parkour.inference_client.ParkourInferenceClient`)
- Runs in a separate thread
- Manages HAL client connection
- Handles inference loop (poll → infer → command)
### Policy Interface (`compute.parkour.policy_interface.ParkourPolicyModel`)

- Loads parkour policy checkpoints
- Runs inference on observations
- Uses `OnPolicyRunnerWithExtractor` for model loading
### Mappers (`compute.parkour.mappers`)

- `hardware_to_model`: Maps Krabby hardware observations to the parkour model format
- `model_to_hardware`: Maps parkour model actions to Krabby joint positions
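A common job for such mappers is reordering joints between the hardware's wiring order and the ordering the policy was trained on. The sketch below is purely illustrative: the joint names, the four-joint ordering, and the function bodies are assumptions, not the definitions in `compute.parkour.mappers`.

```python
import numpy as np

# Hypothetical joint orderings; the real ones come from the Krabby
# hardware layout and the parkour training configuration.
HARDWARE_JOINT_ORDER = ["FL_hip", "FL_thigh", "FL_calf", "FR_hip"]
MODEL_JOINT_ORDER = ["FR_hip", "FL_hip", "FL_thigh", "FL_calf"]

# Index array: position i of the model vector reads hardware index _HW_TO_MODEL[i].
_HW_TO_MODEL = np.array(
    [HARDWARE_JOINT_ORDER.index(name) for name in MODEL_JOINT_ORDER]
)


def hardware_to_model(joint_pos_hw: np.ndarray) -> np.ndarray:
    """Reorder hardware joint positions into the model's ordering."""
    return joint_pos_hw[_HW_TO_MODEL]


def model_to_hardware(action_model: np.ndarray) -> np.ndarray:
    """Inverse mapping: model actions back into hardware joint order."""
    out = np.empty_like(action_model)
    out[_HW_TO_MODEL] = action_model
    return out
```

The two directions are exact inverses, so a round trip through both mappers returns the original vector.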
### Types (`compute.parkour.parkour_types`)

- `ParkourObservation`: Observation in training format
- `ParkourModelIO`: Combined input for policy inference
- `InferenceResponse`: Policy inference output with action tensor
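These types read like small dataclass-style containers. A minimal sketch, assuming dataclasses and illustrative field names (the package's actual fields are not shown in this README; the 753/12 shapes come from the usage example's `obs_dim` and `action_dim`):

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class ParkourObservation:
    """Observation in the parkour training format (fields are illustrative)."""
    obs: np.ndarray  # flat observation vector, e.g. shape (753,)


@dataclass
class ParkourModelIO:
    """Combined input handed to policy inference."""
    observation: ParkourObservation


@dataclass
class InferenceResponse:
    """Policy inference output."""
    action: np.ndarray  # action tensor, e.g. shape (12,)
```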
## Installation

### From source (development)

```bash
cd compute/parkour
pip install -e .
```

### From wheel

```bash
pip install krabby_compute_parkour-0.1.0-py3-none-any.whl
```
## Usage

### As a library

```python
from compute.parkour.inference_client import ParkourInferenceClient
from compute.parkour.policy_interface import ModelWeights
from hal.client.config import HalClientConfig

# Configure HAL client
hal_config = HalClientConfig(
    observation_endpoint="inproc://hal_observation",
    command_endpoint="inproc://hal_commands",
)

# Configure model
model_weights = ModelWeights(
    checkpoint_path="/path/to/model.pt",
    action_dim=12,
    obs_dim=753,
)

# Create client
client = ParkourInferenceClient(
    hal_client_config=hal_config,
    model_weights=model_weights,
    control_rate=100.0,
    device="cuda",
    transport_context=transport_context,  # From HAL server
)

# Initialize and start
client.initialize()
client.start_thread(running_flag=lambda: True)
```
## Architecture

```text
┌──────────────────────────────────────┐
│      ParkourInferenceClient          │
│                                      │
│  ┌────────────┐    ┌──────────────┐  │
│  │ HAL Client │───▶│ Policy Model │  │
│  └────────────┘    └──────────────┘  │
│        │                  │          │
│        │ observations     │ actions  │
│        ▼                  ▼          │
│  ┌──────────────────────────────┐    │
│  │  Hardware ↔ Model Mappers    │    │
│  └──────────────────────────────┘    │
└──────────────────────────────────────┘
```
## Dependencies

- `krabby-hal-client`: For HAL communication
- `torch`: For policy inference
- `numpy`: For numerical operations
## Development

### Running Tests

```bash
pytest tests/
```

### Building Wheel

```bash
python -m build
```
## Notes
- Designed to run in a separate thread from HAL server
- Supports both inproc (same-process) and TCP (distributed) communication
- Zero-copy operations where possible for performance
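Switching between same-process and distributed operation comes down to the endpoint strings in the HAL client config. A sketch, assuming `HalClientConfig` accepts arbitrary ZMQ endpoint URIs (the TCP host and ports below are illustrative, not defaults of the package):

```python
from hal.client.config import HalClientConfig

# Same-process communication with the HAL server (enables zero-copy):
inproc_config = HalClientConfig(
    observation_endpoint="inproc://hal_observation",
    command_endpoint="inproc://hal_commands",
)

# Distributed setup with the HAL server on another machine
# (host and ports are illustrative):
tcp_config = HalClientConfig(
    observation_endpoint="tcp://192.168.1.10:5555",
    command_endpoint="tcp://192.168.1.10:5556",
)
```

Note that `inproc://` endpoints require the client and server to share the same ZMQ context, which is why `ParkourInferenceClient` takes a `transport_context` from the HAL server in the usage example above.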