# Krabby HAL Server - Jetson
HAL server implementation for Jetson robot deployment with integrated parkour policy inference.
## Overview
This package provides an entry point that runs both the Jetson HAL server and parkour inference client in the same process using inproc ZMQ for zero-copy communication.
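As a rough sketch of how this same-process wiring works (the message fields, payloads, and loop structure below are illustrative, not the package's actual protocol — only the endpoint names come from this README):

```python
import threading
import time
import zmq

# inproc transport only connects sockets created from the SAME
# zmq.Context, which is what enables the zero-copy handoff.
ctx = zmq.Context.instance()

OBS_EP = "inproc://hal_observation"  # default observation endpoint
CMD_EP = "inproc://hal_commands"     # default command endpoint

received_command = {}

def hal_server():
    """Toy stand-in for the HAL server: publish observations on a PUB
    socket and stop once a joint command arrives on the PULL socket."""
    pub = ctx.socket(zmq.PUB)
    pub.bind(OBS_EP)
    pull = ctx.socket(zmq.PULL)
    pull.bind(CMD_EP)
    tick = 0
    while True:
        # Illustrative payload, not the real observation schema.
        pub.send_json({"tick": tick, "imu": [0.0, 0.0, 9.81]})
        tick += 1
        try:
            received_command.update(pull.recv_json(flags=zmq.NOBLOCK))
            break  # a command arrived; the toy loop is done
        except zmq.Again:
            time.sleep(0.001)  # no command yet, keep publishing

    pub.close()
    pull.close()

server = threading.Thread(target=hal_server)
server.start()

# Inference-client side: subscribe to observations, push a command back.
sub = ctx.socket(zmq.SUB)
sub.connect(OBS_EP)
sub.setsockopt_string(zmq.SUBSCRIBE, "")
push = ctx.socket(zmq.PUSH)
push.connect(CMD_EP)

obs = sub.recv_json()                          # block for one observation
push.send_json({"joint_targets": [0.1] * 12})  # dummy 12-joint command
server.join()
sub.close()
push.close()

print(sorted(obs))   # ['imu', 'tick']
```

Note that the toy server publishes repeatedly until a command comes back, so the subscriber's late join (ZMQ's "slow joiner" behavior) cannot make the example hang.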
## Architecture

```
┌──────────────────────────────────────────┐
│              Jetson Process              │
│                                          │
│  ┌──────────────┐    inproc (ZMQ)        │
│  │  HAL Server  │◄──────────────────┐    │
│  │ (main thread)│                   │    │
│  │ - ZED camera │                   │    │
│  │ - Sensors    │                   │    │
│  │ - Actuators  │                   │    │
│  └──────┬───────┘                   │    │
│         │                           │    │
│         │ publishes observations    │    │
│         │ receives commands         │    │
│         │                           │    │
│  ┌──────┴───────────────────────────┴─┐  │
│  │      Parkour Inference Client      │  │
│  │      (separate thread)             │  │
│  │      - Polls observations          │  │
│  │      - Runs policy inference       │  │
│  │      - Sends joint commands        │  │
│  └────────────────────────────────────┘  │
└──────────────────────────────────────────┘
```
## Installation

### From source (development)

```bash
cd hal/server/jetson
pip install -e .
```

### With optional dependencies

```bash
pip install -e ".[dev]"
```
## Usage

### Command line

After installation, use the `krabby-hal-server-jetson` command:

```bash
krabby-hal-server-jetson \
    --checkpoint /path/to/model.pt \
    --control_rate 100.0 \
    --device cuda
```

### Python module

```bash
python -m hal.server.jetson.main --checkpoint /path/to/model.pt
```
### Arguments

Required:

- `--checkpoint`: Path to model checkpoint file

Optional:

- `--control_rate`: Control loop rate in Hz (default: `100.0`)
- `--device`: Device for inference, `cuda` or `cpu` (default: `cuda`)
- `--observation_bind`: Observation endpoint (default: `inproc://hal_observation`)
- `--command_bind`: Command endpoint (default: `inproc://hal_commands`)
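A parser accepting exactly the flags documented above can be sketched with `argparse` (the real `main.py` may structure this differently):

```python
import argparse

# Sketch of an argument parser matching the documented CLI; flag names
# and defaults are taken from this README, everything else is assumed.
parser = argparse.ArgumentParser(prog="krabby-hal-server-jetson")
parser.add_argument("--checkpoint", required=True,
                    help="Path to model checkpoint file")
parser.add_argument("--control_rate", type=float, default=100.0,
                    help="Control loop rate in Hz")
parser.add_argument("--device", choices=["cuda", "cpu"], default="cuda",
                    help="Device for inference")
parser.add_argument("--observation_bind", default="inproc://hal_observation",
                    help="Observation endpoint")
parser.add_argument("--command_bind", default="inproc://hal_commands",
                    help="Command endpoint")

# Only the required flag given: everything else falls back to defaults.
args = parser.parse_args(["--checkpoint", "/path/to/model.pt"])
print(args.control_rate, args.device)   # 100.0 cuda
```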
## Components

### HAL Server (`hal.server.jetson.JetsonHalServer`)
- Integrates with ZED camera for depth perception
- Interfaces with real sensors (IMU, encoders)
- Applies commands to actuators (motors)
- Publishes observations via ZMQ PUB socket
- Receives joint commands via ZMQ PULL socket
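On the server side, a control loop usually wants only the newest command each tick. One common pattern for this (a sketch, not the package's actual code; the endpoint name is made up) drains the PULL socket without blocking:

```python
import zmq

ctx = zmq.Context.instance()
pull = ctx.socket(zmq.PULL)
pull.bind("inproc://cmd_drain_demo")   # illustrative endpoint
push = ctx.socket(zmq.PUSH)
push.connect("inproc://cmd_drain_demo")

# Simulate three commands queuing up between actuator ticks.
for i in range(3):
    push.send_json({"seq": i})

latest = pull.recv_json()        # block for at least one command
while True:                      # then drain, keeping only the newest
    try:
        latest = pull.recv_json(flags=zmq.NOBLOCK)
    except zmq.Again:
        break                    # queue empty: `latest` is freshest

print(latest)   # {'seq': 2}
```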
### Parkour Inference Client (`compute.parkour.inference_client.ParkourInferenceClient`)
- Runs in separate thread
- Polls observations from HAL server
- Runs parkour policy inference
- Sends joint commands back to HAL server
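On the client side, polling with a timeout keeps the inference thread responsive instead of blocking indefinitely. This sketch (endpoint and payload are illustrative) also shows ZMQ's "slow joiner" effect: PUB drops messages sent before the subscription is established, so the publisher retries until the poller reports data:

```python
import time
import zmq

ctx = zmq.Context.instance()
pub = ctx.socket(zmq.PUB)
pub.bind("inproc://obs_poll_demo")     # illustrative endpoint
sub = ctx.socket(zmq.SUB)
sub.connect("inproc://obs_poll_demo")
sub.setsockopt_string(zmq.SUBSCRIBE, "")

poller = zmq.Poller()
poller.register(sub, zmq.POLLIN)

# Publish until the subscription has taken effect and data is readable.
latest = None
for _ in range(1000):
    pub.send_json({"obs": [1.0, 2.0]})
    if dict(poller.poll(timeout=10)).get(sub):   # timeout in ms
        latest = sub.recv_json()
        break
    time.sleep(0.001)

print(latest)
```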
## Hardware Requirements
- NVIDIA Jetson (Orin, AGX Xavier, or compatible)
- ZED Camera (requires ZED SDK and pyzed)
- Robot Hardware (motors, IMU, encoders)
## Development

### Project Structure

```
hal/server/jetson/
├── pyproject.toml     # Package configuration
├── README.md          # This file
├── __init__.py        # Package init
├── main.py            # Entry point with integrated inference
└── hal_server.py      # JetsonHalServer implementation
```
### Running Tests

```bash
pytest tests/integration/test_jetson_hal.py
```
**Note:** Most tests require Jetson hardware or the ZED SDK and are skipped on x86 environments.
## Notes
- This package uses inproc ZMQ by default for same-process communication (zero-copy, high performance)
- For distributed deployment, use TCP endpoints instead
- The parkour inference client runs on a separate thread to avoid blocking the sensor loop
- Camera, sensors, and actuators are initialized during startup
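For the distributed case, only the endpoint strings change; the socket code is identical. A loopback sketch (the port is chosen at random here purely for the demo; a real deployment would pass fixed `tcp://` endpoints via `--observation_bind` / `--command_bind`):

```python
import zmq

ctx = zmq.Context.instance()
pub = ctx.socket(zmq.PUB)
# bind_to_random_port picks a free port and returns it.
port = pub.bind_to_random_port("tcp://127.0.0.1")
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "")

poller = zmq.Poller()
poller.register(sub, zmq.POLLIN)

# TCP connects and subscriptions complete asynchronously, so publish
# until the subscriber actually sees a message.
msg = None
for _ in range(1000):
    pub.send_json({"obs": "over tcp"})
    if dict(poller.poll(timeout=10)).get(sub):
        msg = sub.recv_json()
        break

print(msg)
```

Unlike inproc, TCP endpoints involve a copy through the kernel, which is the performance trade-off the default inproc setup avoids.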