Strands Robots

Robot Control for Strands Agents


Strands Docs ◆ NVIDIA GR00T ◆ LeRobot ◆ Jetson Containers

Control robots with natural language through Strands Agents. Integrates NVIDIA Isaac GR00T for vision-language-action policies and LeRobot for universal robot support.

How It Works

graph LR
    A[Natural Language<br/>'Pick up the red block'] --> B[Strands Agent]
    B --> C[Robot Tool]
    C --> D[Policy Provider<br/>GR00T/Mock]
    C --> E[LeRobot<br/>Hardware Abstraction]
    D --> F[Action Chunk<br/>16 timesteps]
    F --> E
    E --> G[Robot Hardware<br/>SO-101/GR-1/G1]

    classDef input fill:#2ea44f,stroke:#1b7735,color:#fff
    classDef agent fill:#0969da,stroke:#044289,color:#fff
    classDef policy fill:#8250df,stroke:#5a32a3,color:#fff
    classDef hardware fill:#bf8700,stroke:#875e00,color:#fff

    class A input
    class B,C agent
    class D,F policy
    class E,G hardware

Architecture

flowchart TB
    subgraph Agent["🤖 Strands Agent"]
        NL[Natural Language Input]
        Tools[Tool Registry]
    end

    subgraph RobotTool["🦾 Robot Tool"]
        direction TB
        RT[Robot Class]
        TM[Task Manager]
        AS[Async Executor]
    end

    subgraph Policy["🧠 Policy Layer"]
        direction TB
        PA[Policy Abstraction]
        GP[GR00T Policy]
        MP[Mock Policy]
        CP[Custom Policy]
    end

    subgraph Inference["⚡ Inference Service"]
        direction TB
        DC[Docker Container]
        ZMQ[ZMQ Server :5555]
        TRT[TensorRT Engine]
    end

    subgraph Hardware["🔧 Hardware Layer"]
        direction TB
        LR[LeRobot]
        CAM[Cameras]
        SERVO[Feetech Servos]
    end

    NL --> Tools
    Tools --> RT
    RT --> TM
    TM --> AS
    AS --> PA
    PA --> GP
    PA --> MP
    PA --> CP
    GP --> ZMQ
    ZMQ --> TRT
    TRT --> DC
    AS --> LR
    LR --> CAM
    LR --> SERVO

    classDef agentStyle fill:#0969da,stroke:#044289,color:#fff
    classDef robotStyle fill:#2ea44f,stroke:#1b7735,color:#fff
    classDef policyStyle fill:#8250df,stroke:#5a32a3,color:#fff
    classDef infraStyle fill:#bf8700,stroke:#875e00,color:#fff
    classDef hwStyle fill:#d73a49,stroke:#a72b3a,color:#fff

    class NL,Tools agentStyle
    class RT,TM,AS robotStyle
    class PA,GP,MP,CP policyStyle
    class DC,ZMQ,TRT infraStyle
    class LR,CAM,SERVO hwStyle

Quick Start

from strands import Agent
from strands_robots import Robot, gr00t_inference

# Create robot with cameras
robot = Robot(
    tool_name="my_arm",
    robot="so101_follower",
    cameras={
        "front": {"type": "opencv", "index_or_path": "/dev/video0", "fps": 30},
        "wrist": {"type": "opencv", "index_or_path": "/dev/video2", "fps": 30}
    },
    port="/dev/ttyACM0",
    data_config="so100_dualcam"
)

# Create agent with robot tool
agent = Agent(tools=[robot, gr00t_inference])

# Start GR00T inference service
agent.tool.gr00t_inference(
    action="start",
    checkpoint_path="/data/checkpoints/model",
    port=8000,
    data_config="so100_dualcam"
)

# Control robot with natural language
agent("Use my_arm to pick up the red block using GR00T policy on port 8000")

Installation

pip install strands-robots

From source:

git clone https://github.com/strands-labs/robots
cd robots
pip install -e .

🐳 Jetson Container Setup (Required for GR00T Inference)

GR00T inference requires the Isaac-GR00T Docker container on Jetson platforms:

# Clone jetson-containers
git clone https://github.com/dusty-nv/jetson-containers
cd jetson-containers

# Run Isaac GR00T container (background)
jetson-containers run $(autotag isaac-gr00t) &

# Container exposes inference service on port 5555 (ZMQ) or 8000 (HTTP)

Tested Hardware:

  • NVIDIA Thor Dev Kit (Jetpack 7.0)
  • NVIDIA Jetson AGX Orin (Jetpack 6.x)

See Jetson Deployment Guide for TensorRT optimization.

Robot Control Flow

sequenceDiagram
    participant User
    participant Agent as Strands Agent
    participant Robot as Robot Tool
    participant Policy as GR00T Policy
    participant HW as Hardware

    User->>Agent: "Pick up the red block"
    Agent->>Robot: execute(instruction, policy_port)
    
    loop Control Loop @ 50Hz
        Robot->>HW: get_observation()
        HW-->>Robot: {cameras, joint_states}
        Robot->>Policy: get_actions(obs, instruction)
        Policy-->>Robot: action_chunk[16]
        
        loop Action Horizon
            Robot->>HW: send_action(action)
            Note over Robot,HW: 20ms sleep (50Hz)
        end
    end
    
    Robot-->>Agent: Task completed
    Agent-->>User: "✅ Picked up red block"
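The loop in the diagram can be sketched in plain Python. This is an illustrative sketch only: the `get_observation`, `send_action`, and `get_actions` names mirror the sequence diagram, not the library's actual API, and executing only the first `action_horizon` steps of each 16-step chunk is an assumption based on the constructor defaults shown later.

```python
import time

CONTROL_HZ = 50     # control loop frequency from the diagram
ACTION_HORIZON = 8  # actions executed per inference (constructor default)

def control_loop(robot, policy, instruction, duration_s=10.0):
    """Run the observe -> infer -> act loop at a fixed rate.

    `robot` and `policy` are hypothetical objects mirroring the
    sequence diagram; the real classes live in strands_robots.
    """
    period = 1.0 / CONTROL_HZ
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        obs = robot.get_observation()                 # cameras + joint states
        chunk = policy.get_actions(obs, instruction)  # e.g. 16 timesteps
        for action in chunk[:ACTION_HORIZON]:         # execute part of the chunk
            robot.send_action(action)
            time.sleep(period)                        # 20 ms sleep at 50 Hz
```

Executing only part of the returned chunk before re-planning keeps the policy responsive to new observations while amortizing inference cost across several control steps.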

Tools Reference

Robot Tool

The Robot class is a Strands AgentTool that provides async robot control with real-time status reporting.

| Action  | Parameters                         | Description                       | Example            |
|---------|------------------------------------|-----------------------------------|--------------------|
| execute | instruction, policy_port, duration | Blocking execution until complete | "Pick up the cube" |
| start   | instruction, policy_port, duration | Non-blocking async start          | "Wave your arm"    |
| status  | -                                  | Get current task status           | Check progress     |
| stop    | -                                  | Interrupt running task            | Emergency stop     |

Natural Language Examples:

# Blocking execution (waits for completion)
agent("Use my_arm to pick up the red block using GR00T policy on port 8000")

# Async execution (returns immediately)
agent("Start my_arm waving using GR00T on port 8000, then check status")

# Stop running task
agent("Stop my_arm immediately")

Robot Constructor Parameters

| Parameter         | Type               | Default  | Description              |
|-------------------|--------------------|----------|--------------------------|
| tool_name         | str                | required | Name for this robot tool |
| robot             | str \| RobotConfig | required | Robot type or config     |
| cameras           | Dict               | None     | Camera configuration     |
| port              | str                | None     | Serial port for robot    |
| data_config       | str                | None     | GR00T data config name   |
| control_frequency | float              | 50.0     | Control loop Hz          |
| action_horizon    | int                | 8        | Actions per inference    |
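The last two parameters interact: the policy is queried once per `action_horizon` control steps, so with the defaults the robot steps at 50 Hz but re-plans only every 0.16 s. A quick sanity check of that arithmetic:

```python
# Defaults from the constructor parameters above.
control_frequency = 50.0  # Hz: actions sent per second
action_horizon = 8        # actions executed per policy inference

step_period_ms = 1000.0 / control_frequency             # time between actions
replan_interval_s = action_horizon / control_frequency  # time between policy calls

# step_period_ms == 20.0   (the 20 ms sleep in the control flow diagram)
# replan_interval_s == 0.16, i.e. roughly 6 policy inferences per second
```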

GR00T Inference Tool

Manages GR00T policy inference services running in Docker containers.

| Action          | Parameters                         | Description               | Example                      |
|-----------------|------------------------------------|---------------------------|------------------------------|
| start           | checkpoint_path, port, data_config | Start inference service   | "Start GR00T on port 8000"   |
| stop            | port                               | Stop service on port      | "Stop GR00T on port 8000"    |
| status          | port                               | Check service status      | "Is GR00T running?"          |
| list            | -                                  | List all running services | "List inference services"    |
| find_containers | -                                  | Find GR00T containers     | "Find available containers"  |

TensorRT Acceleration:

agent.tool.gr00t_inference(
    action="start",
    checkpoint_path="/data/checkpoints/model",
    port=8000,
    use_tensorrt=True,
    trt_engine_path="gr00t_engine",
    vit_dtype="fp8",    # ViT: fp16 or fp8
    llm_dtype="nvfp4",  # LLM: fp16, nvfp4, or fp8
    dit_dtype="fp8"     # DiT: fp16 or fp8
)

Camera Tool

LeRobot-based camera management with OpenCV and RealSense support.

| Action        | Parameters                  | Description          | Example                    |
|---------------|-----------------------------|----------------------|----------------------------|
| discover      | -                           | Find all cameras     | "Discover cameras"         |
| capture       | camera_id, save_path        | Single image capture | "Capture from /dev/video0" |
| capture_batch | camera_ids, async_mode      | Multi-camera capture | "Capture from all cameras" |
| record        | camera_id, capture_duration | Record video         | "Record 10s video"         |
| preview       | camera_id, preview_duration | Live preview         | "Preview camera 0"         |
| test          | camera_id                   | Performance test     | "Test camera speed"        |

Serial Tool

Low-level serial communication for Feetech servos and custom protocols.

| Action           | Parameters               | Description           | Example                  |
|------------------|--------------------------|-----------------------|--------------------------|
| list_ports       | -                        | Discover serial ports | "List serial ports"      |
| feetech_position | port, motor_id, position | Move servo            | "Move motor 1 to center" |
| feetech_ping     | port, motor_id           | Ping servo            | "Ping motor 1"           |
| send             | port, data/hex_data      | Send raw data         | "Send FF FF to robot"    |
| monitor          | port                     | Monitor serial data   | "Monitor /dev/ttyACM0"   |
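The feetech_position and send actions ultimately put framed packets on the serial bus. Feetech SCS/STS servos use a Dynamixel-1.0-style frame (0xFF 0xFF header, length byte, inverted-sum checksum). The sketch below builds such a write packet by hand; it is not this tool's implementation, and the goal-position register 0x2A and little-endian byte order are illustrative assumptions.

```python
def feetech_write_packet(servo_id: int, address: int, data: bytes) -> bytes:
    """Frame: 0xFF 0xFF | id | length | instr | addr | data... | checksum.

    The length byte counts the instruction byte, parameters, and
    checksum; the checksum is the inverted low byte of the sum of
    every byte after the two-byte header (Dynamixel-1.0-style
    convention, which Feetech servos also follow).
    """
    WRITE = 0x03
    params = bytes([address]) + data
    length = len(params) + 2  # + instruction byte + checksum byte
    body = bytes([servo_id, length, WRITE]) + params
    checksum = (~sum(body)) & 0xFF
    return b"\xff\xff" + body + bytes([checksum])

# Move servo 1 to mid-range (2048); register address and byte order
# here are assumptions for illustration.
pkt = feetech_write_packet(1, 0x2A, (2048).to_bytes(2, "little"))
# pkt == ff ff 01 05 03 2a 00 08 c4
```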

Teleoperation Tool

Record demonstrations for imitation learning with LeRobot.

| Action | Parameters                      | Description          | Example                |
|--------|---------------------------------|----------------------|------------------------|
| start  | robot_type, teleop_type         | Start teleoperation  | "Start teleoperation"  |
| stop   | session_name                    | Stop session         | "Stop recording"       |
| list   | -                               | List active sessions | "List teleop sessions" |
| replay | dataset_repo_id, replay_episode | Replay episode       | "Replay episode 5"     |

Pose Tool

Store, retrieve, and execute named robot poses.

| Action           | Parameters           | Description           | Example               |
|------------------|----------------------|-----------------------|-----------------------|
| store_pose       | pose_name            | Save current position | "Save as 'home'"      |
| load_pose        | pose_name            | Move to saved pose    | "Go to home pose"     |
| list_poses       | -                    | List all poses        | "List saved poses"    |
| move_motor       | motor_name, position | Move single motor     | "Move gripper to 50%" |
| incremental_move | motor_name, delta    | Small movement        | "Move elbow +5°"      |
| reset_to_home    | -                    | Safe home position    | "Reset to home"       |

Supported Robots

| Robot          | Config                           | Cameras | Description              |
|----------------|----------------------------------|---------|--------------------------|
| SO-100/SO-101  | so100, so100_dualcam, so100_4cam | 1-4     | Single arm desktop robot |
| Fourier GR-1   | fourier_gr1_arms_only            | 1       | Bimanual humanoid arms   |
| Bimanual Panda | bimanual_panda_gripper           | 3       | Dual Franka Emika arms   |
| Unitree G1     | unitree_g1                       | 1       | Humanoid robot platform  |

GR00T Data Configurations

| Config | Video Keys | State Keys | Description |
|--------|------------|------------|-------------|
| so100 | video.webcam | state.single_arm, state.gripper | Single camera |
| so100_dualcam | video.front, video.wrist | state.single_arm, state.gripper | Front + wrist |
| so100_4cam | video.front, video.wrist, video.top, video.side | state.single_arm, state.gripper | Quad camera |
| fourier_gr1_arms_only | video.ego_view | state.left_arm, state.right_arm, state.left_hand, state.right_hand | Humanoid arms |
| bimanual_panda_gripper | video.right_wrist_view, video.left_wrist_view, video.front_view | EEF pos/quat + gripper | Dual arm EEF |
| unitree_g1 | video.rs_view | state.left_arm, state.right_arm, state.left_hand, state.right_hand | G1 humanoid |
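The video and state keys above are exactly the keys a policy expects in its observation dict. A hypothetical observation for so100_dualcam might look like this; the key names come from the table, while the image resolution and state shapes are assumptions for the sketch:

```python
import numpy as np

# Illustrative so100_dualcam observation: key names follow the data
# config table, shapes and dtypes are assumed for the example.
observation = {
    "video.front": np.zeros((480, 640, 3), dtype=np.uint8),  # front camera frame
    "video.wrist": np.zeros((480, 640, 3), dtype=np.uint8),  # wrist camera frame
    "state.single_arm": np.zeros(5, dtype=np.float32),       # arm joint positions
    "state.gripper": np.zeros(1, dtype=np.float32),          # gripper opening
}
```

One such observation goes into each inference call, and an action chunk comes back.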

Policy Providers

classDiagram
    class Policy {
        <<abstract>>
        +get_actions(observation, instruction)
        +set_robot_state_keys(keys)
        +provider_name
    }

    class Gr00tPolicy {
        +data_config
        +policy_client: ZMQ
        +get_actions()
    }

    class MockPolicy {
        +get_actions()
        Returns random actions
    }

    class CustomPolicy {
        +get_actions()
        Your implementation
    }

    Policy <|-- Gr00tPolicy
    Policy <|-- MockPolicy
    Policy <|-- CustomPolicy

from strands_robots import create_policy

# GR00T policy (requires inference server)
policy = create_policy(
    provider="groot",
    data_config="so100_dualcam",
    host="localhost",
    port=8000
)

# Mock policy (for testing)
policy = create_policy(provider="mock")
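A custom provider only needs to satisfy the interface in the class diagram. Below is a minimal sketch of a CustomPolicy-style provider, assuming the three members shown there; the real abstract base class may differ in detail:

```python
import numpy as np

class RandomJointPolicy:
    """Toy provider in the spirit of MockPolicy: returns a chunk of
    small random joint deltas instead of calling an inference server.
    The interface (provider_name / set_robot_state_keys / get_actions)
    follows the class diagram above; the actual Policy ABC may differ.
    """

    provider_name = "random_joints"

    def __init__(self, num_joints: int = 6, chunk_len: int = 16):
        self.num_joints = num_joints
        self.chunk_len = chunk_len
        self.state_keys = []

    def set_robot_state_keys(self, keys):
        self.state_keys = list(keys)

    def get_actions(self, observation, instruction):
        # One action chunk: chunk_len timesteps of num_joints deltas.
        return np.random.uniform(-0.05, 0.05,
                                 size=(self.chunk_len, self.num_joints))
```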

Project Structure

strands-robots/
├── strands_robots/
│   ├── __init__.py              # Package exports
│   ├── robot.py                 # Universal Robot class (AgentTool)
│   ├── policies/
│   │   ├── __init__.py          # Policy ABC + factory
│   │   └── groot/
│   │       ├── __init__.py      # Gr00tPolicy implementation
│   │       ├── client.py        # ZMQ inference client
│   │       └── data_config.py   # Robot embodiment configurations
│   └── tools/
│       ├── gr00t_inference.py   # Docker service manager
│       ├── lerobot_camera.py    # Camera operations
│       ├── lerobot_calibrate.py # Calibration management
│       ├── lerobot_teleoperate.py # Recording/replay
│       ├── pose_tool.py         # Pose management
│       └── serial_tool.py       # Serial communication
├── test.py                      # Integration example
└── pyproject.toml               # Package configuration

Example: Complete Workflow

#!/usr/bin/env python3
from strands import Agent
from strands_robots import Robot, gr00t_inference, lerobot_camera, pose_tool

# 1. Create robot with dual cameras
robot = Robot(
    tool_name="orange_arm",
    robot="so101_follower",
    cameras={
        "wrist": {"type": "opencv", "index_or_path": "/dev/video0", "fps": 15},
        "front": {"type": "opencv", "index_or_path": "/dev/video2", "fps": 15},
    },
    port="/dev/ttyACM0",
    data_config="so100_dualcam",
)

# 2. Create agent with all robot tools
agent = Agent(
    tools=[robot, gr00t_inference, lerobot_camera, pose_tool]
)

# 3. Start inference service
agent.tool.gr00t_inference(
    action="start",
    checkpoint_path="/data/checkpoints/gr00t-wave/checkpoint-300000",
    port=8000,
    data_config="so100_dualcam",
)

# 4. Interactive control loop
while True:
    user_input = input("\n🤖 > ")
    if user_input.lower() in ["exit", "quit"]:
        break
    agent(user_input)

# 5. Cleanup
agent.tool.gr00t_inference(action="stop", port=8000)

Contributing

We welcome contributions! Please open issues and pull requests on GitHub.

License

Apache-2.0 - see LICENSE file.

Links

GitHub ◆ PyPI ◆ NVIDIA GR00T ◆ LeRobot ◆ Strands Docs
