Let AI control real robots — Python robotics framework for servo control, motion teaching, and AI agent integration
Project description
🦾 RobotClaw
Let AI Control Real Robots — From Code to Physical World in One Line
🧠 One `pip install` away from giving your AI agent a real body.
RobotClaw is a Python robotics framework that bridges AI agents with physical robots. It turns high-level AI commands into real-world servo movements — enabling your AI to walk, wave, dance, and interact with the physical world.
Built for the OpenClaw AI agent platform. Compatible with LOBOT LX series bus servos. Designed for makers, researchers, and anyone who wants to bring robots to life.
pip install robotclaw
🎬 What Can You Do?
```python
# Your AI agent says: "Make the robot wave"
# RobotClaw makes it happen in the real world:
from robotclaw import Robot, DEFAULT_CONFIG

with Robot(config=DEFAULT_CONFIG, port="COM3") as robot:
    robot.go_home(time_ms=2000)        # Stand up
    robot.set_joint("left_hip_pitch",  # Move a joint
                    position=350,
                    time_ms=500)
```
👤 User: "Return the robot to its home position"
🧠 AI: → robot.go_home(2000) ✓
👤 User: "Play the walking motion at 2x speed"
🧠 AI: → player.play(walk_clip, speed=2.0) ✓
👤 User: "Check the voltage of all servos"
🧠 AI: → robot.scan() → reports voltage status ✓
No robotics PhD required. If you can write Python, you can control a robot.
✨ Why RobotClaw?
| Feature | Description |
|---|---|
| 🧠 AI-Native | First-class integration with OpenClaw — AI agents control robots via natural language |
| 🔌 Plug & Play | Connect USB, `pip install`, and you're controlling servos in 3 lines of code |
| 🎬 Motion Teaching | Record human demonstrations, save as JSON, replay at any speed |
| 🤖 High-Level API | Think in joints (`left_hip_pitch`), not raw servo IDs |
| 🛠️ CLI Tools | `robotclaw-scan` for diagnostics, `robotclaw-teach` for interactive teaching |
| ⚡ Thread-Safe | Mutex-protected serial communication, safe for multithreaded use |
| 🐍 Pure Python | Zero compiled dependencies — runs anywhere Python runs |
🚀 Quick Start
1. Install
pip install robotclaw
2. Connect & Control
```python
from robotclaw import ServoBus

bus = ServoBus()
bus.connect("COM3", baudrate=115200)

# Move servo to position 500 in 1 second
bus.move(servo_id=1, position=500, time_ms=1000)

# Read current position
pos = bus.read_position(servo_id=1)
print(f"Servo position: {pos}")

bus.disconnect()
```
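Note that `position` is in raw servo ticks, not degrees. On LX-16A-class servos the 0–1000 tick range typically spans 240°, so a small converter keeps scripts readable. The helper below is a sketch under that assumption; it is not part of the robotclaw API:

```python
def degrees_to_position(deg: float) -> int:
    """Convert an angle in degrees (0-240) to an LX-16A tick value (0-1000)."""
    if not 0.0 <= deg <= 240.0:
        raise ValueError("LX-16A range is 0-240 degrees")
    return round(deg * 1000 / 240)

def position_to_degrees(pos: int) -> float:
    """Convert an LX-16A tick value (0-1000) back to degrees."""
    return pos * 240 / 1000

# Usage with the ServoBus API shown above:
# bus.move(servo_id=1, position=degrees_to_position(120), time_ms=1000)
```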
3. Build a Walking Robot
```python
from robotclaw import Robot, DEFAULT_CONFIG

with Robot(config=DEFAULT_CONFIG, port="COM3") as robot:
    # All 10 joints to home position
    robot.go_home(time_ms=2000)

    # Control individual joints by name
    robot.set_joint("left_hip_pitch", position=350, time_ms=500)
    robot.set_joint("right_knee", position=600, time_ms=500)

    # Read all joint positions at once
    positions = robot.get_positions()
    print(positions)
```
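Because `set_joint` accepts raw tick targets, it is easy to command a pose the linkage cannot physically reach. A thin guard layer that clamps targets to per-joint soft limits is a common safeguard. The limits table below is purely illustrative; robotclaw does not ship these values:

```python
# Illustrative soft limits per joint, in ticks (full servo range is 0-1000).
JOINT_LIMITS = {
    "left_hip_pitch": (200, 800),
    "right_knee": (150, 850),
}

def clamp_target(joint: str, position: int) -> int:
    """Clamp a commanded position into the joint's soft-limit window."""
    lo, hi = JOINT_LIMITS.get(joint, (0, 1000))
    return max(lo, min(hi, position))

def safe_set_joint(robot, joint: str, position: int, time_ms: int) -> None:
    """Send a joint command only after clamping it to the soft limits."""
    robot.set_joint(joint, position=clamp_target(joint, position), time_ms=time_ms)
```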
4. Teach Your Robot New Moves
No programming needed — just physically move the robot and record:
```python
from robotclaw import Robot, DEFAULT_CONFIG
from robotclaw.recorder import MotionRecorder, MotionPlayer

with Robot(config=DEFAULT_CONFIG, port="COM3") as robot:
    # Step 1: Record a motion by posing the robot
    recorder = MotionRecorder(robot)
    recorder.start_recording("wave_hand")
    robot.unload_all()  # Release servos so you can pose by hand

    input("Pose the robot - Press Enter to capture...")
    recorder.capture_frame()
    input("Next pose - Press Enter...")
    recorder.capture_frame()

    clip = recorder.finish_recording()
    recorder.save(clip, "motions/wave_hand.json")

    # Step 2: Replay at any speed
    robot.load_all()
    player = MotionPlayer(robot)
    player.play(clip, speed=1.5)  # 1.5x speed playback
```
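Speed scaling during replay is plain arithmetic: each keyframe's transition time is divided by the speed factor. The standalone sketch below shows that math with a minimal `Keyframe` model of our own; the real MotionPlayer internals may differ:

```python
from dataclasses import dataclass

@dataclass
class Keyframe:
    positions: dict[str, int]  # joint name -> target tick
    time_ms: int               # transition time into this frame

def scaled_durations(frames: list[Keyframe], speed: float) -> list[int]:
    """Durations to command at playback; speed=2.0 halves every transition."""
    if speed <= 0:
        raise ValueError("speed must be positive")
    return [max(1, round(f.time_ms / speed)) for f in frames]

clip = [Keyframe({"left_hip_pitch": 350}, 500),
        Keyframe({"left_hip_pitch": 500}, 1000)]
print(scaled_durations(clip, 2.0))  # [250, 500]
```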
5. CLI Tools
```shell
# Discover all connected servos
robotclaw-scan --port COM3

# Interactive motion teaching terminal
robotclaw-teach --port COM3
```
🧠 AI Integration — The Killer Feature
RobotClaw is designed from the ground up for AI-driven robotics. It ships with a built-in OpenClaw skill that lets AI agents understand and control the robot through natural language:
```python
from robotclaw.openclaw_skill import OpenClawSkill

# Register with your AI agent
skill = OpenClawSkill(port="COM3")

# Now your AI can:
# - Move joints by name
# - Play recorded motions
# - Read sensor data (position, voltage, temperature)
# - Perform diagnostic scans
# - Chain complex movement sequences
```
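Conceptually, a skill is a mapping from parsed intents to API calls. The sketch below shows that idea with a hypothetical intent table and a stub robot so it runs without hardware; OpenClawSkill's actual interface may differ:

```python
class StubRobot:
    """Stand-in for robotclaw.Robot so the mapping runs without hardware."""
    def __init__(self):
        self.log = []
    def go_home(self, time_ms):
        self.log.append(("go_home", time_ms))
    def set_joint(self, joint, position, time_ms):
        self.log.append(("set_joint", joint, position, time_ms))

def handle_intent(robot, intent: str, **kwargs):
    """Route a parsed natural-language intent to the matching robot call."""
    table = {
        "home": lambda: robot.go_home(kwargs.get("time_ms", 2000)),
        "move_joint": lambda: robot.set_joint(
            kwargs["joint"],
            position=kwargs["position"],
            time_ms=kwargs.get("time_ms", 500),
        ),
    }
    if intent not in table:
        raise ValueError(f"unknown intent: {intent}")
    table[intent]()

robot = StubRobot()
handle_intent(robot, "home")
handle_intent(robot, "move_joint", joint="left_hip_pitch", position=350)
print(robot.log)
```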
Use Cases:
- 🎓 Education — Students learn robotics through conversation with AI
- 🏭 Prototyping — Rapidly test robot behaviors via natural language
- 🎮 Entertainment — AI-controlled robot performances and interactions
- 🔬 Research — Quickly iterate on movement patterns without manual coding
🔧 Supported Hardware
| Component | Specification |
|---|---|
| Servos | LOBOT LX-16A / LX-224 / LX-225 bus servos |
| Interface | USB-to-TTL serial adapter |
| Baudrate | 115200 (direct) or 9600 (controller board) |
| Power | 6–8.4 V DC, ≥5 A recommended for 10 servos |
| Default Config | 10 servos, 5 per leg (biped robot) |
💡 Tip: More servo types and robot configurations coming soon. PRs welcome!
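At the wire level, each command to an LX-series bus servo is a small framed packet: header 0x55 0x55, then id, length, command, parameters, and a bitwise-NOT checksum, per the published LewanSoul/LOBOT protocol. The sketch below builds a SERVO_MOVE_TIME_WRITE frame as an illustration; servo_bus.py's exact implementation may differ:

```python
def lx_move_frame(servo_id: int, position: int, time_ms: int) -> bytes:
    """Build a SERVO_MOVE_TIME_WRITE frame: 0x55 0x55 | id | len | cmd | params | chk."""
    params = [position & 0xFF, (position >> 8) & 0xFF,
              time_ms & 0xFF, (time_ms >> 8) & 0xFF]
    length = len(params) + 3          # counts length, command, and checksum bytes
    command = 1                       # SERVO_MOVE_TIME_WRITE
    body = [servo_id, length, command, *params]
    checksum = (~sum(body)) & 0xFF    # bitwise-NOT of the byte sum, low byte
    return bytes([0x55, 0x55, *body, checksum])

frame = lx_move_frame(servo_id=1, position=500, time_ms=1000)
print(frame.hex())  # 5555010701f401e80316
```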
📁 Architecture
```
src/robotclaw/
├── __init__.py         # Public API: ServoBus, Robot, DEFAULT_CONFIG
├── servo_bus.py        # LOBOT LX protocol — the low-level driver
├── robot_config.py     # JointConfig, LegConfig, RobotConfig
├── robot.py            # High-level robot controller
├── cli.py              # CLI: robotclaw-scan, robotclaw-teach
├── openclaw_skill.py   # OpenClaw AI agent skill adapter
├── SKILL.md            # OpenClaw skill descriptor
└── recorder/           # Motion teaching subsystem
    ├── motion_data.py  # Keyframe & MotionClip data models
    ├── recorder.py     # Record motions from physical teaching
    └── player.py       # Play back motions with speed control
```
🗺️ Roadmap
- LOBOT LX bus servo protocol driver
- High-level joint-based robot API
- Motion recording & playback system
- OpenClaw AI agent integration
- CLI diagnostic & teaching tools
- 🔜 Visual motion editor (web UI)
- 🔜 Inverse kinematics engine
- 🔜 Support for more servo protocols (Dynamixel, Feetech)
- 🔜 ROS 2 bridge
- 🔜 Reinforcement learning integration
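As a taste of the roadmap's inverse-kinematics item: for a two-joint planar leg, IK reduces to the law of cosines. A minimal sketch follows, where link lengths, coordinate frames, and the elbow-down branch choice are all illustrative rather than anything robotclaw defines:

```python
import math

def two_link_ik(x: float, y: float, l1: float, l2: float) -> tuple[float, float]:
    """Hip and knee angles (radians) placing a 2-link planar leg's foot at (x, y)."""
    d2 = x * x + y * y  # squared distance from hip to target
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_knee <= 1.0:
        raise ValueError("target out of reach")
    knee = math.acos(cos_knee)  # elbow-down solution
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

hip, knee = two_link_ik(1.0, 1.0, l1=1.0, l2=1.0)
print(round(hip, 6), round(knee, 6))
```

The resulting angles would still need converting to servo ticks (and checking against joint limits) before being sent to the robot.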
🤝 Contributing
We welcome contributions from robotics enthusiasts, AI researchers, and Python developers!
```shell
# Clone and install in development mode
git clone https://github.com/RobotBase/robotclaw.git
cd robotclaw
pip install -e ".[dev]"

# Run tests
python -m pytest tests/ -v
```
See the CHANGELOG for recent updates.
📄 License
MIT License — Use it freely in your projects, commercial or otherwise.
Built with ❤️ by the RobotBase team
Making AI-powered robotics accessible to everyone
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file robotclaw-0.2.1.tar.gz.
File metadata
- Download URL: robotclaw-0.2.1.tar.gz
- Upload date:
- Size: 26.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `88f7414fa02660ff6dde034373d209db536ee71c7d8fe7834bd8c32af71ca30e` |
| MD5 | `94b7cb191ef6316ac801fd1d808cf4fb` |
| BLAKE2b-256 | `77fbf62886e0c0c9329afdedff014c1e67bcb62931291c5a8bf6ad1318592e35` |
File details
Details for the file robotclaw-0.2.1-py3-none-any.whl.
File metadata
- Download URL: robotclaw-0.2.1-py3-none-any.whl
- Upload date:
- Size: 23.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `d153c319fe4db2a9120fdfe8a3c6fd0d7887aee6f5c9f27d002332b2f64e8fdc` |
| MD5 | `ad8585611d0a2b9e950fd20bde0e288a` |
| BLAKE2b-256 | `132f7386990ac11ee8b119c1a4bf371b1871cdfbbf2416ed612b15e896e91e9c` |