
Project description

Rofunc: The Full Process Python Package for Robot Learning from Demonstration


Repository address: https://github.com/Skylark0924/Rofunc

The Rofunc package focuses on robotic Imitation Learning (IL) and Learning from Demonstration (LfD) and provides convenient Python functions for robotics, including demonstration collection, data pre-processing, LfD algorithms, planning, and control methods. We also provide an Isaac Gym-based robot simulator for evaluation. This package aims to advance the field by building a full-process toolkit and validation platform that simplifies and standardizes the pipeline of demonstration data collection, processing, learning, and deployment on robots.

Installation

Install from PyPI

The installation is very easy,

pip install rofunc

and as you'll find later, it's easy to use as well!

import rofunc as rf

Thus, have fun in the robotics world!
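For a quick sanity check that the install worked, here is a minimal sketch; it uses only the standard library plus the import shown above, and the exact list of submodules it prints depends on the installed version.

```python
# Minimal post-install sanity check.
from importlib.metadata import version  # standard library, Python 3.8+

import rofunc as rf  # the alias used throughout this page

print(version("rofunc"))  # installed distribution version, e.g. 0.0.1.2
print([name for name in dir(rf) if not name.startswith("_")])  # public top-level attributes
```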

Note Several requirements need to be installed before using the package. Please refer to the installation guide for more details.

Install from Source (Recommended)

git clone https://github.com/Skylark0924/Rofunc.git
cd Rofunc

# Create a conda environment
conda create -n rofunc python=3.8
conda activate rofunc

# Install the requirements and rofunc
pip install -r requirements.txt
pip install .

Note If you want to use functions related to the ZED camera, you need to install the ZED SDK manually. (We tried to package it as a .whl file and add it to requirements.txt; unfortunately, the ZED SDK does not support this kind of direct installation.)

Star History

[Star history chart of the GitHub repository]

Documentation


Note Currently, we provide a simple document; a comprehensive version with both English and Chinese editions is being built on Read the Docs.

To give you a quick overview of the Rofunc pipeline, we provide a simple but interesting example: learning to play Taichi from human demonstration. You can find it in the Quick start section of the documentation.

The available and planned functions are listed in the table below.

| Classes | Types | Functions | Description | Status |
|---|---|---|---|---|
| Demonstration collection and pre-processing | Xsens | xsens.record | Record human motion via network streaming | |
| | | xsens.process | Decode the .mvnx file | |
| | | xsens.visualize | Show or save a GIF of the motion | |
| | Optitrack | optitrack.record | Record marker motion via network streaming | |
| | | optitrack.process | Process the output .csv data | |
| | | optitrack.visualize | Show or save a GIF of the motion | |
| | ZED | zed.record | Record with multiple (1~n) cameras | |
| | | zed.playback | Play back a recording and save snapshots | |
| | | zed.export | Export a recording to MP4 or image sequences | |
| | Delsys EMG | emg.record | Record real-time EMG data via network streaming | |
| | | emg.process | Filter the EMG data | |
| | | emg.visualize | Visualization functions for EMG data | |
| | Multimodal | mmodal.record | Record multimodal demonstration data simultaneously | |
| | | mmodal.export | Export multimodal demonstration data in one line | |
| Learning from Demonstration | Machine learning | dmp.uni | DMP for a uni-manual robot with one or more demonstrated trajectories | |
| | | gmr.uni | GMR for a uni-manual robot with one or more demonstrated trajectories | |
| | | gmm.uni | GMM for a uni-manual robot with one or more demonstrated trajectories | |
| | | tpgmm.uni | TP-GMM for a uni-manual robot with one or more demonstrated trajectories | |
| | | tpgmm.bi | TP-GMM for a bimanual robot with coordination learned from demonstration | |
| | | tpgmr.uni | TP-GMR for a uni-manual robot with one or more demonstrated trajectories | |
| | | tpgmr.bi | TP-GMR for a bimanual robot with coordination learned from demonstration | |
| | Deep learning | bco | Behavior cloning from observation | |
| | | strans | Structured-Transformer method proposed in IEEE RA-L | |
| | Reinforcement learning | SKRL (ppo, sac, td3) | API for the SKRL framework with robot examples in Isaac Gym | |
| | | StableBaseline3 (ppo, sac, td3) | API for the Stable-Baselines3 framework with robot examples in Isaac Gym | |
| | | RLlib (ppo, sac, td3) | API for the Ray RLlib framework with robot examples in Isaac Gym | |
| | | ElegantRL (ppo, sac, td3) | API for the ElegantRL framework with robot examples in Isaac Gym | |
| | | cql | Conservative Q-Learning for fully offline learning | |
| Planning | LQT | lqt.uni | Linear Quadratic Tracking (LQT) for a uni-manual robot with several via-points | |
| | | lqt.bi | LQT for a bimanual robot with coordination constraints | |
| | | lqt.uni_fb | Generate smooth trajectories with feedback | |
| | | lqt.uni_cp | LQT with control primitives | |
| | iLQR | ilqr.uni | Iterative Linear Quadratic Regulator (iLQR) for a uni-manual robot with several via-points | |
| | | ilqr.bi | iLQR for a bimanual robot with several via-points | |
| | | ilqr.uni_fb | iLQR with feedback | |
| | | ilqr.uni_cp | iLQR with control primitives | |
| | | ilqr.uni_obstacle | iLQR with obstacle avoidance | |
| | | ilqr.uni_dyna | iLQR with dynamics and force control | |
| | MPC | mpc.uni | Model Predictive Control (MPC) | |
| | CIO | cio.whole | Contact-Invariant Optimization (CIO) | |
| Tools | Logger | logger.write | General logger based on TensorBoard | |
| | Config | config.get_config | General config API based on Hydra | |
| | VisuaLab | visualab.trajectory | 2D/3D/with-orientation trajectory visualization | |
| | | visualab.distribution | 2D/3D distribution visualization | |
| | | visualab.ellipsoid | 2D/3D ellipsoid visualization | |
| | RoboLab | robolab.transform | Useful functions for coordinate transformation | |
| | | robolab.fk | Forward kinematics w.r.t. a URDF file | |
| | | robolab.ik | Inverse kinematics w.r.t. a URDF file | |
| | | robolab.fd | Forward dynamics w.r.t. a URDF file | |
| | | robolab.id | Inverse dynamics w.r.t. a URDF file | |
| Simulator | Franka | franka.sim | Execute a given trajectory with a single Franka Panda arm in Isaac Gym | |
| | CURI mini | curi_mini.sim | Execute a given trajectory with dual Franka Panda arms in Isaac Gym | |
| | CURI | curi.sim | Execute a given trajectory with the human-like CURI robot in Isaac Gym | |
| | Walker | walker.sim | Execute a given trajectory with the UBTECH Walker robot in Isaac Gym | |
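To show how the dotted function paths in the table above are reached from the top-level package, here is a hedged sketch of a demonstration-to-planning pipeline. The module names (xsens.process, lqt.uni, visualab.trajectory) come from the table, but the call signatures, argument names, and data shapes below are assumptions for illustration only, so the calls are left as comments; please consult the documentation and example gallery for the real interfaces.

```python
import numpy as np
import rofunc as rf

# Demonstration pre-processing (hypothetical signature):
# decode a recorded Xsens .mvnx file into motion data.
# data = rf.xsens.process("demo_trial.mvnx")

# Planning (hypothetical signature):
# track a handful of via-points with the uni-manual LQT planner.
via_points = np.zeros((5, 7))  # e.g. 5 via-points, 7-D pose each (assumed layout)
# trajectory = rf.lqt.uni(via_points)

# Visualization (hypothetical signature):
# plot the resulting trajectory with VisuaLab.
# rf.visualab.trajectory(trajectory)
```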

Cite

If you use rofunc in a scientific publication, we would appreciate citations to the following work:

@misc{Rofunc2022,
      author = {Liu, Junjia and Li, Zhihao and Li, Chenzui},
      title = {Rofunc: The full process python package for robot learning from demonstration},
      year = {2022},
      publisher = {GitHub},
      journal = {GitHub repository},
      howpublished = {\url{https://github.com/Skylark0924/Rofunc}},
}

The Team

Rofunc is developed and maintained by the CLOVER Lab (Collaborative and Versatile Robot Laboratory), CUHK.

Download files


Source Distribution

rofunc-0.0.1.2.tar.gz (101.7 MB)

Built Distribution

rofunc-0.0.1.2-py3-none-any.whl (102.1 MB)
