Rofunc: The Full Process Python Package for Robot Learning from Demonstration and Robot Manipulation

Project description

Repository address: https://github.com/Skylark0924/Rofunc

The Rofunc package focuses on Imitation Learning (IL), Reinforcement Learning (RL), and Learning from Demonstration (LfD) for (humanoid) robot manipulation. It provides convenient Python functions covering demonstration collection, data pre-processing, LfD algorithms, planning, and control methods, together with an Isaac Gym-based robot simulator for evaluation. The package aims to advance the field by building a full-process toolkit and validation platform that simplifies and standardizes demonstration data collection, processing, learning, and deployment on robots.

Installation

Install from PyPI (stable version)

The installation is very easy,

pip install rofunc

# [Option] Install with baseline RL frameworks (SKRL, RLlib, Stable Baselines3) and Envs (gymnasium[all], mujoco_py)
pip install rofunc[baselines]

and as you'll find later, it's easy to use as well!

import rofunc as rf

Thus, have fun in the robotics world!

Note Several requirements need to be installed before using the package. Please refer to the installation guide for more details.
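Once installed, a quick way to confirm that the package is importable and to check which version was installed is the snippet below; it uses only the Python standard library and makes no assumptions about Rofunc's own API.

import importlib.metadata

import rofunc  # raises ImportError if the installation is broken

# Print the version recorded by pip for the installed package
print("rofunc", importlib.metadata.version("rofunc"))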

Install from Source (nightly version, recommended)

git clone https://github.com/Skylark0924/Rofunc.git
cd Rofunc

# Create a conda environment
# Python 3.8 is strongly recommended
conda create -n rofunc python=3.8

# For Linux users
sh ./scripts/install.sh
# [Option] Install with baseline RL frameworks (SKRL, RLlib, Stable Baselines3)
sh ./scripts/install_w_baselines.sh
# [Option] For macOS users (brew is required; the Isaac Gym-based simulator is not supported on macOS)
sh ./scripts/mac_install.sh

Note If you want to use functions related to the ZED camera, you need to install the ZED SDK manually. (We tried to package it as a .whl file and add it to requirements.txt, but unfortunately the ZED SDK does not support direct installation in that way.)

Documentation

Documentation | Example Gallery

To give you a quick overview of the rofunc pipeline, we provide an interesting example of learning to play Taichi from human demonstrations. You can find it in the Quick start section of the documentation.
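As a rough sketch of what such a pipeline can look like in code, the snippet below strings together steps named after entries in the table that follows (xsens.export, TPGMM, LQT). Every call is a hypothetical placeholder rather than the documented API; please refer to the documentation and example gallery for the real function names and signatures.

import rofunc as rf

# Hypothetical end-to-end sketch -- each call below is a placeholder named after
# an entry in the table below, not the actual Rofunc API.
demos = rf.xsens.export("./data/taichi_demo")   # placeholder: export recorded Xsens demonstrations
model = rf.tpgmm.fit(demos)                     # placeholder: learn a task-parameterized GMM from the demos
traj = rf.tpgmm.reproduce(model)                # placeholder: reproduce a trajectory from the learned model
ctrl = rf.lqt.solve(traj)                       # placeholder: track the trajectory with an LQT controller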

The available functions and the development plan are listed in the table below.

Note ✅: Achieved 🔃: Reformatting ⛔: TODO

Data | Learning | Planning & Control (P&C) | Tools | Simulator
xsens.record ✅ | DMP ⛔ | LQT ✅ | config ✅ | Franka ✅
xsens.export ✅ | GMR ✅ | LQTBi ✅ | logger ✅ | CURI ✅
xsens.visual ✅ | TPGMM ✅ | LQTFb ✅ | datalab ✅ | CURIMini 🔃
opti.record ✅ | TPGMMBi ✅ | LQTCP ✅ | robolab.coord ✅ | CURISoftHand ✅
opti.export ✅ | TPGMM_RPCtl ✅ | LQTCPDMP ✅ | robolab.fk ✅ | Walker ✅
opti.visual ✅ | TPGMM_RPRepr ✅ | LQR ✅ | robolab.ik ✅ | Gluon 🔃
zed.record ✅ | TPGMR ✅ | PoGLQRBi ✅ | robolab.fd ⛔ | Baxter 🔃
zed.export ✅ | TPGMRBi ✅ | iLQR 🔃 | robolab.id ⛔ | Sawyer 🔃
zed.visual ✅ | TPHSMM ✅ | iLQRBi 🔃 | visualab.dist ✅ | Humanoid ✅
emg.record ✅ | RLBaseLine(SKRL) ✅ | iLQRFb 🔃 | visualab.ellip ✅ | Multi-Robot ✅
emg.export ✅ | RLBaseLine(RLlib) ✅ | iLQRCP 🔃 | visualab.traj ✅ |
mmodal.record ⛔ | RLBaseLine(ElegRL) ✅ | iLQRDyna 🔃 | oslab.dir_proc ✅ |
mmodal.sync ✅ | BCO(RofuncIL) 🔃 | iLQRObs 🔃 | oslab.file_proc ✅ |
 | BC-Z(RofuncIL) ⛔ | MPC ⛔ | oslab.internet ✅ |
 | STrans(RofuncIL) ⛔ | RMP ⛔ | oslab.path ✅ |
 | RT-1(RofuncIL) ⛔ | | |
 | A2C(RofuncRL) ✅ | | |
 | PPO(RofuncRL) ✅ | | |
 | SAC(RofuncRL) ✅ | | |
 | TD3(RofuncRL) ✅ | | |
 | CQL(RofuncRL) ⛔ | | |
 | TD3BC(RofuncRL) ⛔ | | |
 | DTrans(RofuncRL) ✅ | | |
 | EDAC(RofuncRL) ⛔ | | |
 | AMP(RofuncRL) ✅ | | |
 | ASE(RofuncRL) ✅ | | |
 | ODTrans(RofuncRL) ⛔ | | |

Star History

Star History Chart

Citation

If you use rofunc in a scientific publication, we would appreciate citations to the following paper:

@software{liu2023rofunc,
  title={Rofunc: The full process python package for robot learning from demonstration and robot manipulation},
  author={Liu, Junjia and Li, Chenzui and Delehelle, Donatien and Li, Zhihao and Chen, Fei},
  month=jun,
  year={2023},
  publisher={Zenodo},
  doi={10.5281/zenodo.8084510},
  url={https://doi.org/10.5281/zenodo.8084510}
}

Related Papers

  1. Robot cooking with stir-fry: Bimanual non-prehensile manipulation of semi-fluid objects (IEEE RA-L 2022 | Code)
@article{liu2022robot,
         title={Robot cooking with stir-fry: Bimanual non-prehensile manipulation of semi-fluid objects},
         author={Liu, Junjia and Chen, Yiting and Dong, Zhipeng and Wang, Shixiong and Calinon, Sylvain and Li, Miao and Chen, Fei},
         journal={IEEE Robotics and Automation Letters},
         volume={7},
         number={2},
         pages={5159--5166},
         year={2022},
         publisher={IEEE}
}
  2. SoftGPT: Learn Goal-oriented Soft Object Manipulation Skills by Generative Pre-trained Heterogeneous Graph Transformer (IROS 2023 | Code coming soon)
@article{liu2023softgpt,
        title={SoftGPT: Learn Goal-oriented Soft Object Manipulation Skills by Generative Pre-trained Heterogeneous Graph Transformer},
        author={Liu, Junjia and Li, Zhihao and Calinon, Sylvain and Chen, Fei},
        journal={arXiv preprint arXiv:2306.12677},
        year={2023}
}
  3. BiRP: Learning Robot Generalized Bimanual Coordination using Relative Parameterization Method on Human Demonstration (IEEE CDC 2023 | Code)
@article{liu2023birp,
        title={BiRP: Learning Robot Generalized Bimanual Coordination using Relative Parameterization Method on Human Demonstration},
        author={Liu, Junjia and Sim, Hengyi and Li, Chenzui and Chen, Fei},
        journal={arXiv preprint arXiv:2307.05933},
        year={2023}
}

The Team

Rofunc is developed and maintained by the CLOVER Lab (Collaborative and Versatile Robots Laboratory), CUHK.

Acknowledgements

We would like to acknowledge the following projects:

Learning from Demonstration

  1. pbdlib
  2. Ray RLlib
  3. ElegantRL
  4. SKRL

Planning and Control

  1. Robotics codes from scratch (RCFS)

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

rofunc-0.0.2.5.tar.gz (100.9 MB)

Uploaded Source

Built Distribution

rofunc-0.0.2.5-py3-none-any.whl (101.6 MB)

Uploaded Python 3

File details

Details for the file rofunc-0.0.2.5.tar.gz.

File metadata

  • Download URL: rofunc-0.0.2.5.tar.gz
  • Size: 100.9 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.17

File hashes

Hashes for rofunc-0.0.2.5.tar.gz
Algorithm Hash digest
SHA256 b8f87b9542fafadc1dd14658a08e4b13e44e2eae1c5a5079bf4da09d0ce12017
MD5 e42bc83e06e058f277203ca236af4908
BLAKE2b-256 7302297d327ff76b9d5e59cb0f44a63666aa38cab270aa9e2f24f11374694fc9

See more details on using hashes here.
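If you download the archive manually, you can check it against the published SHA256 digest before installing. Below is a minimal verification sketch using only the Python standard library (adjust the path to wherever you saved the file):

import hashlib

# SHA256 digest published above for rofunc-0.0.2.5.tar.gz
expected = "b8f87b9542fafadc1dd14658a08e4b13e44e2eae1c5a5079bf4da09d0ce12017"

# Hash the downloaded archive and compare it with the published value
with open("rofunc-0.0.2.5.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "SHA256 mismatch -- do not install this file")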

File details

Details for the file rofunc-0.0.2.5-py3-none-any.whl.

File metadata

  • Download URL: rofunc-0.0.2.5-py3-none-any.whl
  • Size: 101.6 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.17

File hashes

Hashes for rofunc-0.0.2.5-py3-none-any.whl
Algorithm Hash digest
SHA256 1edeea9fc9c207eb3f3c7853fda47dd7dada791f352dd3465c2533e15d170eaa
MD5 9c820eacbc20d56f68549000104bd842
BLAKE2b-256 4b740ba05903b21287a5bf0283f05ca6a4acdf9b20bc61212399e1dd4a87a800

See more details on using hashes here.
