
Soft-robotics control environment package for OpenAI Gym

Project description

Soft-robot Control Environment (gym-softrobot)

The environment is designed to bring reinforcement learning methods to soft-robotics control, inspired by slender-body living creatures. The code is built on PyElastica, an open-source physics simulator for slender structures. We intend this package to be easy to install and fully compatible with OpenAI Gym.

Requirements:

  • Python 3.8+
  • OpenAI Gym
  • PyElastica 0.2+
  • Matplotlib (optional, for display rendering and plotting)

Please use this BibTeX entry to cite gym-softrobot in your publications:

@misc{gym_softrobot,
  author = {Chia-Hsien Shih and Seung Hyun Kim and Mattia Gazzola},
  title = {Soft Robotics Environment for OpenAI Gym},
  year = {2022},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/skim0119/gym-softrobot}},
}

Installation

pip install gym-softrobot
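
To check the installation, the snippet below creates one of the environments through the standard Gym API. The environment ID "OctoFlat-v0" is an assumption based on the octo-flat task listed under Environment Design; check the repository for the exact registered names.

# Minimal installation check; importing gym_softrobot registers its environments with Gym.
import gym
import gym_softrobot  # noqa: F401

# "OctoFlat-v0" is an assumed ID for the octo-flat task listed below.
env = gym.make("OctoFlat-v0")
obs = env.reset()
print("observation space:", env.observation_space)
print("action space:", env.action_space)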

Reinforcement Learning Example

We tested the environments using Stable Baselines3 for centralized control; a minimal training sketch is shown below. More advanced algorithms are still under development.
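
As a rough illustration, the sketch below trains a PPO agent with Stable Baselines3 on one of the octopus environments. The environment ID and hyperparameters are placeholders, not the settings used in our experiments, and the classic Gym step API is assumed.

# Illustrative Stable Baselines3 sketch; requires `pip install stable-baselines3`.
import gym
import gym_softrobot  # noqa: F401  (importing registers the environments)
from stable_baselines3 import PPO

env = gym.make("OctoFlat-v0")          # assumed ID for the octo-flat task
model = PPO("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=100_000)   # placeholder training budget
model.save("ppo_octo_flat")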

Environment Design

Included Environments

Octopus [Multi-arm control] (a usage sketch follows this list)

  • octo-flat [2D]
  • octo-reach
  • octo-swim

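As a sketch of how these environments are used, the loop below runs one episode with random actions. The environment ID and the four-value return from step() are assumptions (newer Gym releases return five values).

# Random-action rollout sketch; assumes the classic Gym step API (obs, reward, done, info).
import gym
import gym_softrobot  # noqa: F401

env = gym.make("OctoFlat-v0")  # assumed ID; substitute any registered gym-softrobot environment
obs = env.reset()
done = False
episode_return = 0.0
while not done:
    action = env.action_space.sample()          # random multi-arm action
    obs, reward, done, info = env.step(action)
    episode_return += reward
print("episode return:", episode_return)
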
Contribution

We are currently developing the package internally.

