Soft-robotics control environment package for OpenAI Gym

Soft-Robot Control Environment (gym-softrobot)

The environment is designed to apply reinforcement learning methods to soft-robotics control. Our inspiration comes from slender-body living creatures, such as the octopus or the snake. The code is based on PyElastica, an open-source physics simulator for slender structures. We intend this package to be easy to install and fully compatible with OpenAI Gym.

The package is under development, in the Pre-Alpha phase. We plan to complete the initial set of environments by the end of January 2022.

Requirements:

  • Python 3.8+
  • OpenAI Gym 0.21.0
  • PyElastica 0.2+
  • Matplotlib (optional for display rendering and plotting)
  • POVray (optional for 3D rendering)
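To confirm that the required packages are present, you can query installed versions with the standard library (a minimal sketch, using `importlib.metadata`, available since Python 3.8; the PyPI distribution names below are assumptions):

```python
from importlib.metadata import version, PackageNotFoundError  # Python 3.8+

def installed_version(pkg):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# Assumed PyPI distribution names for the requirements listed above.
for pkg in ("gym", "pyelastica", "matplotlib"):
    print(pkg, installed_version(pkg) or "not installed")
```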

Please use this BibTeX entry to cite us in your publications:

@misc{gym_softrobot,
  author = {Chia-Hsien Shih and Seung Hyun Kim and Mattia Gazzola},
  title = {Soft Robotics Environment for OpenAI Gym},
  year = {2022},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/skim0119/gym-softrobot}},
}

Installation

pip install gym-softrobot

To test the installation, you can run a couple of steps of the environment as follows.

import gym
import gym_softrobot

# Create the environment with a centralized control policy.
env = gym.make('OctoFlat-v0', policy_mode='centralized')

for episode in range(2):
    observation = env.reset()
    for step in range(50):
        # Sample a random action from the action space.
        action = env.action_space.sample()
        observation, reward, done, info = env.step(action)
        print(f"{episode=:2} |{step=:2}, {reward=}, {done=}")
        if done:
            break
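The loop above unpacks the 4-tuple `(observation, reward, done, info)` returned by `step()` in the OpenAI Gym 0.21 API. A toy stand-in environment (hypothetical, purely for illustration; not part of gym-softrobot) shows the same interface without installing the package:

```python
import random

class StubEnv:
    """Toy environment mimicking the Gym 0.21 interface used above."""

    def reset(self):
        self.t = 0
        return 0.0  # initial observation

    def step(self, action):
        self.t += 1
        observation = float(self.t)
        reward = random.random()
        done = self.t >= 5  # episode ends after 5 steps
        info = {}
        return observation, reward, done, info

env = StubEnv()
obs = env.reset()
done = False
steps = 0
while not done:
    obs, reward, done, info = env.step(action=None)
    steps += 1
print(f"episode finished after {steps} steps")
```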

We use Vapory, a Python wrapper for POVray, to visualize the motion in 3D. POVray is not required to run the environment, but it is needed for the env.render() function, as in a typical gym environment.
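Since POVray is a system-level dependency, you can check up front whether it is available before relying on env.render() (a minimal sketch; the executable name `povray` is an assumption that holds on typical Linux installs):

```python
import shutil

def povray_available():
    """Return True if a 'povray' executable is found on PATH."""
    return shutil.which("povray") is not None

if povray_available():
    print("POVray found; env.render() should work.")
else:
    print("POVray not found; env.render() will be unavailable.")
```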

Reinforcement Learning Example

We tested the environment using Stable Baselines3 for centralized control. More advanced algorithms are still under development.
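A centralized-control training run with Stable Baselines3 might look like the following. This is a hedged sketch, not the authors' setup: the algorithm choice (PPO), policy ("MlpPolicy"), and timestep budget are illustrative assumptions, and the code only runs when the dependencies are installed.

```python
# Hypothetical training sketch; PPO and its hyperparameters are illustrative.
try:
    import gym
    import gym_softrobot
    from stable_baselines3 import PPO
    HAVE_DEPS = True
except ImportError:
    HAVE_DEPS = False  # gym-softrobot or stable-baselines3 not installed

if HAVE_DEPS:
    env = gym.make('OctoFlat-v0', policy_mode='centralized')
    model = PPO("MlpPolicy", env, verbose=1)
    model.learn(total_timesteps=10_000)
    model.save("ppo_octoflat")
```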

If you have your own algorithm that you would like to test out, you are welcome to reach out to us.

Environment Design

Included Environments

Octopus [multi-arm control]

  • OctoFlat-v0 [Pre-Alpha]
  • OctoReach-v0 [Work in Progress]
  • OctoSwim-v0 [Work in Progress]
  • OctoHunt-v0 [Work in Progress]

Snake

  • ContinuumSnake-v0 [Pre-Alpha]

Simple Control

Contribution

We are currently developing the package internally.

