
Lightweight environments to study hierarchical reasoning


HierarchyCraft - Environment builder for hierarchical reasoning research


HierarchyCraft

HierarchyCraft (hcraft for short) is a Python library designed to create arbitrary hierarchical environments that are compatible with both the OpenAI Gym Reinforcement Learning Framework and the AIPlan4EU Unified Planning Framework. This library enables users to easily create complex hierarchical structures that can be used to test and develop various reinforcement learning or planning algorithms.

In environments built with HierarchyCraft, the agent (player) has an inventory and can navigate between abstract zones that themselves have inventories.

The action space of HierarchyCraft environments consists of sub-tasks, referred to as Transformations, as opposed to detailed movements and controls. Each Transformation has specific requirements to be valid (e.g. having enough of an item, being in the right place), and these requirements may necessitate the execution of other Transformations first, inherently creating a hierarchical structure in HierarchyCraft environments.

This concept is visually represented by the Requirements graph depicting the hierarchical relationships within each HierarchyCraft environment. The Requirements graph is directly constructed from the list of Transformations composing the environment.

More details about requirements graphs can be found in the documentation at hcraft.requirements, and examples of requirements graphs for some HierarchyCraft environments can be found in hcraft.examples.
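For instance, the Transformations of a prebuilt environment can be listed directly from its world (a small sketch reusing the MineHcraftEnv example described in the quickstart below):

from hcraft.examples import MineHcraftEnv

env = MineHcraftEnv()
# Each action corresponds to one Transformation (a sub-task);
# listing a few of them shows the building blocks of the hierarchy.
for transformation in env.world.transformations[:5]:
    print(transformation)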

No feature extraction for fast research even with low compute

HierarchyCraft returns vectorized state information, which plainly and directly describes the player's inventory, the current position, and the inventory of the current zone. Compared to benchmarks that return grids, pixel arrays, text or sound, HierarchyCraft directly returns a low-dimensional latent representation that doesn't need to be learned, saving compute time and allowing researchers to focus only on the hierarchical reasoning part.

See hcraft.state for more details.
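A quick way to see this, assuming the observation is a plain numpy vector as described above (a minimal sketch reusing the MineHcraftEnv example from the quickstart):

from hcraft.examples import MineHcraftEnv

env = MineHcraftEnv()
observation = env.reset()
# The observation is a flat vector describing the player's inventory,
# the current position, and the current zone's inventory.
print(observation.shape)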

Create your own tailored HierarchyCraft environments

You can use HierarchyCraft to create various custom hierarchical environments from a list of customized Transformations.

See hcraft.env for a complete tutorial on creating custom environments.
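As a rough, unverified illustration only (the import paths and keyword names below are assumptions and may differ from the actual API; the hcraft.env tutorial is authoritative), a custom Transformation could look like:

# Assumed API: module paths and keyword names may not match hcraft exactly.
from hcraft.elements import Item
from hcraft.transformation import Transformation, Use, Yield, PLAYER

WOOD = Item("wood")
PLANK = Item("plank")

# "Craft a plank" is only valid if the player already holds wood,
# which is what creates the hierarchy between Transformations.
CRAFT_PLANK = Transformation(
    inventory_changes=[Use(PLAYER, WOOD, consume=1), Yield(PLAYER, PLANK)]
)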

Installation

Using pip

Without optional dependencies:

pip install hcraft

With all optional dependencies:

pip install hcraft[all]

All hcraft environments can use a common graphical user interface, available with the gui extra:

pip install hcraft[gui]

Gym environments can be obtained through the gym extra:

pip install hcraft[gym]

Planning problems can be obtained through the UPF interface, included in the planning extra:

pip install hcraft[planning]

Quickstart

Play yourself!

A player who knows Minecraft will find MineHcraft easy.

Install the graphical user interface optional dependencies:

pip install hcraft[gui]

Using the command line interface

You can directly play with the GUI available for any HierarchyCraft environment, for example:

hcraft minecraft

For more examples:

hcraft --help

Using the programmatic interface:

from hcraft import get_human_action
from hcraft.examples import MineHcraftEnv

env = MineHcraftEnv()
# or env: MineHcraftEnv = gym.make("MineHcraft-NoReward-v1")
n_episodes = 2
for _ in range(n_episodes):
    env.reset()
    done = False
    total_reward = 0
    while not done:
        env.render()
        action = get_human_action(env)
        print(f"Human pressed: {env.world.transformations[action]}")

        _observation, reward, done, _info = env.step(action)
        total_reward += reward

    print(f"SCORE: {total_reward}")

As a Gym RL environment

Using the programmatic interface, any HierarchyCraft environment can easily be interfaced with classic reinforcement learning agents.

import numpy as np
from hcraft.examples import MineHcraftEnv

def random_legal_agent(observation, action_is_legal):
    action = np.random.choice(np.nonzero(action_is_legal)[0])
    return int(action)

env = MineHcraftEnv(max_step=10)
done = False
observation = env.reset()
while not done:
    action_is_legal = env.action_masks()
    action = random_legal_agent(observation, action_is_legal)
    _observation, _reward, done, _info = env.step(action)

# Other examples of HierarchyCraft environments
from hcraft.examples import TowerHcraft, RecursiveHcraft, RandomHcraft

tower_env = TowerHcraft(height=3, width=2)
# or tower_env = gym.make("TowerHcraft-v1", height=3, width=2)
recursive_env = RecursiveHcraft(n_items=6)
# or recursive_env = gym.make("RecursiveHcraft-v1", n_items=6)
random_env = RandomHcraft(n_items_per_n_inputs={0:2, 1:5, 2:10}, seed=42)
# or random_env = gym.make("RandomHcraft-v1", n_items_per_n_inputs={0:2, 1:5, 2:10}, seed=42)

See hcraft.env for a more complete description.
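The size of each hierarchy can be compared by counting Transformations, reusing env.world.transformations from the quickstart (a small sketch with the same constructor arguments as above):

from hcraft.examples import TowerHcraft, RecursiveHcraft, RandomHcraft

for env in (
    TowerHcraft(height=3, width=2),
    RecursiveHcraft(n_items=6),
    RandomHcraft(n_items_per_n_inputs={0: 2, 1: 5, 2: 10}, seed=42),
):
    # More Transformations generally means a deeper or wider hierarchy.
    print(type(env).__name__, len(env.world.transformations))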

As a UPF problem for planning

HierarchyCraft environments can be converted to a planning problem in one line thanks to the Unified Planning Framework (UPF):

problem = env.planning_problem()
print(problem.upf_problem)

Then they can be solved with any compatible planner (the default is ENHSP):

problem.solve()
print(problem.plan)
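Putting the two snippets together with one of the example environments above (a minimal end-to-end sketch, assuming the planning extra is installed so the default planner is available):

from hcraft.examples import TowerHcraft

env = TowerHcraft(height=3, width=2)
# Convert the environment to a UPF planning problem, solve it, and print the plan.
problem = env.planning_problem()
problem.solve()
print(problem.plan)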

See hcraft.planning for a more complete description.

More about HierarchyCraft

Online documentation

Learn more in the DOCUMENTATION

