A powerful & flexible Rubik's Cube solver
Project description
AlphaCube
AlphaCube is a powerful & flexible Rubik's Cube solver that extends EfficientCube. It uses a Deep Neural Network (DNN) to find optimal/near-optimal solutions for a given scrambled state.
[!NOTE] 🎮 Try the interactive demo: alphacube.dev
Use Cases
- Solve any scrambled Rubik's Cube configuration with ease.
- Find efficient algorithms, optimizing for either solution length or ergonomic move sequences.
- Incorporate solving capabilities into custom Rubik's Cube applications and tools.
- Analyze the statistical properties and solution space of the Rubik's Cube.
- Illustrate AI/ML concepts such as self-supervised learning and heuristic search.
Installation
Open a terminal and execute the following command:
```bash
pip install -U alphacube
```
Usage
The first time you run alphacube.load(), the required model data will be downloaded and cached on your system for future use.
Basic Usage
```python
import alphacube

# Load a pre-trained model (defaults to "small" on CPU, "large" on GPU)
alphacube.load()

# Solve a scramble
result = alphacube.solve(
    scramble="D U F2 L2 U' B2 F2 D L2 U R' F' D R' F' U L D' F' D R2",
    beam_width=1024,  # Number of candidate solutions to consider at each depth of search
)
print(result)
```
Output
```python
{
    'solutions': ["D L D2 R' U2 D B' D' U2 B U2 B' U' B2 D B2 D' B2 F2 U2 F2"],
    'num_nodes': 19744,          # Total search nodes explored
    'time': 1.4068585219999659   # Time in seconds
}
```
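The return value is a plain Python dictionary, so its fields can be read directly. A minimal sketch, picking out the first solution from the result shown above:

```python
# result is the dict returned by alphacube.solve() above
best = result["solutions"][0]         # first returned solution (a move string)
print(best)
print("Length:", len(best.split()))   # number of moves in the Half-Turn Metric
```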
Better Solutions
Increasing beam_width makes the search more exhaustive, yielding shorter solutions at the cost of extra compute:
```python
result = alphacube.solve(
    scramble="D U F2 L2 U' B2 F2 D L2 U R' F' D R' F' U L D' F' D R2",
    beam_width=65536,
)
print(result)
```
Output
```python
{
    'solutions': [
        "D' R' D2 F' L2 F' U B F D L D' L B D2 R2 F2 R2 F'",
        "D2 L2 R' D' B D2 B' D B2 R2 U2 L' U L' D' U2 R' F2 R'"
    ],
    'num_nodes': 968984,
    'time': 45.690575091997744
}
```
beam_width values between 1024 and 65536 typically offer a good trade-off between solution quality and speed. Tune according to your needs.
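To get a feel for this trade-off on your own hardware, you can sweep a few beam_width values and compare solution length against runtime. This is only a sketch built on the solve() call shown above; the specific widths are illustrative:

```python
import alphacube

alphacube.load()
scramble = "D U F2 L2 U' B2 F2 D L2 U R' F' D R' F' U L D' F' D R2"

# Wider beams keep more candidates per depth: shorter solutions, longer runtimes.
for beam_width in (256, 1024, 4096):
    result = alphacube.solve(scramble=scramble, beam_width=beam_width)
    shortest = min(len(s.split()) for s in result["solutions"])
    print(f"beam_width={beam_width}: {shortest} moves, {result['time']:.1f} s, {result['num_nodes']} nodes")
```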
GPU Acceleration
For maximum performance, use the "large" model on a GPU (or Apple's MPS backend if you have a Mac).
```python
alphacube.load("large")

result = alphacube.solve(
    scramble="D U F2 L2 U' B2 F2 D L2 U R' F' D R' F' U L D' F' D R2",
    beam_width=65536,
)
print(result)
```
Output
```python
{
    'solutions': ["D F L' F' U2 B2 U F' L R2 B2 U D' F2 U2 R D'"],
    'num_nodes': 903448,
    'time': 20.46845487099995
}
```
[!IMPORTANT] When running on a CPU, the default "small" model is recommended. The "base" and "large" models are significantly slower without a GPU.
Please refer to our documentation for more details, especially the "Getting Started" guide.
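If your code may run on machines with or without a GPU, one option is to pick the model at runtime. The sketch below assumes PyTorch is installed in the environment (not stated above, but a typical dependency for a DNN-based solver); torch.cuda.is_available() and torch.backends.mps.is_available() are standard PyTorch checks:

```python
import torch      # assumption: PyTorch is present in the environment
import alphacube

# Use the "large" model only when a CUDA GPU or Apple's MPS backend is available;
# otherwise fall back to the default, CPU-friendly "small" model.
if torch.cuda.is_available() or torch.backends.mps.is_available():
    alphacube.load("large")
else:
    alphacube.load()
```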
Applying Ergonomic Bias
The ergonomic_bias parameter can influence the solver to prefer certain types of moves, generating solutions that might be easier to perform.
```python
# Define desirability for each move type (higher is more desirable)
ergonomic_bias = {
    "U": 0.9, "U'": 0.9, "U2": 0.8,
    "R": 0.8, "R'": 0.8, "R2": 0.75,
    "L": 0.55, "L'": 0.4, "L2": 0.3,
    "F": 0.7, "F'": 0.6, "F2": 0.6,
    "D": 0.3, "D'": 0.3, "D2": 0.2,
    "B": 0.05, "B'": 0.05, "B2": 0.01,
    "u": 0.45, "u'": 0.45, "u2": 0.4,
    "r": 0.3, "r'": 0.3, "r2": 0.25,
    "l": 0.2, "l'": 0.2, "l2": 0.15,
    "f": 0.35, "f'": 0.3, "f2": 0.25,
    "d": 0.15, "d'": 0.15, "d2": 0.1,
    "b": 0.03, "b'": 0.03, "b2": 0.01,
}

result = alphacube.solve(
    scramble="D U F2 L2 U' B2 F2 D L2 U R' F' D R' F' U L D' F' D R2",
    beam_width=65536,
    ergonomic_bias=ergonomic_bias,
)
print(result)
```
Output
```python
{
    'solutions': [
        "u' U' f' R2 U2 R' L' F' R D2 f2 R2 U2 R U L' U R L",
        "u' U' f' R2 U2 R' L' F' R D2 f2 R2 U2 R d F' U f F",
        "u' U' f' R2 U2 R' L' F' R u2 F2 R2 D2 R u f' l u U"
    ],
    'num_nodes': 1078054,
    'time': 56.13087955299852
}
```
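Because solutions come back as plain move strings, you can score them against the same ergonomic_bias table to decide which one suits you best. A small sketch over the objects defined above; the averaging scheme is just one illustrative choice, not something AlphaCube does internally:

```python
def ergonomic_score(solution: str, bias: dict) -> float:
    """Average desirability of the moves in a solution string."""
    moves = solution.split()
    return sum(bias[m] for m in moves) / len(moves)

# Rank the returned solutions from most to least comfortable
for s in sorted(result["solutions"], key=lambda s: ergonomic_score(s, ergonomic_bias), reverse=True):
    print(f"{ergonomic_score(s, ergonomic_bias):.3f}  {s}")
```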
How It Works
At its core, AlphaCube uses the deep learning method from "Self-Supervision is All You Need for Solving Rubik's Cube" (TMLR'23), the official code for which is available at kyo-takano/efficientcube.
The provided models ("small", "base", and "large") are compute-optimally trained in the Half-Turn Metric. This means model size and training data were scaled together to maximize prediction accuracy for a given computational budget, as detailed in the paper.
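In practice, the DNN predicts a probability for each candidate next move, and the solver runs a beam search over those predictions; beam_width is the number of candidate move sequences kept at each depth. The toy sketch below illustrates the general idea of beam search on an abstract state space. It is not AlphaCube's actual implementation, and predict_moves / apply_move / is_solved are hypothetical placeholders:

```python
def beam_search(start_state, is_solved, predict_moves, apply_move, beam_width, max_depth):
    """Generic beam search: keep the `beam_width` most probable paths at each depth.

    `predict_moves(state)` stands in for the DNN and yields (move, probability) pairs;
    `apply_move(state, move)` returns the resulting state.
    """
    beam = [(1.0, start_state, [])]  # (cumulative probability, state, move sequence)
    for _ in range(max_depth):
        candidates = []
        for prob, state, path in beam:
            for move, p in predict_moves(state):
                candidates.append((prob * p, apply_move(state, move), path + [move]))
        # Prune: keep only the `beam_width` most probable candidates
        beam = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
        for prob, state, path in beam:
            if is_solved(state):
                return path
    return None
```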
[!NOTE] 📖 Read more: "How It Works" on our documentation site.
Contributing
You are welcome to collaborate on AlphaCube! Please read our Contributing Guide to get started.
License
AlphaCube is open source under the MIT License.
Download files
Source Distributions
Built Distribution
File details
Details for the file alphacube-0.1.5-py3-none-any.whl.
File metadata
- Download URL: alphacube-0.1.5-py3-none-any.whl
- Upload date:
- Size: 27.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.1
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c5995262982a949690f008aa2beb74f81b094a61090a696dc622c470ed24c10e` |
| MD5 | `ed0b6e361f49988d6ce4aeb1458eede5` |
| BLAKE2b-256 | `883eb938ff762e2a1e7d385f5e535c321cfa456e58c8fddd69ffeb4926097543` |
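To check a downloaded wheel against the SHA256 digest listed above, here is a quick sketch using Python's standard hashlib (the file path is whatever location you downloaded the wheel to):

```python
import hashlib

path = "alphacube-0.1.5-py3-none-any.whl"   # adjust to your local download path
expected = "c5995262982a949690f008aa2beb74f81b094a61090a696dc622c470ed24c10e"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "MISMATCH", digest)
```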