
QuTree: A tree tensor network package


pyQuTree


A lightweight Python version of the tree tensor network library QuTree[^1], currently centered around optimization.

Documentation

Full documentation is available at https://pyqutree-ttn.readthedocs.io

Installation

Install pyQuTree from PyPI:

pip install pyqutree

Or install the latest development version from GitHub:

pip install git+https://github.com/roman-ellerbrock/pyQuTree.git

For developers, create a conda environment via:

conda env create --file environment.yml  # or: conda env update --file environment.yml
conda activate qutree

Usage

High-Level Interface

For quick optimization of arbitrary functions, use the convenience interface:

from qutree import optimize_function

# Define your function with named parameters
def rosenbrock(x, y):
    return (1 - x)**2 + 100*(y - x**2)**2

# Define parameter bounds
bounds = {'x': (-2, 2), 'y': (-1, 3)}

# Optimize!
result = optimize_function(rosenbrock, bounds)
print(f"Optimal x={result['x']['x']:.3f}, y={result['x']['y']:.3f}")
print(f"Minimum value: {result['fun']:.6f}")
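As a sanity check, the Rosenbrock function's global minimum sits at (x, y) = (1, 1) with value 0, so the reported optimum should land close to that. You can verify the analytic minimum directly (plain Python, no pyqutree required):

```python
def rosenbrock(x, y):
    return (1 - x)**2 + 100*(y - x**2)**2

# The only point where both squared terms vanish is x = 1, y = 1.
print(rosenbrock(1.0, 1.0))  # 0.0
```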

Practical Example - Hyperparameter Tuning:

from qutree import optimize_function
import numpy as np

# Define your objective (e.g., validation error as a function of hyperparameters)
def model_error(learning_rate, batch_size, dropout_rate, l2_reg):
    """Simulate model validation error (replace with your actual model training)."""
    # This is a placeholder - replace with your actual model training/validation
    error = (learning_rate - 0.001)**2 + (batch_size - 64)**2 / 1000
    error += (dropout_rate - 0.3)**2 + (l2_reg - 0.01)**2
    return error

# Define hyperparameter search space
bounds = {
    'learning_rate': (1e-5, 1e-1),
    'batch_size': (16, 128),
    'dropout_rate': (0.0, 0.5),
    'l2_reg': (1e-6, 1e-1)
}

# Use different grid resolutions for different parameters
# (finer grid for parameters you want to optimize more precisely)
grid_points = {
    'learning_rate': 25,  # Fine grid for learning rate
    'batch_size': 15,     # Coarse grid for batch size
    'dropout_rate': 11,   # Coarse grid for dropout
    'l2_reg': 21          # Medium grid for regularization
}

# Optimize hyperparameters
result = optimize_function(model_error, bounds, grid_points=grid_points, n_sweeps=5)

print("Best hyperparameters:")
for param, value in result['x'].items():
    print(f"  {param}: {value:.6f}")
print(f"Best validation error: {result['fun']:.6f}")
print(f"Function evaluations: {result['n_calls']}")
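The grid_points values set how finely each bound is discretized. As a rough standalone sketch of what that implies (plain NumPy with uniformly spaced axes; pyQuTree's internal grid construction may differ), the search space above corresponds to one axis per parameter:

```python
import numpy as np

bounds = {
    'learning_rate': (1e-5, 1e-1),
    'batch_size': (16, 128),
    'dropout_rate': (0.0, 0.5),
    'l2_reg': (1e-6, 1e-1),
}
grid_points = {'learning_rate': 25, 'batch_size': 15, 'dropout_rate': 11, 'l2_reg': 21}

# One discretized axis per parameter; the full product grid (which the
# tensor network avoids materializing) would hold 25*15*11*21 = 86625 points.
axes = {name: np.linspace(lo, hi, grid_points[name]) for name, (lo, hi) in bounds.items()}
for name, axis in axes.items():
    print(f"{name}: {axis.size} points from {axis[0]:g} to {axis[-1]:g}")
```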

Low-Level Interface

You can also use the low-level tree tensor network API directly:

from qutree import *
import numpy as np

def V(x):
    # replace with your objective function
    return np.sum((x - np.ones(x.shape[0]))**2)

N, r, f, nsweep = 21, 4, 3, 6  # grid points per dimension, rank, number of variables, sweeps

objective = Objective(V)

# create a tensor network, e.g. a balanced tree
tn = balanced_tree(f, r, N)

# Create a primitive grid: the same N-point axis for each of the f variables
primitive_grid = [linspace(-1., 3., N)] * f

# tensor network optimization
tn_updated = ttnopt(tn, objective, nsweep, primitive_grid)
print(objective)
dataframe = objective.logger.df
print(dataframe)
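As a standalone check of the objective above (NumPy only, no pyqutree): the minimum of V lies at x = (1, 1, 1), which falls on the 21-point grid from -1 to 3, so the discretized optimum can reach V = 0 up to floating-point rounding:

```python
import numpy as np

def V(x):
    return np.sum((x - np.ones(x.shape[0]))**2)

N, f = 21, 3
grid = np.linspace(-1., 3., N)  # same primitive grid as above, spacing 0.2

# x = 1 lies on the grid (index 10), so the best grid point
# coincides with the true minimizer and V evaluates to ~0.
i = int(np.argmin(np.abs(grid - 1.0)))
x_best = np.full(f, grid[i])
print(x_best, V(x_best))
```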

For detailed tutorials and usage examples, see the documentation. More examples can be found in examples/ttopt_example.ipynb.

If QuTree was useful in your work, please consider citing the paper[^1].

References

[^1] Roman Ellerbrock, K. Grace Johnson, Stefan Seritan, Hannes Hoppe, J. H. Zhang, Tim Lenzen, Thomas Weike, Uwe Manthe, Todd J. Martínez; QuTree: A tree tensor network package. J. Chem. Phys. 21 March 2024; 160 (11): 112501. https://doi.org/10.1063/5.0180233

[^2] The present tree tensor network version was created by the author and is currently unpublished. It is inspired by Ivan Oseledets, Eugene Tyrtyshnikov, TT-cross approximation for multidimensional arrays, Linear Algebra and its Applications, Volume 432, Issue 1, 2010, Pages 70-88, https://doi.org/10.1016/j.laa.2009.07.024.
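For intuition, the cross (skeleton) approximation idea behind the TT-cross reference in [^2] can be illustrated on a tiny rank-1 matrix (a toy NumPy sketch, not pyQuTree's actual algorithm): a rank-1 matrix is recovered exactly from a single pivot row and column.

```python
import numpy as np

# Rank-1 matrix: A[i, j] = u[i] * w[j]
u = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
A = np.outer(u, w)

# Cross (skeleton) approximation: pick one pivot row i0 and column j0;
# for an exactly rank-1 matrix the reconstruction is exact.
i0, j0 = 0, 0
A_approx = np.outer(A[:, j0], A[i0, :]) / A[i0, j0]
print(np.allclose(A, A_approx))  # True
```

Higher ranks use several pivot rows and columns chosen adaptively; TT-cross applies this idea dimension by dimension so the full tensor is never materialized.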
