
Multivariate function optimizer based on the tensor train approach.


ttopt

Description

Gradient-free optimization method for multivariable functions based on the low-rank tensor train (TT) format and the maximal-volume principle.
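For intuition only (a generic NumPy illustration, not the ttopt API), the TT format represents a d-dimensional tensor through small per-mode cores; in the simplest rank-1 case the tensor is an outer product of vectors, so storage grows linearly with d instead of exponentially:

```python
import numpy as np

# Rank-1 TT representation of a 3D tensor: one vector ("core") per mode.
# The full tensor is the outer product of the cores.
u, v, w = np.arange(1, 5.), np.arange(1, 4.), np.arange(1, 3.)

full = np.einsum('i,j,k->ijk', u, v, w)   # dense tensor: 4 * 3 * 2 = 24 numbers
tt_storage = u.size + v.size + w.size     # TT cores: 4 + 3 + 2 = 9 numbers

print(full.shape, tt_storage)             # (4, 3, 2) 9

# Any entry is recoverable from the cores without forming the full tensor:
assert full[2, 1, 0] == u[2] * v[1] * w[0]
```

In general the TT format uses higher ranks and 3D cores, but the storage advantage it exploits is the same.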

Please see also our software product teneva, which provides a very compact implementation of the basic operations in the TT-format.

Installation

You can install the ttopt package for Python >= 3.7 with pip:

pip install ttopt==0.6.2

Examples

The demo scripts with detailed comments are collected in the folder demo:

  • base.py - we find the minimum for a 10-dimensional function with vectorized input;
  • qtt.py - we do almost the same as in the base.py script, but use the QTT-based approach (note that the results are much better than in the base.py example);
  • qtt_max.py - we do almost the same as in the qtt.py script, but consider the maximization task;
  • qtt_100d.py - we do almost the same as in the qtt.py script, but approximate a 100-dimensional function;
  • vect.py - we find the minimum for a simple analytic function with "simple input" (the function is not vectorized);
  • cache.py - we find the minimum for a simple analytic function to demonstrate the usage of the cache;
  • tensor.py - in this example we find the minimum for a multidimensional array/tensor (i.e., a discrete function);
  • tensor_init.py - we do almost the same as in the tensor.py script, but use a special initialization method (instead of a random tensor, we select a set of starting multi-indices for the search).
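As a conceptual baseline (plain NumPy, not the ttopt API), the demos above all reduce to the same discrete problem: evaluate a vectorized function on grid points and track the best value. The brute-force version below enumerates the full grid tensor, which is exactly what becomes infeasible for large d (n**d entries) and what the TT-based search avoids; the function here is an Alpine-like test function chosen for illustration:

```python
import numpy as np

def f(X):
    """Vectorized target: X has shape (samples, d), returns shape (samples,)."""
    return np.sum(np.abs(X * np.sin(X) + 0.1 * X), axis=1)

d, n = 2, 65                           # small d so the full grid fits in memory
grid = np.linspace(-10., 10., n)       # the same 1D grid in every mode

# Brute force: evaluate f on all n**d points of the grid tensor.
X = np.stack(np.meshgrid(grid, grid, indexing='ij'), axis=-1).reshape(-1, d)
y = f(X)
i = np.argmin(y)
print('x_min =', X[i], 'y_min =', y[i])   # the global minimum is at the origin
```

TTOpt replaces this exhaustive sweep with an adaptive sampling of the same grid tensor via the TT format and the maximal-volume principle, so the number of function evaluations stays tractable even for d on the order of 100 (see qtt_100d.py).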

Authors

Citation

If you find this approach and/or code useful in your research, please consider citing:

@article{sozykin2022ttopt,
    author    = {Sozykin, Konstantin and Chertkov, Andrei and Schutski, Roman and Phan, Anh-Huy and Cichocki, Andrzej and Oseledets, Ivan},
    year      = {2022},
    title     = {{TTOpt}: {A} maximum volume quantized tensor train-based optimization and its application to reinforcement learning},
    journal   = {Advances in Neural Information Processing Systems},
    volume    = {35},
    pages     = {26052--26065},
    url       = {https://proceedings.neurips.cc/paper_files/paper/2022/hash/a730abbcd6cf4a371ca9545db5922442-Abstract-Conference.html}
}

Please note that the calculations presented in this paper correspond to versions <0.5.0 of the ttopt package (and to a very old version of the teneva package); to rerun these calculations, please use the appropriate version. In the new versions (>=0.6.0) we have removed the corresponding folders (the folder computations_old). In the future, we will try to update the interface of these experiments.


✭ 🚂 The stars that you give to ttopt motivate us to develop faster and add new interesting features to the code 😃
