VMC framework for TensorFlow

Project description

FlowKet - A Python framework for variational Monte-Carlo simulations on top of TensorFlow

FlowKet is our framework for running variational Monte-Carlo simulations of quantum many-body systems. It supports any Keras model for representing a parameterized unnormalized wave-function, e.g., Restricted Boltzmann Machines and ConvNets, with real- or complex-valued parameters. We have implemented a standard Markov-Chain Monte-Carlo (MCMC) energy-gradient estimator for this general case, which can be used to approximate the ground state of a quantum system under a given Hamiltonian. The neural-network-based approach for representing wave-functions has been shown to be a promising method for solving the many-body problem, often matching or even surpassing the precision of competing methods.
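
As a minimal sketch of what such a Keras wave-function model can look like, the snippet below defines a small real-valued network that maps a spin configuration to the log of an unnormalized amplitude. The 16-site chain, layer sizes, and names are illustrative assumptions, not FlowKet's API; wiring a model like this into FlowKet's optimizers is shown in the example files.

    # A minimal sketch, not FlowKet's API: a Keras model mapping a spin
    # configuration to the log of an unnormalized wave-function amplitude.
    # The 16-site chain and layer sizes are illustrative assumptions.
    from tensorflow.keras import layers, Model

    num_sites = 16  # assumed 1D chain length

    spins = layers.Input(shape=(num_sites,), name='spin_configuration')  # entries in {-1, +1}
    hidden = layers.Dense(32, activation='relu')(spins)
    hidden = layers.Dense(32, activation='relu')(hidden)
    log_psi = layers.Dense(1, name='log_amplitude')(hidden)  # log of the unnormalized amplitude

    wave_function = Model(inputs=spins, outputs=log_psi)
    wave_function.summary()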

In addition to the MCMC energy-gradient estimator, we have also implemented our novel Neural Autoregressive Quantum State wave-function representation, which supports efficient and exact sampling (a schematic sketch of this sampling idea follows the citation below). By removing the reliance on MCMC, our models can converge much faster than MCMC-based models of the same size, which allows us to scale them to millions of parameters, as opposed to just a few thousand for prior approaches. This leads to better precision and the ability to investigate larger and more intricate systems. Please read our paper (arXiv version), cited below, for further details on this approach. We hope that users of our library will be able to apply our method to a variety of problems. If you use this codebase or apply our method, we would appreciate it if you cited us as follows:

@article{PhysRevLett.124.020503,
  title = {Deep Autoregressive Models for the Efficient Variational Simulation of Many-Body Quantum Systems},
  author = {Sharir, Or and Levine, Yoav and Wies, Noam and Carleo, Giuseppe and Shashua, Amnon},
  journal = {Phys. Rev. Lett.},
  volume = {124},
  issue = {2},
  pages = {020503},
  numpages = {6},
  year = {2020},
  month = {Jan},
  publisher = {American Physical Society},
  doi = {10.1103/PhysRevLett.124.020503},
  url = {https://link.aps.org/doi/10.1103/PhysRevLett.124.020503}
}
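
As a schematic sketch of the exact autoregressive sampling idea described above (not FlowKet's implementation), the snippet below draws each spin in turn from a conditional distribution; conditional_up is a hypothetical placeholder standing in for a neural network's output. Because every configuration is drawn directly from the model's own distribution, there is no Markov chain, burn-in, or autocorrelation between samples.

    # A schematic sketch of exact autoregressive sampling; not FlowKet's implementation.
    # conditional_up is a hypothetical placeholder for a network that returns
    # P(s_i = +1 | s_1, ..., s_{i-1}); here it is a constant for illustration.
    import numpy as np

    def conditional_up(prefix):
        return 0.5  # a real model would compute this from the previously sampled spins

    def sample_configuration(num_sites):
        spins = []
        for _ in range(num_sites):
            p_up = conditional_up(spins)  # conditional probability for the next spin
            spins.append(+1 if np.random.random() < p_up else -1)
        return np.array(spins)

    print(sample_configuration(16))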

Installation

FlowKet assumes TensorFlow is already part of the environment. We currently support TensorFlow 1.10-1.14, but plan to extend support to all versions >= 1.10 as well as 2.0.
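
As a quick sanity check before installing, you can print the installed TensorFlow version; this is just standard TensorFlow usage, not a FlowKet command.

    import tensorflow as tf
    print(tf.__version__)  # should currently fall in the supported 1.10-1.14 range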

The recommended way to install FlowKet is via PyPI:

pip install flowket

Alternatively, if you wish to work on extending our library, you can clone the project and instead run:

pip install -e /path/to/local/repo

Basic Tutorial

While we work on a proper tutorial for the framework, we suggest going through the example files.

Download files

Download the file for your platform.

Source Distribution

flowket-0.2.3.tar.gz (56.4 kB)

Uploaded Source

Built Distribution

flowket-0.2.3-py3-none-any.whl (89.1 kB)

Uploaded Python 3

File details

Details for the file flowket-0.2.3.tar.gz.

File metadata

  • Download URL: flowket-0.2.3.tar.gz
  • Upload date:
  • Size: 56.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.0 requests/2.24.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.9.0

File hashes

Hashes for flowket-0.2.3.tar.gz

  • SHA256: dd4cea85b2b3dcc5d02b7368a6b644d6cd5f733cd4a0159b28aa32b662148ec1
  • MD5: f1b72c7f85c5ecffed442437faf564d7
  • BLAKE2b-256: 82385db1ef51ae2f73a9c598ce05b13b21890e2465c4940b93eed2dd40afbaa6

See more details on using hashes here.
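
For example, the downloaded archive can be checked against the SHA256 digest above with Python's standard hashlib module; the file path below assumes the archive was saved in the current working directory.

    # Verify the downloaded archive against the SHA256 digest listed above.
    import hashlib

    with open('flowket-0.2.3.tar.gz', 'rb') as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    print(digest == 'dd4cea85b2b3dcc5d02b7368a6b644d6cd5f733cd4a0159b28aa32b662148ec1')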

File details

Details for the file flowket-0.2.3-py3-none-any.whl.

File metadata

  • Download URL: flowket-0.2.3-py3-none-any.whl
  • Upload date:
  • Size: 89.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.0 requests/2.24.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.9.0

File hashes

Hashes for flowket-0.2.3-py3-none-any.whl

  • SHA256: b1053827ef8e08a46b1b2ddb0995755189e6c7865675093e84ecef36762efd0f
  • MD5: decc9c08a7ac34c1bf04aae04236c94f
  • BLAKE2b-256: 90fbdb13316a6b750a1a957b9b61a60ddb0274a12682bf021daf24234e8d182a

See more details on using hashes here.
