
Gaussian Process Temporal Embedding for Protein Simulations and Transitions

Project description

GP-TEMPEST



GP-TEMPEST is a PyTorch implementation of the Gaussian Process Variational Autoencoder (GP-VAE) framework for time-aware dimensionality reduction of molecular dynamics (MD) simulations. The method leverages physics-informed Gaussian Process priors to capture temporal correlations in the latent space, enabling the recovery of hidden or kinetically relevant degrees of freedom in complex biomolecular systems.
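Schematically, training balances reconstruction accuracy against agreement with the temporal GP prior. In standard GP-VAE notation (a sketch of the general objective, not necessarily the exact loss from the paper), the β-weighted objective reads

```latex
\mathcal{L}(\theta,\phi)
  = \mathbb{E}_{q_\phi(\mathbf{z}\mid\mathbf{x})}\!\bigl[\log p_\theta(\mathbf{x}\mid\mathbf{z})\bigr]
  \;-\; \beta\,\mathrm{KL}\!\bigl(q_\phi(\mathbf{z}\mid\mathbf{x}) \,\big\|\, p_{\mathrm{GP}}(\mathbf{z}\mid t)\bigr)
```

where the KL term pulls the latent trajectory toward smooth, temporally correlated paths governed by the chosen kernel.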

Features

  • Physics-informed dimensionality reduction using Gaussian Processes as temporal priors
  • Flexible kernel selection with support for the Matérn kernel (ν = 0.5, 1.5, 2.5)
  • Sparse GP inference with inducing points for scalability to large molecular trajectories
  • Recovery of hidden degrees of freedom not accessible in any projection of the input data
  • Free-energy landscapes and kinetic insight from GP-smoothed, physically interpretable latent coordinates
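For reference, the three supported smoothness values of the Matérn kernel have simple closed forms. The snippet below is a plain-NumPy illustration of the kernel family, not GP-TEMPEST's internal `MaternKernel` implementation:

```python
import numpy as np

def matern(r, scale, nu):
    """Matérn covariance k(r) for the half-integer smoothness values
    nu in {0.5, 1.5, 2.5}; r is the time lag, scale the GP time scale."""
    a = np.abs(r) / scale
    if nu == 0.5:   # exponential kernel: rough, Ornstein-Uhlenbeck-like paths
        return np.exp(-a)
    if nu == 1.5:   # once-differentiable sample paths
        return (1.0 + np.sqrt(3.0) * a) * np.exp(-np.sqrt(3.0) * a)
    if nu == 2.5:   # twice-differentiable sample paths
        return (1.0 + np.sqrt(5.0) * a + 5.0 * a**2 / 3.0) * np.exp(-np.sqrt(5.0) * a)
    raise ValueError("nu must be 0.5, 1.5, or 2.5")
```

Smaller ν yields rougher latent trajectories; ν = 2.5 enforces the smoothest dynamics of the three.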

Installation

pip install gp-tempest

Note: PyTorch is listed as a dependency, but pip installs the CPU-only build by default. For GPU support, install a CUDA-enabled torch first:

pip install torch --index-url https://download.pytorch.org/whl/cu118
pip install gp-tempest

From source:

git clone https://github.com/moldyn/GP-TEMPEST.git
cd GP-TEMPEST
pip install -e .

Usage

Command-line interface

Using the fully-connected network variant:

# Generate a default config file
python tempest_main.py --generate_config

# Run with your config
python tempest_main.py --config my_config.yaml

Python API

import numpy as np
import torch
from gptempest import TEMPEST, MaternKernel, load_prepare_data

# dataset, dim_input (features per frame), and N_data (number of frames)
# come from your prepared trajectory, e.g. via load_prepare_data

# Matérn-3/2 temporal prior with GP time scale 10
kernel = MaternKernel(scale=10.0, nu=1.5, dtype=torch.float64)
# 50 inducing points on the normalized time axis for sparse GP inference
inducing_points = np.linspace(0, 1, 50)

model = TEMPEST(
    cuda=False,
    kernel=kernel,
    dim_input=dim_input,
    dim_latent=2,
    layers_hidden_encoder=[128, 64],
    layers_hidden_decoder=[64, 128],
    inducing_points=inducing_points,
    beta=1.0,
    N_data=N_data,
    dtype=torch.float64,
)

# Train
model.train_model(dataset, train_size=1.0, learning_rate=1e-3,
                  weight_decay=1e-5, batch_size=512, n_epochs=100)

# Extract latent space
embedding = model.extract_latent_space(dataset, batch_size=512)
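Once the embedding is extracted (an (N, 2) array for `dim_latent=2`), a free-energy landscape follows from the histogram of the latent coordinates via F = −kT ln p. A minimal sketch (the function name and binning are illustrative, not part of the GP-TEMPEST API):

```python
import numpy as np

def free_energy_2d(z, bins=50, kT=1.0):
    """Free-energy surface F = -kT ln p over a 2D latent embedding z of shape (N, 2).
    Empty bins are assigned F = inf; the minimum of F is shifted to zero."""
    hist, xedges, yedges = np.histogram2d(z[:, 0], z[:, 1], bins=bins, density=True)
    F = np.full_like(hist, np.inf)
    populated = hist > 0
    F[populated] = -kT * np.log(hist[populated])
    F[populated] -= F[populated].min()
    return F, xedges, yedges
```

Plotting `F` (e.g. with `matplotlib.pyplot.pcolormesh`) then reveals metastable basins and the barriers between them in the GP-smoothed latent space.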

Configuration file

GP-TEMPEST is configured via YAML files. Generate a template with --generate_config and adjust the key parameters below; they are discussed in detail in the paper.

Parameter        Description
dim_latent       Dimensionality of the latent space (typically 2)
layers_hidden    Hidden layer sizes for encoder/decoder
kernel_nu        Matérn kernel smoothness (0.5, 1.5, or 2.5)
kernel_scale     Time scale of the GP prior
beta             Weight of the GP regularization term
inducing_points  Path to inducing-point time coordinates
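A configuration using only the parameters above might look like the following; the values and the file name are illustrative, and the authoritative template is the one produced by --generate_config:

```yaml
# Illustrative values only; regenerate the full template with --generate_config
dim_latent: 2
layers_hidden: [128, 64]
kernel_nu: 1.5
kernel_scale: 10.0
beta: 1.0
inducing_points: inducing_times.txt
```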

Citation

If you use GP-TEMPEST in your research, please cite:

@article{diez2025gptempest,
  title   = {Recovering Hidden Degrees of Freedom Using Gaussian Processes},
  author  = {Diez, Georg and Dethloff, Nele and Stock, Gerhard},
  journal = {J. Chem. Phys.},
  volume  = {163},
  pages   = {124105},
  year    = {2025},
  doi     = {10.1063/5.0282147}
}

G. Diez, N. Dethloff, G. Stock, "Recovering Hidden Degrees of Freedom Using Gaussian Processes," J. Chem. Phys. 163, 124105 (2025), https://doi.org/10.1063/5.0282147

License

This project is licensed under the MIT License — see the LICENSE file for details.
