
Continual Resilient (CoRe) optimizer for PyTorch


Introduction

The module core_optimizer provides a PyTorch implementation of the Continual Resilient (CoRe) optimizer. The CoRe optimizer is a first-order gradient-based optimizer for stochastic and deterministic iterative optimizations. It adapts the learning rate of each weight individually, depending on the optimization progress. Its algorithm combines Adam- and RPROP-like step-size updates and employs weight regularization and decay.
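
As background for the RPROP-like part of this combination, the following sketch illustrates generic sign-based step-size adaptation: a weight's step size grows while successive gradients agree in sign and shrinks when they disagree. This is a textbook illustration only, and the function and parameter names are hypothetical; it is not the CoRe update rule, which is described in docs/documentation.pdf.

import torch

def sign_adaptive_step(param, grad, prev_grad, step_size,
                       eta_minus=0.5, eta_plus=1.2,
                       min_step=1e-6, max_step=1e-2):
    # generic RPROP-style step-size adaptation; illustrative only, not the CoRe rule
    agreement = torch.sign(grad * prev_grad)   # +1: same sign, -1: sign flip
    step_size = torch.where(agreement > 0, step_size * eta_plus, step_size)
    step_size = torch.where(agreement < 0, step_size * eta_minus, step_size)
    step_size = step_size.clamp(min_step, max_step)   # keep each step size within bounds
    new_param = param - step_size * torch.sign(grad)   # move against the gradient's sign
    return new_param, step_size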

Installation

The module core_optimizer can be installed using pip once the repository has been cloned.

git clone <core_optimizer-repository>
cd <core_optimizer-repository>
python3 -m pip install .

Users without superuser privileges can install the package in a virtual environment or with the --user flag. If there is no space left on the device for TMPDIR, prefix the python3 command with TMPDIR=<PATH>, where <PATH> is a directory with more space for temporary files.

Usage

The CoRe optimizer is applied in the same way as the optimizers in torch.optim; only the import of the optimizer differs.

from core_optimizer import CoRe

optimizer = CoRe(...)   # dots are placeholders for model parameters and optimizer hyperparameters

For comparison, the following code block shows the usage of the Adam optimizer from torch.optim.

from torch.optim import Adam

optimizer = Adam(...)   # dots are placeholders for model parameters and optimizer hyperparameters
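
Because CoRe follows the torch.optim optimizer interface, switching between the two only changes the constructor call. A minimal sketch, where the helper make_optimizer, the use_core flag, and the Adam learning rate are hypothetical:

import torch
from torch.optim import Adam
from core_optimizer import CoRe

def make_optimizer(model, use_core=True):
    # hypothetical helper: only the constructor call differs between the two optimizers
    if use_core:
        return CoRe(model.parameters(), step_sizes=(1e-6, 1e-2))
    return Adam(model.parameters(), lr=1e-3)

optimizer = make_optimizer(torch.nn.Linear(4, 1))   # any torch.nn.Module works here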

The minimal workflow for training a PyTorch model is shown in the next code block. Ready-to-run examples can be found in test_core_optimizer.py.

import torch
from core_optimizer import CoRe

# set up input and expected outputs for training
inputs = torch.tensor(...)   # dots are placeholders for input data
labels = torch.tensor(...)   # dots are placeholders for expected output data

# define loss function, model, and optimizer
loss_fn = torch.nn.modules.loss.TorchLoss()   # TorchLoss is a placeholder for any Torch loss function
model = TorchModel()   # TorchModel is a placeholder for any Torch model
optimizer = CoRe(model.parameters(), step_sizes=(1e-6, 1e-2))   # define CoRe optimizer

# run training steps
for i in range(...):   # dots are a placeholder for number of steps
    optimizer.zero_grad()   # zero optimizer's gradients
    outputs = model(inputs)   # predict output
    loss = loss_fn(outputs, labels)   # calculate loss
    loss.backward()   # calculate loss gradient
    optimizer.step()   # adjust model parameters
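
As a concrete variant of this workflow, the following self-contained sketch fits a small linear model to toy data; the data, model, and loss here are hypothetical stand-ins chosen only to make the example runnable.

import torch
from core_optimizer import CoRe

# hypothetical toy data: noisy samples of y = 2 x + 1
inputs = torch.linspace(-1.0, 1.0, 64).unsqueeze(1)
labels = 2.0 * inputs + 1.0 + 0.01 * torch.randn_like(inputs)

loss_fn = torch.nn.MSELoss()   # stand-in for any Torch loss function
model = torch.nn.Linear(1, 1)   # stand-in for any Torch model
optimizer = CoRe(model.parameters(), step_sizes=(1e-6, 1e-2))   # define CoRe optimizer

for step in range(200):   # 200 training steps
    optimizer.zero_grad()   # zero optimizer's gradients
    outputs = model(inputs)   # predict output
    loss = loss_fn(outputs, labels)   # calculate loss
    loss.backward()   # calculate loss gradient
    optimizer.step()   # adjust model parameters

print(loss.item())   # final training loss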

The algorithm and all hyperparameters of the CoRe optimizer are explained in the file docs/documentation.pdf.

How to Cite

When publishing results obtained with the CoRe optimizer, please cite M. Eckhoff, M. Reiher, Lifelong Machine Learning Potentials, J. Chem. Theory Comput. 2023, 19, 3509-3525 and M. Eckhoff, M. Reiher, CoRe optimizer: an all-in-one solution for machine learning, Mach. Learn.: Sci. Technol. 2024, 5, 015018.

Support and Contact

In case you encounter any problems or bugs, please write a message to lifelong_ml@phys.chem.ethz.ch.
