Continuous and constrained optimization with PyTorch
pytorCH OPtimize: a library for continuous and constrained optimization built on PyTorch, with applications to adversarial attacks and the training of neural networks.
:warning: This library is in early development; the API may change without notice. The examples will be kept up to date. :warning:
Stochastic Algorithms
We define stochastic optimizers in the `chop.stochastic` module. These follow the PyTorch `Optimizer` conventions, similar to the `torch.optim` module.
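As a dependency-free illustration of the update pattern such constrained stochastic optimizers follow (a gradient step followed by a projection back onto the constraint set), here is a minimal sketch. The helper names `project_l2_ball` and `pgd_step` are hypothetical, chosen for this example; they are not chop's actual API.

```python
# Sketch of a projected stochastic gradient update (illustrative names,
# not chop's API): step along the negative gradient, then project the
# iterate back onto the feasible set.

def project_l2_ball(x, radius=1.0):
    """Project a point (list of floats) onto the L2 ball of the given radius."""
    norm = sum(v * v for v in x) ** 0.5
    if norm <= radius:
        return list(x)
    return [v * radius / norm for v in x]

def pgd_step(x, grad, lr, radius=1.0):
    """One projected gradient step: move along -grad, then project."""
    moved = [xi - lr * gi for xi, gi in zip(x, grad)]
    return project_l2_ball(moved, radius)

# Minimize f(x) = (x0 - 2)^2 + (x1 - 2)^2 subject to ||x||_2 <= 1.
# The constrained minimizer is x = (1/sqrt(2), 1/sqrt(2)).
x = [0.0, 0.0]
for _ in range(100):
    grad = [2 * (x[0] - 2.0), 2 * (x[1] - 2.0)]
    x = pgd_step(x, grad, lr=0.1)
```

In chop, the projection is supplied by a constraint object rather than hard-coded, but the step/project structure of the update is the same idea.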
Full Gradient Algorithms
We also define full-gradient algorithms, which operate on a batch of optimization problems, in the `chop.optim` module. These are used for adversarial attacks through the `chop.Adversary` wrapper.
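To show the full-gradient pattern behind such attacks, here is a minimal, dependency-free sketch of gradient ascent on a loss with the perturbation projected onto an L-infinity ball, the standard constraint in adversarial examples. The names `project_linf`, `attack`, and `grad_fn` are illustrative assumptions, not chop's actual API.

```python
# Sketch of an L-infinity-constrained gradient-ascent attack
# (illustrative names, not chop's API): ascend the loss with respect to
# the input, clipping the perturbation to the eps-ball at each step.

def project_linf(delta, eps):
    """Clip each coordinate of the perturbation to [-eps, eps]."""
    return [max(-eps, min(eps, d)) for d in delta]

def attack(x, grad_fn, eps=0.1, lr=0.05, steps=20):
    """Maximize the loss over perturbations delta with ||delta||_inf <= eps."""
    delta = [0.0] * len(x)
    for _ in range(steps):
        g = grad_fn([xi + di for xi, di in zip(x, delta)])
        delta = [di + lr * gi for di, gi in zip(delta, g)]  # ascend the loss
        delta = project_linf(delta, eps)
    return [xi + di for xi, di in zip(x, delta)]

# Toy "loss" L(z) = sum(z): its gradient is all ones, so the attack
# pushes every coordinate up until the eps constraint binds.
x_adv = attack([0.0, 1.0, -1.0], grad_fn=lambda z: [1.0] * len(z), eps=0.1)
```

With chop, the model's loss gradient would come from autograd and the loop would run on a whole batch of inputs at once; the step-then-project structure is the same.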
Installing
Run the following:

```shell
git clone https://github.com/openopt/chop.git
cd chop
pip install .
```
Welcome to chop!
Examples:
See the examples directory and our webpage.
Tests
Run the tests with `pytest tests`.
Citing
If this software is useful to your research, please consider citing it as:

```bibtex
@article{chop,
  author = {Geoffrey Negiar and Fabian Pedregosa},
  title  = {CHOP: continuous optimization built on PyTorch},
  year   = {2020},
  url    = {http://github.com/openopt/chop}
}
```
Affiliations
Geoffrey Négiar is in the Mahoney lab and the El Ghaoui lab at UC Berkeley.
Fabian Pedregosa is at Google Research.
Hashes for chop_pytorch-0.0.3.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 9cd1febd5892d97dbeb9ae1b0b1fd0e576e98795943c942b9c16c2b2b721bb82
MD5 | 69a6f6f0188dbd4cfc4822b5d88f57d1
BLAKE2b-256 | 3d0ba07bd9b29e3bd317c23d52a0188bc97908670d148ea014cd71324ccc8309