
A library for Lagrangian-based constrained optimization in PyTorch

Project description

Cooper


What is Cooper?

Cooper is a library for solving constrained optimization problems in PyTorch.

Cooper implements several Lagrangian-based (first-order) update schemes that are applicable to a wide range of continuous constrained optimization problems. Cooper is mainly targeted at deep learning applications, where gradients are estimated from mini-batches, but it is also suitable for general continuous constrained optimization tasks.

There exist other libraries for constrained optimization in PyTorch, like CHOP and GeoTorch, but they rely on assumptions about the constraints (such as admitting efficient projection or proximal operators). These assumptions are often not met in modern machine learning problems. Cooper can be applied to a wider range of constrained optimization problems (including non-convex problems) thanks to its Lagrangian-based approach.
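Concretely, for a problem of the form min f(x) subject to g(x) <= 0, the Lagrangian approach trades projections for a min-max game over the Lagrangian. As a sketch (the step sizes eta_x and eta_lambda are generic symbols, not Cooper parameters), the simultaneous gradient descent-ascent updates behind an optimizer like SimultaneousOptimizer read:

\min_{x} \, \max_{\lambda \geq 0} \; \mathcal{L}(x, \lambda) = f(x) + \lambda^{\top} g(x)

x_{t+1} = x_t - \eta_x \, \nabla_x \mathcal{L}(x_t, \lambda_t)
\lambda_{t+1} = \max\{0, \; \lambda_t + \eta_\lambda \, g(x_t)\}

The primal variables descend on the Lagrangian while the multipliers ascend on it, which is why the dual optimizer in the example below is constructed with maximize=True.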

You can check out Cooper's FAQ in the project documentation.

Cooper's companion paper is available on arXiv (arXiv:2504.01212).

Installation

To install the latest release of Cooper, use the following command:

pip install cooper-optim

To install the latest development version, use the following command instead:

pip install git+https://github.com/cooper-org/cooper@main
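
To sanity-check the installation, import the package. Note that the __version__ attribute is assumed here as a common packaging convention, not something documented on this page:

import cooper

# Should print the installed version, e.g., 1.0.1, if the import succeeds
print(cooper.__version__)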

Getting Started

Quick Start

To use Cooper, you need to:

  • Implement a cooper.ConstrainedMinimizationProblem (CMP) subclass whose compute_cmp_state method computes the objective and the constraint violations, packaged in a cooper.CMPState.
  • In the CMP's initialization, create a Constraint object (with its associated Multiplier) for each constraint.
  • Instantiate a torch.optim.Optimizer for the primal variables and a second optimizer with maximize=True for the dual variables (the Lagrange multipliers).
  • Wrap both optimizers in a cooper.optim optimizer such as SimultaneousOptimizer and call its roll method to perform updates, as in the example below.

Example

This is an abstract example of how to solve a constrained optimization problem with Cooper. You can find runnable notebooks with concrete examples in our Tutorials.

import cooper
import torch

# Choose the device for training, e.g., a CUDA-capable GPU if one is available
DEVICE = ...

class MyCMP(cooper.ConstrainedMinimizationProblem):
    def __init__(self):
        super().__init__()
        multiplier = cooper.multipliers.DenseMultiplier(num_constraints=..., device=DEVICE)
        # By default, constraints are built using `formulation_type=cooper.formulations.Lagrangian`
        self.constraint = cooper.Constraint(
            multiplier=multiplier, constraint_type=cooper.ConstraintType.INEQUALITY
        )

    def compute_cmp_state(self, model, inputs, targets):
        inputs, targets = inputs.to(DEVICE), targets.to(DEVICE)
        loss = ...
        # Cooper uses the convention g(x) <= 0 for inequality constraints, so
        # `violation` holds the value of g at the current point
        constraint_state = cooper.ConstraintState(violation=...)
        observed_constraints = {self.constraint: constraint_state}

        return cooper.CMPState(loss=loss, observed_constraints=observed_constraints)


train_loader = ...
NUM_EPOCHS = ...
model = (...).to(DEVICE)
cmp = MyCMP()

primal_optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# Must set `maximize=True` since the Lagrange multipliers solve a _maximization_ problem
dual_optimizer = torch.optim.SGD(cmp.dual_parameters(), lr=1e-2, maximize=True)

cooper_optimizer = cooper.optim.SimultaneousOptimizer(
    cmp=cmp, primal_optimizers=primal_optimizer, dual_optimizers=dual_optimizer
)

for epoch_num in range(NUM_EPOCHS):
    for inputs, targets in train_loader:
        # `roll` is a convenience method that evaluates the CMPState, computes
        # gradients, performs the primal and dual updates, and zeroes the gradients
        compute_cmp_state_kwargs = {"model": model, "inputs": inputs, "targets": targets}
        roll_out = cooper_optimizer.roll(compute_cmp_state_kwargs=compute_cmp_state_kwargs)
        # `roll_out` is a namedtuple containing the loss, last CMPState, and the primal
        # and dual Lagrangian stores, useful for inspection and logging
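
For reference, here is a self-contained toy instance of the pattern above: minimize ||x - x0||^2 subject to sum(x) <= 1, whose constrained optimum is x = (0.5, 0.5). The problem, step sizes, and iteration count are our own illustration, not from the Cooper documentation; the API calls mirror the abstract example, and we assume roll accepts an empty kwargs dict when compute_cmp_state takes no arguments.

import torch
import cooper

# Toy problem (illustrative): minimize ||x - x0||^2 subject to sum(x) - 1 <= 0
x0 = torch.tensor([1.0, 1.0])
x = torch.nn.Parameter(torch.zeros(2))

class ToyCMP(cooper.ConstrainedMinimizationProblem):
    def __init__(self):
        super().__init__()
        multiplier = cooper.multipliers.DenseMultiplier(num_constraints=1)
        self.constraint = cooper.Constraint(
            multiplier=multiplier, constraint_type=cooper.ConstraintType.INEQUALITY
        )

    def compute_cmp_state(self):
        loss = torch.sum((x - x0) ** 2)
        # g(x) = sum(x) - 1; positive values indicate an infeasible point
        violation = (torch.sum(x) - 1.0).reshape(1)
        constraint_state = cooper.ConstraintState(violation=violation)
        return cooper.CMPState(
            loss=loss, observed_constraints={self.constraint: constraint_state}
        )

cmp = ToyCMP()
primal_optimizer = torch.optim.SGD([x], lr=1e-2)
dual_optimizer = torch.optim.SGD(cmp.dual_parameters(), lr=1e-2, maximize=True)
cooper_optimizer = cooper.optim.SimultaneousOptimizer(
    cmp=cmp, primal_optimizers=primal_optimizer, dual_optimizers=dual_optimizer
)

for _ in range(2000):
    cooper_optimizer.roll(compute_cmp_state_kwargs={})

print(x.detach())  # Expected to approach [0.5, 0.5], the constrained optimum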

Contributions

We appreciate all contributions. Please let us know if you encounter a bug by filing an issue.

If you plan to contribute new features, utility functions, or extensions, please first open an issue and discuss the feature with us. To learn more about making a contribution to Cooper, please see our Contribution page.

Papers Using Cooper

Cooper has enabled several papers published at top machine learning conferences: Gallego-Posada et al. (2022); Lachapelle and Lacoste-Julien (2022); Ramirez and Gallego-Posada (2022); Zhu et al. (2023); Hashemizadeh et al. (2024); Sohrabi et al. (2024); Lachapelle et al. (2024); Jang et al. (2024); Navarin et al. (2024); Chung et al. (2024).

Acknowledgements

We thank Manuel Del Verme, Daniel Otero, and Isabel Urrego for useful discussions during the early stages of Cooper.

Many Cooper features arose during the development of several research papers. We would like to thank our co-authors Yoshua Bengio, Juan Elenter, Akram Erraqabi, Golnoosh Farnadi, Ignacio Hounie, Alejandro Ribeiro, Rohan Sukumaran, Motahareh Sohrabi and Tianyue (Helen) Zhang.

License

Cooper is distributed under an MIT license, as found in the LICENSE file.

How to cite Cooper

To cite Cooper, please use the following BibTeX entry:

@article{gallegoPosada2025cooper,
    author={Gallego-Posada, Jose and Ramirez, Juan and Hashemizadeh, Meraj and Lacoste-Julien, Simon},
    title={{Cooper: A Library for Constrained Optimization in Deep Learning}},
    journal={arXiv preprint arXiv:2504.01212},
    year={2025}
}

Download files

Download the file for your platform.

Source Distribution

cooper_optim-1.0.1.tar.gz (253.4 kB)


Built Distribution


cooper_optim-1.0.1-py3-none-any.whl (48.7 kB)


File details

Details for the file cooper_optim-1.0.1.tar.gz.

File metadata

  • Download URL: cooper_optim-1.0.1.tar.gz
  • Size: 253.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for cooper_optim-1.0.1.tar.gz
  • SHA256: cbd6a6415d7fa0298eae9ac10641b6ce690fb96a2d3fe3864e78cf69b1af04c4
  • MD5: f7d682f4ad30cd19dc9496a90a409bde
  • BLAKE2b-256: 57e2a2c7d9b0bb4a48c51028654a1a8b0f55b16fa455bb5a5a68b6c5abfda760

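To verify a downloaded file against the hashes above, you can compute its digest locally. A minimal sketch in Python; the filename and expected value are taken from this page:

import hashlib

EXPECTED_SHA256 = "cbd6a6415d7fa0298eae9ac10641b6ce690fb96a2d3fe3864e78cf69b1af04c4"

# Read the downloaded sdist and compare its SHA256 digest to the published one
with open("cooper_optim-1.0.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "hash mismatch")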

Provenance

The following attestation bundles were made for cooper_optim-1.0.1.tar.gz:

Publisher: release.yaml on cooper-org/cooper

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file cooper_optim-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: cooper_optim-1.0.1-py3-none-any.whl
  • Size: 48.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for cooper_optim-1.0.1-py3-none-any.whl
  • SHA256: 9eae6abf95b2f161868e8b12fbd91796630e7fd062e79932ec0776625f2d892f
  • MD5: f43bcda284d9cbea4b20921ca7285f41
  • BLAKE2b-256: 9997fc482c2a6b3629f065f51df2f96d91ed33c65b4974db8d50c896d9c0e7da


Provenance

The following attestation bundles were made for cooper_optim-1.0.1-py3-none-any.whl:

Publisher: release.yaml on cooper-org/cooper

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
