
Differentiation Through Black-Box Quadratic Programming Solvers

Project description

dQP

Differentiation Through Black-Box Quadratic Programming Solvers [Paper]
Magoon*, Yang*, Aigerman, Kovalsky
Accepted at NeurIPS 2025


Introduction

dQP is a modular framework for differentiating the solution to a quadratic programming problem (QP) with respect to its parameters, enabling the seamless integration of QPs into machine learning architectures and bilevel optimization. dQP supports over 15 state-of-the-art QP solvers by attaching a custom PyTorch layer to the open-source QP interface [qpsolvers]. It solves problems of the form,

minimize (1/2) xᵀ P x + qᵀ x   subject to   C x ≤ d,   A x = b

It is used as in the following example, which pairs the QP solver [OSQP] with the sparse symmetric indefinite linear solver [QDLDL]:

import torch
import numpy as np
from scipy.sparse import csc_matrix

from dqp import dQP
from dqp.sparse_helper import csc_scipy_to_torch

# Define a simple QP: minimize x1^2 + x2^2 s.t. x1 + x2 >= 1, x >= 0
P = csc_matrix(np.array([[2.0, 0.0], [0.0, 2.0]]))
q = np.array([0.0, 0.0])
C = csc_matrix(np.array([[-1.0, -1.0], [-1.0, 0.0], [0.0, -1.0]]))
d = np.array([-1.0, 0.0, 0.0])

# Convert to PyTorch tensors
P_torch = csc_scipy_to_torch(P)
q_torch = torch.tensor(q, dtype=torch.float64, requires_grad=True)
C_torch = csc_scipy_to_torch(C)
d_torch = torch.tensor(d, dtype=torch.float64, requires_grad=True)

# Build settings with OSQP forward solver and qdldl backward solver
settings = dQP.build_settings(
    solve_type="sparse",
    qp_solver="osqp",
    lin_solver="qdldl",
)

# Create layer and solve
layer = dQP.dQP_layer(settings=settings)
x_star, lambda_star, mu_star, _, _ = layer(P_torch, q_torch, C_torch, d_torch)

# Backpropagate (differentiate) through scalar loss L(x^*) = sum(x^*)
x_star.sum().backward()
print(f"Solution: {x_star.detach().numpy()}")  # [0.5, 0.5]
print(f"Gradient w.r.t. d: {d_torch.grad.numpy()}")

The key mathematical structure underlying dQP's modularity is the special polyhedral geometry of a QP. As illustrated below, near its solution a QP is locally equivalent to a purely equality-constrained problem, i.e., both their solutions and derivatives match.

(Figure: a QP and its locally equivalent equality-constrained problem.)
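This equivalence is easy to check by hand on the example QP above, whose solution has a single active constraint, x1 + x2 = 1. The following plain-NumPy sketch (not dQP's internal code) solves the purely equality-constrained counterpart via its KKT system and recovers both the same solution and its sensitivity:

```python
import numpy as np

# Equality-constrained counterpart of the example QP:
#     minimize (1/2) x^T P x + q^T x   subject to   A x = b,
# where A, b hold the single active constraint x1 + x2 = 1.
P = np.array([[2.0, 0.0], [0.0, 2.0]])
q = np.array([0.0, 0.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

# KKT system: [[P, A^T], [A, 0]] @ [x; nu] = [-q; b]
K = np.block([[P, A.T], [A, np.zeros((1, 1))]])
x, nu = np.split(np.linalg.solve(K, np.concatenate([-q, b])), [2])
print(x)  # [0.5 0.5] -- matches the solution of the full QP

# Derivatives come from the same KKT matrix; here dx*/db,
# obtained by solving with the right-hand side [0, 0, 1].
dx_db = np.linalg.solve(K, np.array([0.0, 0.0, 1.0]))[:2]
print(dx_db)  # [0.5 0.5]
```

Since b = -d for the first (active) row of the inequality system, dx*/db here is consistent with the gradient with respect to d printed in the example above.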

Installation

git clone https://github.com/cwmagoon/dQP
cd dQP
pip install -e .

This installs PyTorch, open-source Python interfaces to various QP and linear solvers, and tools for sparsity. Some QP solvers, such as Gurobi, are commercial but offer [academic licenses]. Experiment-specific packages are detailed in the experiments section below.

Options

The snippet above relies on the defaults; dQP exposes the following tunable settings.

| Option | Meaning |
|---|---|
| `qp_solver` | QP solver (any supported by [qpsolvers]). |
| `lin_solver` | Linear solver for the derivative (Algorithm 1). |
| `solve_type` | Dense or sparse (CSC format). |
| `eps_abs` | Absolute tolerance on feasibility, optimality, etc. |
| `eps_rel` | Relative tolerance. |
| `eps_active` | Tolerance for deciding the active set of constraints. |
| `dual_available` | Assert whether the given solver outputs duals; if not, solve for them. |
| `normalize_constraints` | Normalize each row of the constraints, so residuals are distances. |
| `refine_active` | Use heuristic active-set refinement (Fig. 7). |
| `warm_start_from_previous` | Save the previous solution and use it to warm-start (useful in bilevel optimization). |
| `omp_parallel` | Use a simple parallel for-loop for the forward pass over batches. |
| `empty_batch` | Include an empty batch dimension, even for batch size 1. |
| `qp_solver_keywords` | Additional keyword arguments passed to qpsolvers. |
| `verbose` | Display additional information. |
| `time` | Display timings. |
| `check_PSD` | Verify that P is positive semi-definite (dense only); costly. |
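To make the interplay of `normalize_constraints` and `eps_active` concrete, here is a plain-NumPy sketch of the idea (not dQP's internal code) on the example QP, using its known solution:

```python
import numpy as np

C = np.array([[-1.0, -1.0], [-1.0, 0.0], [0.0, -1.0]])
d = np.array([-1.0, 0.0, 0.0])
x_star = np.array([0.5, 0.5])  # solution of the example QP

# normalize_constraints: scale each row of (C, d) to unit norm, so the
# residual of each inequality C x <= d is a signed Euclidean distance
# to its constraint plane (negative = strictly feasible).
norms = np.linalg.norm(C, axis=1, keepdims=True)
C_n, d_n = C / norms, d / norms.ravel()
residuals = C_n @ x_star - d_n
print(residuals)  # [ 0.  -0.5 -0.5] -- only the first constraint is tight

# eps_active: constraints whose residual magnitude falls below the
# tolerance are declared active and define the local equality system.
eps_active = 1e-6
active = np.abs(residuals) < eps_active
print(active)  # [ True False False]
```

Without normalization, the same tolerance would be compared against raw residuals whose scale depends on the row norms of C, which is why normalizing makes `eps_active` easier to interpret.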

Which solver should I choose for my problem? First, we suggest perusing open-source benchmarks and the basic classes of QP solvers. We also include a simple diagnostic tool that iterates through the available QP solvers and times the forward/backward solves on your example QP (Fig. 6).
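The bundled diagnostic tool is the reliable route, but the core idea of its forward-timing pass can be sketched in a few lines. The helper name `time_forward` below is hypothetical, and the sketch assumes the qpsolvers API (`qpsolvers.solve_qp`, `qpsolvers.available_solvers`), with the solve function injected so the timing logic stands alone:

```python
import time

def time_forward(P, q, G, h, solvers, solve_fn):
    """Time one forward QP solve per solver name in `solvers`.

    `solve_fn` is assumed to follow the qpsolvers.solve_qp signature:
    solve_fn(P, q, G, h, solver=name) -> solution array.
    Returns {solver_name: (seconds, solution)}.
    """
    timings = {}
    for name in solvers:
        start = time.perf_counter()
        x = solve_fn(P, q, G, h, solver=name)
        timings[name] = (time.perf_counter() - start, x)
    return timings
```

With qpsolvers installed, this could be driven on your own QP data as, e.g., `time_forward(P, q, G, h, qpsolvers.available_solvers, qpsolvers.solve_qp)`; dQP's diagnostic additionally times the backward solve.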

Development Setup and Experiments

This script sets up the environment we used for development and experiments:

conda create -y --name dQP python=3.9
conda activate dQP

# Core: PyTorch, the scientific stack, and the qpsolvers interface
pip install torch==2.3.0+cpu -f https://download.pytorch.org/whl/torch_stable.html scipy numpy qpsolvers

# QP solvers
pip install clarabel cvxopt daqp ecos gurobipy highspy mosek osqp piqp proxsuite qpalm quadprog scs

# Linear solvers for the backward pass
pip install qdldl pypardiso

# Baseline differentiable QP layers
pip install optnet qpth cvxpylayers proxsuite

# Logging and plotting
pip install matplotlib tensorboard pandas
pip install setproctitle

# Experiment-specific packages
pip install torch_geometric torch_scatter torch_sparse -f https://data.pyg.org/whl/torch-2.3.0+cpu.html
pip install libigl polyscope shapely robust_laplacian torchvision==0.18
conda install -c conda-forge ffmpeg


We provide the code for our experiments in the experiments folder, including additional directions on running them.
