
NonLinear Programming with CasADi

CasADi-NLP (csnlp for short) is a library that provides classes and utilities to model, solve and analyse nonlinear programmes (NLPs), as well as other classes of optimization problems.

Documentation https://casadi-nlp.readthedocs.io/en/latest/
Download https://pypi.python.org/pypi/csnlp/
Source code https://github.com/FilippoAiraldi/casadi-nlp/
Report issues https://github.com/FilippoAiraldi/casadi-nlp/issues/



Features

csnlp builds on top of the CasADi framework [1] to model optimization problems and perform symbolic differentiation, and relies heavily on the IPOPT solver [2] (though the package allows the user to seamlessly switch to other solvers supported by CasADi). While it is similar in functionality to (and was inspired by) CasADi's Opti stack (see this blog post for an example), it is more tailored to research, as

  1. it is more flexible, since it is written in Python and allows the user to easily access all the constituents of the optimization problem (e.g. the objective function, constraints, dual variables, bounds, etc.)

  2. it is more modular, since it allows the base csnlp.Nlp class to be wrapped with additional functionality (e.g., sensitivity, Model Predictive Control, etc.), and it provides parallel implementations for multistart optimization in the csnlp.multistart module.

The package also offers tools for the sensitivity analysis of NLPs, for solving them from multiple initial conditions, and for building MPC controllers. The library is not meant to be a faster alternative to casadi.Opti, but rather a more flexible and modular one for research purposes.


Installation

Using pip

You can use pip to install csnlp with the command

pip install csnlp

This also installs csnlp's required dependencies, most notably CasADi.

Using source code

If you'd like to play around with the source code instead, run

git clone https://github.com/FilippoAiraldi/casadi-nlp.git

The main branch contains the main releases of the package (and the occasional post release), while the experimental branch is reserved for the implementation and testing of new features and hosts the release candidates. You can then install the package in editable mode with

pip install -e /path/to/casadi-nlp
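
For instance, if you would like to try out the release candidates on the experimental branch mentioned above, a minimal sketch (reusing the clone path from the command above) is to check out that branch before the editable install:

cd /path/to/casadi-nlp
git checkout experimental
pip install -e .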

Getting started

Here we provide a compact example of how csnlp can be employed to build and solve an optimization problem. Similar to Opti, we instantiate a class which represents the NLP and allows us to create its variables and parameters and to model its constraints and objective. For example, suppose we'd like to solve the problem

$$ \min_{x,y} \; (1 - x)^2 + 0.2\,(y - x^2)^2 \quad \text{s.t.} \quad \left(\tfrac{p}{2}\right)^2 \le (x + 0.5)^2 + y^2 \le p^2 $$

We can do so with the following code:

from csnlp import Nlp

nlp = Nlp()
x = nlp.variable("x")[0]  # create primal variable x
y = nlp.variable("y")[0]  # create primal variable y
p = nlp.parameter("p")  # create parameter p

# define the objective and constraints
nlp.minimize((1 - x) ** 2 + 0.2 * (y - x**2) ** 2)
g = (x + 0.5) ** 2 + y**2
nlp.constraint("c1", (p / 2) ** 2, "<=", g)
nlp.constraint("c2", g, "<=", p**2)

nlp.init_solver()  # initializes IPOPT under the hood
sol = nlp.solve(pars={"p": 1.25})  # solves the NLP for parameter p=1.25

x_opt = sol.vals["x"]   # optimal values can be retrieved via the dict .vals
y_opt = sol.value(y)  # or the .value method
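
As a hedged sketch, assuming .value also accepts arbitrary symbolic expressions of the decision variables (mirroring what casadi.Opti does), the same method can be used to evaluate, e.g., the constraint expression and the objective at the optimum:

g_opt = sol.value(g)  # assumed: value of (x + 0.5)^2 + y^2 at the solution
f_opt = sol.value((1 - x) ** 2 + 0.2 * (y - x**2) ** 2)  # assumed: optimal objective value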

However, the package also allows you to seamlessly enhance the standard csnlp.Nlp with different capabilities. For instance, when the problem is highly nonlinear and needs to be solved from multiple initial conditions, the csnlp.multistart module offers various solutions to parallelize the computations (see, e.g., csnlp.multistart.ParallelMultistartNlp). The csnlp.wrappers module instead offers a set of wrappers that can be used to augment the NLP with additional capabilities without modifying the original NLP instance. As of now, wrappers have been implemented for

  • sensitivity analysis (see csnlp.wrappers.NlpSensitivity [3])
  • Model Predictive Control (see csnlp.wrappers.Mpc [4] and csnlp.wrappers.ScenarioBasedMpc [5])
  • NLP scaling (see csnlp.wrappers.NlpScaling and csnlp.core.scaling).

For example, if we'd like to compute the sensitivity $\frac{\partial y}{\partial p}$ of the optimal primal variable $y$ with respect to the parameter $p$, we just need to wrap the csnlp.Nlp instance with the csnlp.wrappers.NlpSensitivity wrapper, which is specialized in differentiating the optimization problem. This in turn allows us to compute the first-order sensitivity $\frac{\partial y}{\partial p}$ and the second-order sensitivity $\frac{\partial^2 y}{\partial p^2}$ (dydp and d2ydp2, respectively) as follows:

from csnlp import wrappers

nlp = wrappers.NlpSensitivity(nlp)
dydp, d2ydp2 = nlp.parametric_sensitivity()

In other words, these sensitivities provide the Jacobian and Hessian that locally approximate the solution w.r.t. the parameter $p$. As shown in the corresponding example (but not in this quick demonstration), the sensitivity can also be computed for any generic expression $z(x(p),\lambda(p),p)$ that is a function of the primal variables $x$, the dual variables $\lambda$, and the parameters $p$. Moreover, the sensitivity computations can be carried out symbolically (more demanding) or numerically (more stable and reliable).
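
Concretely, around the solution obtained for a nominal parameter value $p^\star$, these two quantities yield the usual second-order Taylor approximation of the parametric solution (written here for the scalar case of this example):

$$ y(p^\star + \delta p) \approx y(p^\star) + \frac{\partial y}{\partial p}\,\delta p + \frac{1}{2}\,\frac{\partial^2 y}{\partial p^2}\,\delta p^2 $$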

Similarly, a csnlp.Nlp instance can be wrapped in a csnlp.wrappers.Mpc wrapper, which makes it easier to build finite-horizon optimal controllers for model-based control applications.
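
In a nutshell, such controllers repeatedly solve a finite-horizon optimal control problem of the generic form [4]

$$ \min_{u_0,\dots,u_{N-1}} \ \sum_{k=0}^{N-1} \ell(x_k, u_k) + \ell_f(x_N) \quad \text{s.t.} \quad x_{k+1} = f(x_k, u_k), \quad h(x_k, u_k) \le 0, $$

where $x_k$ and $u_k$ are the predicted states and control inputs, $f$ is the system model, $\ell$ and $\ell_f$ are the stage and terminal costs, and $h$ collects the constraints; at each time step, only the first input of the optimal sequence is applied to the system.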


Examples

Our examples subdirectory contains example applications of this package in NLP optimization, sensitivity analysis, scaling of NLPs, and optimal control.


License

The repository is provided under the MIT License. See the LICENSE file included with this repository.


Author

Filippo Airaldi, PhD Candidate [f.airaldi@tudelft.nl | filippoairaldi@gmail.com]

Delft Center for Systems and Control in Delft University of Technology

Copyright (c) 2024 Filippo Airaldi.

Copyright notice: Technische Universiteit Delft hereby disclaims all copyright interest in the program “csnlp” (Nonlinear Programming with CasADi) written by the Author(s). Prof. Dr. Ir. Fred van Keulen, Dean of ME.


References

[1] Andersson, J.A.E., Gillis, J., Horn, G., Rawlings, J.B., and Diehl, M. (2019). CasADi: a software framework for nonlinear optimization and optimal control. Mathematical Programming Computation, 11(1), 1–36.

[2] Wächter, A. and Biegler, L.T. (2006). On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming. Mathematical Programming, 106(1), 25–57.

[3] Büskens, C. and Maurer, H. (2001). Sensitivity analysis and real-time optimization of parametric nonlinear programming problems. In M. Grötschel, S.O. Krumke, and J. Rambau (eds.), Online Optimization of Large Scale Systems, 3–16. Springer, Berlin, Heidelberg.

[4] Rawlings, J.B., Mayne, D.Q., and Diehl, M. (2017). Model Predictive Control: Theory, Computation, and Design (2nd edition). Nob Hill Publishing, Madison, WI.

[5] Schildbach, G., Fagiano, L., Frei, C., and Morari, M. (2014). The scenario approach for stochastic model predictive control with bounds on closed-loop constraint violations. Automatica, 50(12), 3009–3018.
