
Cut-Pursuit Algorithms, Parallelized Along Components

Generic C++ classes for implementing cut-pursuit algorithms.
Specialization to convex problems involving graph total variation, and nonconvex problems involving contour length, as explained in our articles (Landrieu and Obozinski, 2017; Raguet and Landrieu, 2018).
Parallel implementation with OpenMP.
MEX interfaces for GNU Octave or Matlab.
Extension modules for Python.

Table of Contents

  1. General problem statement
  2. C++ classes and Specializations
    2.1. Proximity operator of the graph total variation
    2.2. Quadratic functional and graph total variation
    2.3. Separable multidimensional loss and graph total variation
    2.4. Separable distance and contour length
  3. Documentation
    3.1. Directory tree
    3.2. Graph structure
    3.3. C++ documentation
    3.4. GNU Octave or Matlab
    3.5. Python
  4. References
  5. License

General problem statement

The cut-pursuit algorithms minimize functionals structured, over a weighted graph G = (V, E, w), as

    F: x ∈ Ω^V ↦ f(x) + ∑_{(u,v) ∈ E} w_{(u,v)} ψ(x_u, x_v) ,

where Ω is some base set, and the functional ψ: Ω² → ℝ penalizes dissimilarity between its arguments, in order to enforce solutions which are piecewise constant along the graph G.

The cut-pursuit approach is to seek partitions 𝒱 of the set of vertices V, constituting the constant connected components of the solution, by successively solving the corresponding problem, structured over the reduced graph 𝒢 = (𝒱, ℰ), that is

  arg min_{ξ ∈ Ω^𝒱} F(x) ,    such that ∀ U ∈ 𝒱, ∀ u ∈ U, x_u = ξ_U ,

and then refining the partition.
A key requirement is thus the ability to solve the reduced problem, which often has the exact same structure as the original one, but with far fewer vertices, |𝒱| ≪ |V|. If the solution of the original problem has only a few constant connected components in comparison to the number of vertices, the cut-pursuit strategy can speed up minimization by several orders of magnitude.
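As an illustration of the reduced problem's structure (a sketch, not the library's API): in the graph total variation case, the weight of a reduced edge between two components is simply the sum of the weights of the original edges crossing between them, since edges internal to a component contribute nothing to the penalty.

```python
from collections import defaultdict

def reduced_graph(edges, partition):
    """Build the reduced graph over components: the weight between two
    components is the total weight of original edges crossing them."""
    comp_of = {u: i for i, U in enumerate(partition) for u in U}
    red = defaultdict(float)
    for (u, v), w in edges.items():
        cu, cv = comp_of[u], comp_of[v]
        if cu != cv:
            red[tuple(sorted((cu, cv)))] += w
    return dict(red)

# 6 vertices in a chain 0-1-2-3-4-5, unit weights, partition {0,1,2} | {3,4,5}
edges = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0, (3, 4): 1.0, (4, 5): 1.0}
partition = [{0, 1, 2}, {3, 4, 5}]
print(reduced_graph(edges, partition))  # {(0, 1): 1.0}
```

Here six vertices reduce to two, and only the single crossing edge (2, 3) survives in the reduced graph.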

Cut-pursuit algorithms come in two main flavors, namely “directionally differentiable” and “noncontinuous”.

  • In the directionally differentiable case, the base set Ω is typically a vector space, and it is required that f is differentiable, or at least that its nondifferentiable part is separable along the graph and admits (potentially infinite) directional derivatives. This comprises notably many convex problems, where ψ(x_u, x_v) = ║x_u − x_v║, that is to say involving a graph total variation. The refinement of the partition is based on the search for a steep directional derivative, and the reduced problem is solved using convex or continuous optimization; optimality guarantees can be provided.

  • In the noncontinuous case, the dissimilarity penalization typically uses ψ(x_u, x_v) = 0 if x_u = x_v, 1 otherwise, resulting in a measure of the contour length of the constant connected components. The functional f is typically required to be separable along the graph, and to have computational properties favorable enough for solving reduced problems. The refinement of the partition relies on greedy heuristics.

Both flavors admit multidimensional extensions, that is to say Ω is not required to be only a set of scalars.

C++ classes and Specializations

The class Cp_graph is a modification of the Graph class of Y. Boykov and V. Kolmogorov, for making use of their maximum flow algorithm.
The class Cp is the most generic, defining all steps of the cut-pursuit approach in virtual methods.
The class Cp_d1 specializes methods for directionally differentiable cases involving the graph total variation.
The class Cp_d0 specializes methods for noncontinuous cases involving the contour length penalization.

Cp_prox_tv: proximity operator of the graph total variation

Also coined “graph total variation denoising” or “general fused LASSO signal approximation”. The base set is Ω = ℝ, and the objective functional is

    F: x ∈ ℝ^V ↦ 1/2 ║y − x║² + ∑_{(u,v) ∈ E} w_{(u,v)} |x_u − x_v| ,

where y ∈ ℝ^V is the observed signal and w ∈ ℝ^E are regularization weights.
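To make the objective concrete, here is a minimal evaluation of F on a toy chain graph (hypothetical helper, not the library's interface):

```python
def tv_prox_objective(y, x, edges):
    """Evaluate F(x) = 1/2 ║y − x║² + ∑_(u,v) w_(u,v) |x_u − x_v|."""
    fidelity = 0.5 * sum((yv - xv) ** 2 for yv, xv in zip(y, x))
    tv = sum(w * abs(x[u] - x[v]) for (u, v), w in edges.items())
    return fidelity + tv

y = [0.0, 0.1, 1.0, 1.1]
edges = {(0, 1): 0.5, (1, 2): 0.5, (2, 3): 0.5}
# a piecewise-constant candidate grouping vertices {0, 1} and {2, 3}:
# fidelity 0.5 * 0.01, total variation paid only on the crossing edge (1, 2)
x = [0.05, 0.05, 1.05, 1.05]
print(tv_prox_objective(y, x, edges))
```

Only the edge between the two constant groups contributes to the total variation term, which is exactly what drives the solution toward few components.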

Cp_d1_ql1b: quadratic functional, ℓ1 norm, bounds, and graph total variation

The base set is Ω = ℝ, and the general form is

    F: x ∈ ℝ^V ↦ 1/2 ║y^(q) − A x║² + ∑_{v ∈ V} λ_v |y^(ℓ1)_v − x_v| + ∑_{v ∈ V} ι_{[m_v, M_v]}(x_v)
                 + ∑_{(u,v) ∈ E} w_{(u,v)} |x_u − x_v| ,

where y^(q) ∈ ℝ^n, A: ℝ^V → ℝ^n is a linear operator, y^(ℓ1) ∈ ℝ^V, λ ∈ ℝ^V and w ∈ ℝ^E are regularization weights, m, M ∈ ℝ^V are parameters, and ι_{[a,b]} is the convex indicator of [a, b]: x ↦ 0 if x ∈ [a, b], +∞ otherwise.

When y^(ℓ1) is zero, the combination of ℓ1 norm and total variation is sometimes coined fused LASSO.

When A is the identity, λ is zero and there are no box constraints, the problem boils down to the proximity operator of the graph total variation.
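The full objective can likewise be evaluated directly; the sketch below (hypothetical names, not the library's interface) makes the role of each term explicit, returning +∞ when a box constraint is violated:

```python
def ql1b_objective(yq, A, x, yl1, lam, lo, hi, edges):
    """Evaluate F(x) = 1/2 ║y^(q) − A x║² + ∑_v λ_v |y^(ℓ1)_v − x_v|
    + box indicator of [m_v, M_v] + ∑_(u,v) w_(u,v) |x_u − x_v|."""
    if any(xv < m or xv > M for xv, m, M in zip(x, lo, hi)):
        return float("inf")  # a box constraint is violated
    Ax = [sum(a * xv for a, xv in zip(row, x)) for row in A]
    quad = 0.5 * sum((yv - av) ** 2 for yv, av in zip(yq, Ax))
    l1 = sum(l * abs(yv - xv) for l, yv, xv in zip(lam, yl1, x))
    tv = sum(w * abs(x[u] - x[v]) for (u, v), w in edges.items())
    return quad + l1 + tv

# identity A, two vertices joined by one edge
A = [[1.0, 0.0], [0.0, 1.0]]
val = ql1b_objective(yq=[1.0, 0.0], A=A, x=[0.5, 0.5], yl1=[0.0, 0.0],
                     lam=[0.1, 0.1], lo=[0.0, 0.0], hi=[1.0, 1.0],
                     edges={(0, 1): 0.2})
print(val)  # quad 0.25 + ℓ1 0.1 + tv 0 ≈ 0.35
```

With A the identity, zero λ and infinite bounds, this reduces to the proximity operator objective above, as noted in the text.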

Currently, A must be provided as a matrix. See the documentation for special cases.

The reduced problem is solved using the preconditioned forward-Douglas–Rachford splitting algorithm (see also the corresponding repository).

Two examples where A is a full ill-conditioned matrix are provided with GNU Octave or Matlab and Python interfaces: one with positivity and fused LASSO constraints on a task of brain source identification from electroencephalography, and another with boundary constraints on a task of image reconstruction from tomography.

[figure: ground truth | raw retrieved activity | identified sources]

Specialization Cp_d1_lsx: separable loss, simplex constraints, and graph total variation

The base set is Ω = ℝ^D, where D can be seen as a set of labels, and the general form is

    F: x ∈ ℝ^{V×D} ↦ f(y, x) + ∑_{v ∈ V} ι_{Δ_D}(x_v) + ∑_{(u,v) ∈ E} w^{(d1)}_{(u,v)} ∑_{d ∈ D} λ_d |x_{u,d} − x_{v,d}| ,

where y ∈ ℝ^{V×D}, f is a loss functional (see below), w^{(d1)} ∈ ℝ^E and λ ∈ ℝ^D are regularization weights, and ι_{Δ_D} is the convex indicator of the simplex Δ_D = {x ∈ ℝ^D | ∑_d x_d = 1 and ∀ d, x_d ≥ 0}: x ↦ 0 if x ∈ Δ_D, +∞ otherwise.

The following loss functionals are available, where w^(f) ∈ ℝ^V are weights on vertices.
Linear: f(y, x) = − ∑_{v ∈ V} w^(f)_v ∑_{d ∈ D} x_{v,d} y_{v,d}
Quadratic: f(y, x) = ∑_{v ∈ V} w^(f)_v ∑_{d ∈ D} (x_{v,d} − y_{v,d})²
Smoothed Kullback–Leibler divergence (equivalent to cross-entropy):
f(y, x) = ∑_{v ∈ V} w^(f)_v KL(α u + (1 − α) y_v, α u + (1 − α) x_v),
where α ∈ ]0,1[, u ∈ Δ_D is the uniform discrete distribution over D, and KL: (p, q) ↦ ∑_{d ∈ D} p_d log(p_d/q_d).
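A minimal evaluation of the smoothed Kullback–Leibler loss as defined above (hypothetical helper, not the library's interface); note how the smoothing with the uniform distribution keeps the logarithm well defined even when a distribution has zero entries:

```python
import math

def smoothed_kl_loss(y, x, wf, alpha):
    """∑_v w^(f)_v KL(α u + (1 − α) y_v, α u + (1 − α) x_v),
    with u the uniform distribution over the D labels."""
    D = len(y[0])
    u = 1.0 / D
    total = 0.0
    for w, yv, xv in zip(wf, y, x):
        for yd, xd in zip(yv, xv):
            p = alpha * u + (1 - alpha) * yd
            q = alpha * u + (1 - alpha) * xd
            total += w * p * math.log(p / q)
    return total

# two vertices, D = 2 labels; the second vertex already matches its observation,
# so only the first vertex contributes to the loss
y = [[1.0, 0.0], [0.2, 0.8]]
x = [[0.9, 0.1], [0.2, 0.8]]
print(smoothed_kl_loss(y, x, wf=[1.0, 1.0], alpha=0.1))
```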

The reduced problem is also solved using the preconditioned forward-Douglas–Rachford splitting algorithm (see also the corresponding repository).

An example with the smoothed Kullback–Leibler is provided with GNU Octave or Matlab and Python interfaces, on a task of spatial regularization of semantic classification of a 3D point cloud.

[figure: ground truth | random forest classifier | regularized classification]

Specialization Cp_d0_dist: separable distance and weighted contour length

The base set is Ω = ℝ^D or Δ_D, and the general form is

    F: x ∈ ℝ^{V×D} ↦ f(y, x) + ∑_{(u,v) ∈ E} w^{(d0)}_{(u,v)} ║x_u − x_v║_0 ,

where y ∈ Ω^V, f is a loss functional akin to a distance (see below), and ║·║_0 is the ℓ0 pseudo-norm: x ↦ 0 if x = 0, 1 otherwise.

The following loss functionals are available, where w^(f) ∈ ℝ^V are weights on vertices and m^(f) ∈ ℝ^D are weights on coordinates.
Weighted quadratic: Ω = ℝ^D and f(y, x) = ∑_{v ∈ V} w^(f)_v ∑_{d ∈ D} m^(f)_d (x_{v,d} − y_{v,d})²
Weighted smoothed Kullback–Leibler divergence (equivalent to cross-entropy): Ω = Δ_D and
f(y, x) = ∑_{v ∈ V} w^(f)_v KL_{m^(f)}(α u + (1 − α) y_v, α u + (1 − α) x_v),
where α ∈ ]0,1[, u ∈ Δ_D is the uniform discrete distribution over D, and
KL_{m^(f)}: (p, q) ↦ ∑_{d ∈ D} m^(f)_d p_d log(p_d/q_d).
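The contour-length penalty itself is straightforward to evaluate; a sketch (hypothetical helper, not the library's interface):

```python
def contour_length_penalty(x, edges):
    """∑_(u,v) w^(d0)_(u,v) ║x_u − x_v║_0: an edge contributes its weight
    exactly when its endpoints carry different values."""
    return sum(w for (u, v), w in edges.items() if x[u] != x[v])

# three vertices with 2-dimensional values; only edge (1, 2) crosses a boundary
x = [(1.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
edges = {(0, 1): 0.3, (1, 2): 0.7}
print(contour_length_penalty(x, edges))  # 0.7
```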

The reduced problem amounts to averaging, and the split step uses the k-means++ algorithm.
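For reference, k-means++ seeding can be sketched as follows (a generic textbook version in Python, not the library's C++ implementation):

```python
import random

def kmeans_pp_seeds(points, k, rng):
    """k-means++ seeding: the first center is uniform among the points; each
    further center is drawn with probability proportional to its squared
    distance to the nearest center chosen so far."""
    centers = [rng.choice(points)]
    while len(centers) < k:
        d2 = [min(sum((p - c) ** 2 for p, c in zip(pt, ct)) for ct in centers)
              for pt in points]
        r = rng.uniform(0, sum(d2))
        acc = 0.0
        for pt, d in zip(points, d2):
            acc += d
            if acc >= r:  # cumulative weight crosses the random threshold
                centers.append(pt)
                break
    return centers

# two well-separated 1D clusters: seeding tends to pick one point from each
pts = [(0.0,), (0.1,), (5.0,), (5.1,)]
print(kmeans_pp_seeds(pts, 2, random.Random(0)))
```

Biasing the seeds toward spread-out points makes the split step more likely to separate genuinely distinct value clusters within a component.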

When the loss is quadratic, the resulting problem is sometimes coined “minimal partition problem”.

An example with the smoothed Kullback–Leibler is provided with GNU Octave or Matlab interface, on a task of spatial regularization of semantic classification of a 3D point cloud.

Documentation

Graph structure

Graph structures must be given in forward-star representation. For conversion from a simple adjacency-list representation, or for creation from scratch for regular N-dimensional grids (2D for images, 3D for volumes, etc.), see my grid graph repository.
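For illustration, converting an edge list to a forward-star layout (an array of per-vertex offsets plus a flat array of edge targets) can be sketched as below; the exact array names and conventions expected by the interfaces may differ, so treat this as an assumption and refer to the grid graph repository:

```python
def to_forward_star(n_vertices, edge_list):
    """Convert an edge list to forward-star representation: the outgoing
    edges of vertex v are adj_vertices[first_edge[v]:first_edge[v + 1]]."""
    count = [0] * (n_vertices + 1)
    for u, _ in edge_list:            # count outgoing edges per source vertex
        count[u + 1] += 1
    for v in range(n_vertices):       # prefix sums give the offsets
        count[v + 1] += count[v]
    first_edge = list(count)
    adj_vertices = [0] * len(edge_list)
    cursor = list(first_edge)
    for u, v in edge_list:            # scatter targets into their slots
        adj_vertices[cursor[u]] = v
        cursor[u] += 1
    return first_edge, adj_vertices

# chain 0 -> 1 -> 2
print(to_forward_star(3, [(0, 1), (1, 2)]))  # ([0, 1, 2, 2], [1, 2])
```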

Directory tree

.   
├── data/         various data for illustration
├── include/      C++ headers, with some doc  
├── octave/       GNU Octave or Matlab code  
│   ├── doc/      some documentation  
│   └── mex/      MEX C++ interfaces
├── python/       Python code  
│   ├── cpython/  C Python interfaces  
│   └── wrappers/ python wrappers and documentation  
└── src/          C++ sources  

C++ documentation

Requires C++11.
Be sure to have OpenMP enabled with your compiler to enjoy parallelization. Note that, as of 2020, MSVC still does not support OpenMP 3.0 (published in 2008); consider switching to a decent compiler.

The number of parallel threads used in parallel regions is crucial for good performance; it is roughly controlled by the macro MIN_OPS_PER_THREAD, which can be set with the usual -D compilation flag. A rule of thumb is to set it to 10000 on personal computers with a handful of cores, and up to 100000 for large computer clusters with tens of cores.
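A hypothetical compilation line showing both flags together (the source file name is illustrative, not taken from the repository):

```shell
# -fopenmp enables OpenMP parallelization;
# -DMIN_OPS_PER_THREAD tunes the per-thread work granularity
g++ -std=c++11 -O3 -fopenmp -DMIN_OPS_PER_THREAD=10000 \
    -Iinclude -c src/cut_pursuit.cpp -o cut_pursuit.o
```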

The C++ classes are documented within the corresponding headers in include/.

GNU Octave or Matlab

See the script compile_mex.m for typical compilation commands; it can be run directly from the GNU Octave interpreter, but Matlab users must set compilation flags directly on the command line CXXFLAGS = ... and LDFLAGS = ....

Extensive documentation of the MEX interfaces can be found within dedicated .m files in octave/doc/.

The script example_EEG.m exemplifies the use of Cp_d1_ql1b, on a task of brain source identification from electroencephalography.

The script example_tomography.m exemplifies the use of Cp_d1_ql1b, on a task of image reconstruction from tomography.

The scripts example_labeling_3D.m and example_labeling_3D_d0.m exemplify the use of, respectively, Cp_d1_lsx and Cp_d0_dist, on a task of spatial regularization of semantic classification of a 3D point cloud.

Python

Requires numpy package.
See the script setup.py for compiling the modules with setuptools; it can be run simply with pip, e.g. python -m pip install .. Precompiled binaries for Windows and Linux will soon be available on PyPI. If more than 65535 components are expected in your graph, you can force the use of 32-bit component indices by setting the CP_NPY_COMP_32 environment variable to 1, e.g. export CP_NPY_COMP_32=1 in bash.

Extensive documentation of the Python wrappers can be found in the corresponding .py files.
The scripts are mostly written for Python 3, and should work with Python 2 with minor tweaking.

The script example_EEG.py exemplifies the use of Cp_d1_ql1b, on a task of brain source identification from electroencephalography.

The script example_tomography.py exemplifies the use of Cp_d1_ql1b, on a task of image reconstruction from tomography.

The scripts example_labeling_3D.py and example_labeling_3D_d0.py exemplify the use of, respectively, Cp_d1_lsx and Cp_d0_dist, on a task of spatial regularization of semantic classification of a 3D point cloud.

References

L. Landrieu and G. Obozinski, Cut Pursuit: Fast Algorithms to Learn Piecewise Constant Functions on Weighted Graphs, 2017.

H. Raguet and L. Landrieu, Cut-pursuit Algorithm for Regularizing Nonsmooth Functionals with Graph Total Variation, 2018.

Y. Boykov and V. Kolmogorov, An Experimental Comparison of Min-Cut/Max-Flow Algorithms for Energy Minimization in Vision, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004.

License

This software is under the GPLv3 license.
