freq-statespace

Frequency-domain (non)linear state-space identification using JAX.

A flexible JAX-based package for nonlinear state-space identification using frequency-domain optimization techniques. The focus is on the nonlinear Linear Fractional Representation (NL-LFR) model structure, which combines an LTI system with a static feedback nonlinearity. This internal-feedback formulation captures many complex real-world dynamics, and other popular block-oriented structures such as Wiener, Hammerstein, and Wiener-Hammerstein models arise as special cases. Identification of standard linear state-space models is also supported.

Basic usage

The package works with (multiple periods and realizations of) input–output data sequences $u(n)$ and $y(n)$ for $n = 0, \ldots, N-1$, assuming periodic excitation and an integer number of steady-state output periods. The NL-LFR model structure is defined as:

  \begin{align*}
    x(n+1) &= A x(n) + B_u u(n) + B_w w(n),\\
    y(n) &= C_y x(n) + D_{yu} u(n) + D_{yw} w(n),\\
    z(n) &= C_z x(n) + D_{zu} u(n),\\ 
    w(n) &= f\big(z(n)\big),
  \end{align*}

consisting of linear state-space matrices and a static nonlinear function approximator $f(\cdot)$.
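To make the structure concrete, the following minimal JAX sketch simulates these equations for a given input sequence. It is illustrative only: the matrix names follow the equations above, but the package's own simulation code may differ.

from jax import lax


def simulate_nllfr(A, Bu, Bw, Cy, Dyu, Dyw, Cz, Dzu, f, u, x0):
    """Simulate the NL-LFR recursion for an input u of shape (N, nu)."""

    def step(x, u_n):
        z = Cz @ x + Dzu @ u_n            # z(n) = Cz x(n) + Dzu u(n)
        w = f(z)                          # w(n) = f(z(n))
        y = Cy @ x + Dyu @ u_n + Dyw @ w  # output equation
        x_next = A @ x + Bu @ u_n + Bw @ w
        return x_next, y

    _, y = lax.scan(step, x0, u)
    return y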

A typical step-wise identification procedure is as follows:

  1. Best Linear Approximation (BLA) parametrization. Initializes the matrices $A$, $B_u$, $C_y$ and $D_{yu}$ using the frequency-domain subspace method, and refines these estimates through iterative optimization. If you're only interested in linear state-space models, you can stop the identification process here.
  2. NL-LFR initialization. Applies the frequency-domain inference and learning method to efficiently initialize the remaining model parameters while keeping the BLA parameters fixed. This step requires that $f(\cdot)$ is linear in the parameters, i.e., $f(\cdot)=\beta^\top\phi(\cdot)$, with $\beta$ the parameter vector and $\phi(\cdot)$ the nonlinear feature mapping (e.g., polynomial features); a small sketch of this form follows the list.
  3. NL-LFR optimization. Performs iterative refinement of all model parameters using time-domain simulations. This is the most computationally demanding step, mainly due to the sequential nature of the forward simulations. Fortunately, the previous steps should have provided an initialization that is already close to a good local minimum.
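To illustrate the linear-in-the-parameters requirement of step 2, here is a hedged sketch of a degree-3 polynomial feature map for scalar $z$ (the quick example below uses the package's own fss.static.basis.Polynomial for this purpose):

import jax.numpy as jnp

def phi(z):
    # Polynomial features up to degree 3 (scalar z; no constant term in this sketch).
    return jnp.array([z, z**2, z**3])

def f(z, beta):
    # f(z) = beta^T phi(z): linear in beta, nonlinear in z.
    return beta @ phi(z)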

It is also possible to skip the inference and learning step and go straight to nonlinear optimization. An advantage of this approach is that it puts no restriction on the structure of $f(\cdot)$, i.e., it does not require a model that is linear in its parameters.

Features

  • Provides a user-friendly interface for identifying linear state-space models using frequency-domain subspace estimation based on the nonparametric BLA.
  • Offers two workflows for identifying nonlinear LFR state-space models, primarily exploiting a frequency-domain formulation that enables inherent parallelism.
  • Uses JAX for automatic differentiation, JIT compilation, and GPU/TPU acceleration.
  • Supports Optimistix solvers (Levenberg-Marquardt, BFGS, ...) for typical system identification problems.
  • Supports Optax optimizers (Adam, SGD, ...) for large-scale optimization (see the sketch after this list).
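For reference, the solver and optimizer objects themselves are constructed with the standard Optimistix and Optax APIs; how they are handed to the freq-statespace routines is package-specific, so the commented call below uses a purely hypothetical keyword:

import optax
import optimistix as optx

# Standard constructors from the respective libraries:
solver = optx.LevenbergMarquardt(rtol=1e-8, atol=1e-8)
optimizer = optax.adam(learning_rate=1e-3)

# Hypothetical illustration only -- consult the package docs for the
# actual way to select a solver:
# nllfr = fss.nonlin.optimize(nllfr, data, solver=solver)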

Installation

Requires Python 3.10 or newer:

pip install freq-statespace

If JAX isn't already installed in your environment, the above command will install the CPU-only version. For GPU/TPU support (strongly recommended, often many times faster for mid-size to large problems), follow the JAX installation guide.
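For example, on Linux with a recent NVIDIA GPU, the JAX documentation suggests a command along these lines (check the guide for your platform and CUDA version):

pip install -U "jax[cuda12]"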

Quick example

We walk through an example training pipeline on the Silverbox benchmark dataset, which contains input-output measurements from an electronic circuit that mimics a mass-spring-damper system with a cubic spring nonlinearity.

We first estimate the BLA:

import freq_statespace as fss

data = fss.load_and_preprocess_silverbox_data()  # 8192 x 6 samples

# Step 1: BLA estimation
nx = 2  # state dimension
bla = fss.lin.subspace_id(data, nx)  # NRMSE 18.36%, non-iterative
bla = fss.lin.optimize(bla, data)  # NRMSE 13.17%, 6 iters, 1.32ms/iter

Next, we proceed with inference and learning, followed by full nonlinear optimization:

# Step 2: Inference and learning
nw, nz = 1, 1  # internal signal dimensions
phi = fss.static.basis.Polynomial(nz, degree=3)
nllfr = fss.nonlin.inference_and_learning(bla, data, phi, nw)  # NRMSE 1.11%, 45 iters, 18.4ms/iter

# Step 3: Nonlinear optimization
nllfr = fss.nonlin.optimize(nllfr, data)  # NRMSE 0.44%, 100 iters, 387ms/iter

Alternatively, we could skip inference and learning and jump straight to nonlinear optimization. In this example we use a neural network:

import jax

# Step 2: Nonlinear optimization
nw, nz = 1, 1  # internal signal dimensions
neural_net = fss.static.NeuralNetwork(nw, nz, layers=1, neurons_per_layer=10, activation=jax.nn.relu)
nllfr = fss.nonlin.connect(bla, neural_net)
nllfr = fss.nonlin.optimize(nllfr, data)  # NRMSE 0.55%, 100 iters, 354ms/iter

Note: Iteration timings were measured on an NVIDIA T600 Laptop GPU.

Models can be serialized and loaded back as follows:

path = "models/nllfr.zip"
fss.save_model(nllfr, path)
nllfr_loaded = fss.load_model(path)

The examples/ folder also provides Jupyter notebooks for more challenging benchmark systems, with additional notes on hyperparameter tuning and solver configurations.

Preparing your data

Every identification problem starts by casting the time-domain input-output data into the required format and supplying minimal frequency metadata (the excited frequencies and the sampling frequency); the helper function fss.create_data_object(...) is provided for this purpose. Once the data object is instantiated, the workflow proceeds exactly as in the quick example above.
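As a rough sketch of what this looks like (the keyword names passed to fss.create_data_object are hypothetical placeholders, not the documented signature; consult the package's API reference):

import numpy as np
import freq_statespace as fss

# Synthetic stand-ins for measured time-domain data:
# N samples per period, R realizations (shapes are illustrative).
N, R = 8192, 6
u = np.random.randn(N, R)
y = np.random.randn(N, R)
k_exc = np.arange(1, 1000)  # excited frequency bins (example)
fs = 610.35                 # sampling frequency in Hz (example value)

# NOTE: the keyword names below are hypothetical placeholders.
data = fss.create_data_object(u, y, excited_harmonics=k_exc, fs=fs)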

Citation

If you use this code in your work, please cite it as follows:

@article{floren2025inference,
  title={Inference and Learning of Nonlinear LFR State-Space Models},
  author={Floren, Merijn and No{\"e}l, Jean-Philippe and Swevers, Jan},
  journal={IEEE Control Systems Letters},
  year={2025},
  publisher={IEEE}
}
