
statespace models



reference problems from:

Bayesian Signal Processing: Classical, Modern, and Particle Filtering Methods, James V. Candy

Kalman Filtering: Theory and Practice, Mohinder S. Grewal, Angus P. Andrews

Stochastic Processes and Filtering Theory, Andrew H. Jazwinski

210308 looking more closely at the oregon library documentation, they have an interesting discussion of objectives - something to think about going forward. is a higher-level statespace framework something to consider?

designed to allow reuse of a system state space definition for state, parameter and joint estimation, using a variety of different inference algorithms. In other words, you define your system once in a standard general state space framework, and then the inference framework generator geninfds together with the inference system noise source generator gensysnoiseds will adapt/remap that model into the relevant state space framework needed for whatever type of estimation you want to do. This allows you to focus on defining the model, doing data IO, etc. without having to get bogged down in casting the problem into a different framework each time you want to use a different estimator or want to change the type of inference you're doing. I.e. the internal inference implementation is hidden or as transparent as possible with respect to the problem definition by the user.
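the "define the model once, estimate many ways" idea above can be sketched in numpy terms. a minimal, hypothetical example - StateSpaceModel and ekf_step are illustrative names, not this library's (or ReBEL's) API:

```python
import numpy as np

class StateSpaceModel:
    """One system definition: transition f, observation h, noise covariances Q, R."""
    def __init__(self, f, h, Q, R):
        self.f, self.h, self.Q, self.R = f, h, Q, R

def ekf_step(model, x, P, z, F, H):
    """One predict/update cycle of a linear(ized) Kalman filter on the shared model.
    F, H are the transition and observation Jacobians at the current estimate."""
    x = model.f(x)                                  # predict state
    P = F @ P @ F.T + model.Q                       # predict covariance
    y = z - model.h(x)                              # innovation
    S = H @ P @ H.T + model.R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                  # gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P
```

the same StateSpaceModel instance could then be handed to a ukf or pf step without re-deriving the system, which is the reuse the quoted passage is after.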

210221 brought the documentation via readthedocs up to a minimal level. cleaned up the project and brought some focus to what's going on here. as the docs now make clear - this project focuses on reference problems from Bayesian Signal Processing: Classical, Modern, and Particle Filtering Methods, Kalman Filtering: Theory and Practice, and Stochastic Processes and Filtering Theory - in particular, using numpy for matrix and vector manipulation.

190418 brief lit-review posted on linkedin.

190331 concise motivation piece posted on linkedin.

190310 decision-function-based detector is go. simplest possible case - linear rc-circuit system-model and linear kalman-filter tracker. log-likelihood decision function for detection, ensembles of 100 runs each for signal case and noise case. output curves shown in the first plot - green signal, blue noise-only. roc curves in the second plot.
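a minimal sketch of that detection pipeline, assuming a gaussian innovation sequence from the kalman tracker and a threshold sweep over the two score ensembles - innovation_log_likelihood and roc_curve are illustrative names, not this project's API:

```python
import numpy as np

def innovation_log_likelihood(innovations, variances):
    """Gaussian log-likelihood of a run's Kalman innovation sequence."""
    y, S = np.asarray(innovations), np.asarray(variances)
    return -0.5 * np.sum(y**2 / S + np.log(2.0 * np.pi * S))

def roc_curve(signal_scores, noise_scores):
    """Pd vs Pfa by sweeping a threshold (high to low) over both ensembles."""
    thresholds = np.sort(np.concatenate([signal_scores, noise_scores]))[::-1]
    pd = np.array([(signal_scores >= t).mean() for t in thresholds])
    pfa = np.array([(noise_scores >= t).mean() for t in thresholds])
    return pfa, pd
```

each of the 100 signal-case and 100 noise-case runs would contribute one score, and the roc curve traces (pfa, pd) as the decision threshold drops.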

190223 kl-divergence for evaluating sequential monte-carlo - demonstrated below by three pf's in action during the first second of the jazwinski problem - start-up and convergence. these are 100 hz dist-curves - each dist-curve is a kernel-density-estimate combining hundreds of monte-carlo samples, the fundamental particles - green dist-curves for truth, blue dist-curves for pf. state-estimates are two red curves on the x,t-plane beneath the dist-curves.

pf1

pf2

pf3
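the kde-based kl-divergence evaluation above can be sketched as follows, under the assumption that truth and pf clouds are compared as 1-d sample sets via gaussian kernel-density estimates on a shared grid (the project's exact implementation may differ):

```python
import numpy as np

def kde(samples, grid):
    """Gaussian kernel-density estimate, Silverman's rule-of-thumb bandwidth."""
    samples = np.asarray(samples, dtype=float)
    h = 1.06 * samples.std() * len(samples) ** (-0.2)
    z = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(samples) * h * np.sqrt(2.0 * np.pi))

def kl_divergence(truth, pf, npts=512):
    """Approximate KL(truth || pf) for 1-d sample clouds at one time step."""
    grid = np.linspace(min(truth.min(), pf.min()), max(truth.max(), pf.max()), npts)
    p, q = kde(truth, grid), kde(pf, grid)
    mask = p > 1e-12                                  # avoid log(0)
    dx = grid[1] - grid[0]
    return float(np.sum(p[mask] * np.log(p[mask] / np.maximum(q[mask], 1e-12))) * dx)
```

evaluating this at each 10 ms step would give a kl-divergence time series per pf, quantifying the start-up/convergence behavior the dist-curves show visually.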

190105 ukf adaptive jazwinski switched to square-root filtering: qr-factorization, cholesky-factor update and downdate. the improvement in numerical stability and scaled sampling is clear. still a question around the scalar-obs case - the obs cholesky-factor and gain. with an ad-hoc stabilizer on the obs cholesky-factor it's working well overall.
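the cholesky-factor update/downdate at the heart of that square-root step can be sketched with the standard rank-1 recurrence - a textbook form, not necessarily this project's exact routine:

```python
import numpy as np

def cholupdate(L, v, sign=+1.0):
    """Rank-1 update (sign=+1) or downdate (sign=-1) of lower-triangular L,
    returning R with R @ R.T = L @ L.T + sign * outer(v, v)."""
    L, v = L.copy(), v.astype(float).copy()
    n = len(v)
    for k in range(n):
        r = np.sqrt(L[k, k]**2 + sign * v[k]**2)
        c, s = r / L[k, k], v[k] / L[k, k]
        L[k, k] = r
        if k + 1 < n:
            L[k+1:, k] = (L[k+1:, k] + sign * s * v[k+1:]) / c
            v[k+1:] = c * v[k+1:] - s * L[k+1:, k]
    return L
```

the downdate (sign=-1) is the numerically delicate half - the argument of the square root can go negative in ill-conditioned cases, which is where a stabilizer like the one mentioned above earns its keep.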

181230 pf adaptive jazwinski. parameter-roughening.
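parameter-roughening in the Gordon-Salmond-Smith sense can be sketched as jitter added to the parameter particles after resampling, scaled by each parameter's sample spread - constants here are illustrative, not this project's values:

```python
import numpy as np

def roughen(particles, K=0.2, rng=None):
    """Jitter an (N, d) particle array with sigma = K * E * N**(-1/d),
    where E is the per-dimension sample extent (max - min)."""
    rng = rng or np.random.default_rng()
    N, d = particles.shape
    E = particles.max(axis=0) - particles.min(axis=0)
    sigma = K * E * N ** (-1.0 / d)
    return particles + rng.normal(0.0, sigma, size=(N, d))
```

the point for joint state/parameter estimation is to keep the parameter particles from collapsing to a few values after repeated resampling.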

181226 ukf adaptive jazwinski. sample-and-propagate tuning.
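the scaled sigma-point sample-and-propagate step behind the adaptive ukf can be sketched in the standard Julier-Uhlmann scaled form - tuning constants are illustrative, not the project's:

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """2n+1 scaled sigma points for mean x, covariance P,
    with mean weights wm and covariance weights wc."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)           # scaled square root of P
    pts = np.vstack([x, x + S.T, x - S.T])          # rows are sigma points
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + 1.0 - alpha**2 + beta
    return pts, wm, wc
```

"sample-and-propagate tuning" then amounts to choosing alpha, beta, kappa so the propagated points stay well-scaled around the mean - the knob the 190105 square-root switch made easier to see.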

180910 ekf adaptive jazwinski. ud-factorized square-root filtering required for numerical stability.
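the u-d factorization underpinning that square-root ekf can be sketched as P = U diag(D) U.T with U unit upper-triangular - Bierman's form, in which the filter propagates U and D instead of P; a sketch only, not this project's routine:

```python
import numpy as np

def ud_factorize(P):
    """Factor symmetric positive-definite P as U @ diag(D) @ U.T,
    U unit upper-triangular, working from the last column backwards."""
    n = P.shape[0]
    U = np.eye(n)
    D = np.zeros(n)
    P = P.astype(float).copy()
    for j in range(n - 1, -1, -1):
        D[j] = P[j, j]
        if D[j] > 0:
            U[:j, j] = P[:j, j] / D[j]
        P[:j, :j] -= D[j] * np.outer(U[:j, j], U[:j, j])
    return U, D
```

because D carries the scale and U stays unit-triangular, the factored form avoids the loss of positive-definiteness that motivates the "required for numerical stability" note above.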

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

statespace-1.3.40.tar.gz (10.5 kB view hashes)

Uploaded Source
