
The generalized, nodalized HGF for predictive coding.


The multilevel, generalized and nodalized Hierarchical Gaussian Filter for predictive coding

pyhgf is a Python library implementing the generalized, nodalized and multilevel Hierarchical Gaussian Filter for predictive coding, written on top of JAX. The library can create and manipulate graph neural networks that perform belief updating through the diffusion of precision-weighted prediction errors under new observations. The core functions are differentiable and JIT-able, and are designed to interface smoothly with other libraries in the JAX ecosystem for neural networks, reinforcement learning, or Bayesian inference.

Getting started

Installation

The latest official release can be installed from PyPI:

pip install pyhgf

The current development version can be installed from the master branch of the GitHub repository:

pip install "git+https://github.com/ilabcode/pyhgf.git"

How does it work?

The nodalized Hierarchical Gaussian Filter consists of a network of hierarchically structured probabilistic nodes, where each node can inherit its value and volatility sufficient statistics from other parent nodes. The presentation of a new observation at the lowest level of the hierarchy (i.e. the input node) triggers a recursive update of the nodes' beliefs through the bottom-up propagation of precision-weighted prediction errors.
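As a minimal illustration of a single precision-weighted belief update (a sketch of the standard Gaussian filtering step, not pyhgf's actual implementation), consider one node holding a Gaussian belief that receives a noisy observation:

```python
# Minimal sketch of one precision-weighted belief update for a single
# Gaussian node (illustration only, not pyhgf's internal code).

def update_belief(mu, pi, observation, pi_obs):
    """Update a Gaussian belief (mean mu, precision pi) after observing
    a value carrying observation precision pi_obs."""
    delta = observation - mu                    # prediction error
    pi_post = pi + pi_obs                       # precisions add under a Gaussian update
    mu_post = mu + (pi_obs / pi_post) * delta   # precision-weighted correction
    return mu_post, pi_post

mu, pi = 0.0, 1.0
mu, pi = update_belief(mu, pi, observation=1.0, pi_obs=1.0)
# mu is now 0.5 and pi is 2.0: the belief moved halfway toward the
# observation because prior and observation were equally precise.
```

In the full network, the same logic repeats recursively: each node's prediction error, weighted by the relevant precisions, is passed up to its parents.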

More generally, pyhgf operates on graph neural networks that can be defined and updated through the following variables:

  • The node attributes (dictionary), which store each node's parameters (value, precision, learning rates, volatility coupling, ...).
  • The edges (tuple), which list, for each node, the indexes of its value and volatility parents.
  • A set of update functions that operate on any of the three other variables, starting from a target node.
  • An update sequence (tuple), which defines the order in which the update functions are called and their target nodes.
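The four variables above can be sketched with plain Python containers. The field names and the toy update function below are illustrative assumptions, not pyhgf's exact schema:

```python
# Illustrative encoding of a two-node network where node 0 has node 1 as
# its value parent. Field names are hypothetical, not pyhgf's exact API.

attributes = {
    0: {"mean": 0.0, "precision": 1.0, "tonic_volatility": -3.0},
    1: {"mean": 0.5, "precision": 1.0, "tonic_volatility": -3.0},
}

# For each node: (value parent indexes, volatility parent indexes)
edges = (
    ((1,), ()),   # node 0 inherits its value from node 1
    ((), ()),     # node 1 has no parents
)

def prediction_error(attributes, edges, node_idx):
    """Toy update function: propagate a node's prediction error to its
    value parents, scaled by the parents' precision."""
    node = attributes[node_idx]
    for parent_idx in edges[node_idx][0]:
        parent = attributes[parent_idx]
        delta = node["mean"] - parent["mean"]
        parent["mean"] += delta / (1.0 + parent["precision"])
    return attributes

# The update sequence pairs each target node with the function applied to it
update_sequence = ((0, prediction_error),)

for node_idx, fn in update_sequence:
    attributes = fn(attributes, edges, node_idx)
```

Because the attributes, edges, and update sequence are ordinary data, the same machinery can describe arbitrary network shapes, not just the standard HGF hierarchies.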


Value parents and volatility parents are nodes themselves. Any node can be a value and/or volatility parent for other nodes, and can have multiple value and/or volatility parents. A filtering structure consists of nodes hierarchically embedding other nodes. Nodes are parametrized by their sufficient statistics and their parents. The transformations between nodes can be linear, non-linear, or any arbitrary function (thus a generalization of the HGF).

The resulting probabilistic network operates as a filter over new observations. If a decision function (taking the whole model as a parameter) is also defined, behaviors can be triggered accordingly. By comparing these behaviors with actual outcomes, a surprise function can be optimized over the parameters of interest.
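One common choice of decision function in this kind of model is a sigmoid over the current belief, whose surprise is summed over trials. The sketch below is a generic illustration under that assumption, not pyhgf's built-in response model:

```python
import math

def sigmoid_decision(belief, temperature=1.0):
    """Map a real-valued belief to a choice probability (toy decision function)."""
    return 1.0 / (1.0 + math.exp(-belief / temperature))

def response_surprise(beliefs, responses, temperature=1.0):
    """Surprise (-log p) of observed binary responses under the decision model."""
    total = 0.0
    for belief, response in zip(beliefs, responses):
        p = sigmoid_decision(belief, temperature)
        total += -math.log(p if response == 1 else 1.0 - p)
    return total

# Lower surprise means the model's beliefs explain the behavior better;
# an optimizer can minimize it over temperature or the filter's parameters.
s = response_surprise([0.3, -0.2, 1.1], [1, 0, 1])
```

Because the whole pipeline is written in JAX, this kind of surprise function can in principle be differentiated end-to-end with respect to the model parameters.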

The Hierarchical Gaussian Filter

The Hierarchical Gaussian Filter for binary and continuous inputs, as described in Mathys et al. (2011, 2014) and later implemented in the Matlab Tapas toolbox (Frässle et al., 2021), can be seen as a special case of this node structure:

(Figure 2)

The pyhgf package includes pre-implemented standard HGF models that can be used together with other neural network libraries or Bayesian inference tools. Users can also build custom network structures to match specific needs.

Model fitting

Here we demonstrate how to fit a two-level binary Hierarchical Gaussian Filter. The input time series are the binary outcomes from Iglesias et al. (2013).

from pyhgf.model import HGF
from pyhgf import load_data

# Load time series example data
u, _ = load_data("binary")

# This is where we define all the model parameters - You can control the value of
# different variables at different levels using the corresponding dictionary.
hgf = HGF(
    n_levels=2,
    model_type="binary",
    initial_mean={"1": .0, "2": .5},
    initial_precision={"1": .0, "2": 1e4},
    tonic_volatility={"2": -3.0},
)

# add new observations
hgf.input_data(input_data=u)

# compute the model's surprise (-log(p))
surprise = hgf.surprise()
print(f"Model's surprise = {surprise}")

# visualization of the belief trajectories
hgf.plot_trajectories()

Creating a binary Hierarchical Gaussian Filter with 2 levels.
Add 320 new binary observations.
Model's surprise = 203.29249572753906
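The reported surprise is simply the sum of -log(p) over the 320 observations, where p is the probability the model assigned to each outcome before seeing it. The helper below is illustrative arithmetic, not pyhgf internals:

```python
import math

def binary_surprise(observations, predictions):
    """Sum of -log p assigned to each observed binary outcome."""
    return sum(
        -math.log(p if o == 1 else 1.0 - p)
        for o, p in zip(observations, predictions)
    )

# A model that always predicts 0.5 accumulates log(2) nats per trial:
obs = [1, 0, 1, 1]
print(binary_surprise(obs, [0.5] * len(obs)))  # 4 * log(2), about 2.77
```

A fitted HGF is useful to the extent that its trial-by-trial predictions beat such a constant-probability baseline, i.e. achieve a lower total surprise.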


Acknowledgements

This implementation of the Hierarchical Gaussian Filter was largely inspired by the original Matlab version. A Julia implementation of the generalized, nodalized and multilevel HGF is also available.

References

  1. Mathys, C. (2011). A Bayesian foundation for individual learning under uncertainty. In Frontiers in Human Neuroscience (Vol. 5). Frontiers Media SA. https://doi.org/10.3389/fnhum.2011.00039
  2. Mathys, C. D., Lomakina, E. I., Daunizeau, J., Iglesias, S., Brodersen, K. H., Friston, K. J., & Stephan, K. E. (2014). Uncertainty in perception and the hierarchical Gaussian filter. Frontiers in Human Neuroscience, 8. https://doi.org/10.3389/fnhum.2014.00825
  3. Powers, A. R., Mathys, C., & Corlett, P. R. (2017). Pavlovian conditioning-induced hallucinations result from overweighting of perceptual priors. Science (New York, N.Y.), 357(6351), 596–600. https://doi.org/10.1126/science.aan3458
  4. Frässle, S., Aponte, E. A., Bollmann, S., Brodersen, K. H., Do, C. T., Harrison, O. K., Harrison, S. J., Heinzle, J., Iglesias, S., Kasper, L., Lomakina, E. I., Mathys, C., Müller-Schrader, M., Pereira, I., Petzschner, F. H., Raman, S., Schöbi, D., Toussaint, B., Weber, L. A., … Stephan, K. E. (2021). TAPAS: An Open-Source Software Package for Translational Neuromodeling and Computational Psychiatry. In Frontiers in Psychiatry (Vol. 12). Frontiers Media SA. https://doi.org/10.3389/fpsyt.2021.680811
