HGF Toolbox v4.0

Version 4.0 of the HGF Toolbox has been released.

The HGF Toolbox implements many variants of the hierarchical Gaussian filter (HGF), as well as other models used in time-series modeling, such as hidden Markov models, hierarchical hidden Markov models, and Rescorla-Wagner learning.

The main highlights of this release are

- the new PDF documentation,

- the new interactive demo, and

- the greater ease of configuration.

Configuration is now easier because the parameter theta of the HGF models has been replaced by its log-transformed equivalent, omega_n, where n is the number of levels in the model. This parameter is estimated in native space, so upper bounds no longer need to be set. Similarly, all kappas are now estimated in log-space (instead of logit-space), so they too no longer require upper bounds.
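As an illustration, the relevant excerpt of a model's configuration file (e.g., hgf_binary_config.m) now looks roughly as follows. This is a minimal sketch: the field names follow the toolbox's config conventions, and the prior values shown are placeholders, not recommendations.

    % Priors on the omegas, estimated in native space
    % (no upper bounds needed)
    c.ommu = [NaN, -3, -6];
    c.omsa = [NaN, 4^2, 4^2];

    % Priors on the kappas, now estimated in log-space
    % (a prior variance of 0 fixes the parameter)
    c.logkamu = [log(1), log(1)];
    c.logkasa = [0, 0];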

Many additional improvements have been made behind the scenes, and many models have been added, most importantly hidden Markov models (HMMs) and hierarchical HMMs. The full release notes are below.

The HGF is a generic Bayesian hierarchical model for inference on a changing environment based on sequential input. This makes it a general model of learning in discrete time. It was introduced in

Mathys C, Daunizeau J, Friston KJ, Stephan KE (2011). A Bayesian foundation for individual learning under uncertainty. Frontiers in Human Neuroscience, 5:39. doi:10.3389/fnhum.2011.00039

and is explained in more detail in

Mathys C, Lomakina EI, Daunizeau J, Iglesias S, Brodersen KH, Friston KJ, Stephan KE (2014). Uncertainty in perception and the Hierarchical Gaussian Filter. Frontiers in Human Neuroscience, 8:825. doi:10.3389/fnhum.2014.00825
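In a nutshell, and using the notation of these papers, each level i of the hierarchy updates its posterior mean in proportion to a precision-weighted prediction error:

    \Delta\mu_i \propto \frac{\hat{\pi}_{i-1}}{\pi_i} \, \delta_{i-1}

where \pi_i is the precision of the posterior at level i, \hat{\pi}_{i-1} the precision of the prediction at the level below, and \delta_{i-1} the prediction error at the level below.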

After downloading, unzip the toolbox and read the Manual.pdf file.
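For a first fit, follow the pattern walked through in the manual and in hgf_demo. A minimal sketch, assuming binary inputs u and responses y are already in the workspace; the config file names below follow the naming pattern used in these release notes, so check the manual for the exact names in your installation:

    % Fit a binary HGF perceptual model combined with a
    % unit-square sigmoid decision model
    est = fitModel(y, u, 'hgf_binary_config', 'unitsq_sgm_config', 'quasinewton_optim_config');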


Release Notes
-------------
- Added PDF manual
- Added interactive demo in hgf_demo
- Added file of raw commands from hgf_demo in hgf_demo_commands
- Adapted fitModel to calculate AIC and BIC
- Renamed F (negative variational free energy) to LME (log-model evidence, to
which it is an approximation)
- Added calculation of accuracy and complexity to fitModel
- Saved everything relating to model quality under r.optim (see the sketch
after these notes)
- Improved output of fitModel
- Added hierarchical hidden Markov model (hhmm)
- Added hidden Markov model (hmm)
- Added WhatWorld (hgf_whatworld) model
- Added linear log-reaction time (logrt_linear_whatworld) model for WhatWorld
- Added WhichWorld (hgf_whichworld) model
- Added AR(1) model for binary outcomes (hgf_ar1_binary)
- Added Jumping Gaussian Estimation (hgf_jget) model
- Added unitsq_sgm_mu3 decision model
- Added binary multi-armed bandit model hgf_binary_mab
- Added beta_obs observation model for decision noise on the unit interval
- Added softmax decision model with a different inverse temperature for each
kind of binary decision (softmax_2beta)
- Added logrt_linear_binary decision model
- Added Rescorla-Wagner model with a different learning rate for each kind of
binary outcome (rw_binary_dual)
- Included additional trajectories in output of hgf, hgf_ar1, hgf_ar1_mab,
hgf_binary, hgf_ar1_binary, hgf_binary_mab, hgf_whichworld, and
hgf_whatworld
- Made infStates more consistent across models
- Removed deprecated hgf_binary3l
- Made fitModel explicitly return negative log-joint probability and negative
log-likelihood
- Modified simModel to read configuration files of perceptual and observation
models
- Abolished theta in hgf, hgf_binary, hgf_ar1, hgf_ar1_mab, hgf_ar1_binary,
hgf_binary_mab, and hgf_jget
- Moved kappa estimation from logit-space to log-space for hgf, hgf_binary,
hgf_ar1, hgf_ar1_mab, hgf_ar1_binary, hgf_binary_mab, and hgf_jget
- Introduced checking for implausible jumps in trajectories for hgf,
hgf_binary, hgf_ar1, hgf_ar1_mab, hgf_ar1_binary, hgf_binary_mab, and
hgf_jget
- Adapted fitModel to deal with cases where the prc_model_transp() function
performs operations important to the prc_model() function
- Introduced multinomial softmax decision model
- Improved documentation for hgf_ar1_mab model
- Added error IDs for all errors
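
As referenced in the notes above, everything relating to model quality is now collected under r.optim in the structure returned by fitModel. A minimal sketch of how to inspect it, continuing from a fit like the one sketched earlier; the LME, AIC, and BIC names are given in the notes, while accu and comp are assumed names for the accuracy and complexity terms:

    est = fitModel(y, u, 'hgf_binary_config', 'unitsq_sgm_config', 'quasinewton_optim_config');

    est.optim.LME   % log-model evidence (approximated by the negative free energy)
    est.optim.AIC   % Akaike information criterion
    est.optim.BIC   % Bayesian information criterion
    est.optim.accu  % accuracy term of the LME (assumed field name)
    est.optim.comp  % complexity term of the LME (assumed field name)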