HGF for sensor fusion
I'm looking for an example of an HGF that combines two first-level filters into a single node at the higher level, i.e. sensor fusion. For example, we have GPS data and accelerometer data arriving at the same time, and each can give an estimate of position. Is there a way to use the HGF to model this?
Another question: if the state is multidimensional and we are using a regular, single-branch standard HGF, then in the first layer we are estimating a covariance matrix P. How does the second layer couple into this P? Can it distinguish and act on each element of this precision or covariance matrix, or does it scale everything uniformly?
Hey @navidivan ! Nice to meet you :)
My apologies for the slow response - I hope you haven't been waiting too long.
Sensor fusion is certainly possible with the generalized HGF, yes. In the simplest case, you just treat the different inputs as independent observations of the same underlying state. We have an example of this in the generalized HGF preprint:
Here, additionally, there is noise learning for each sensor, so the model balances the two inputs against each other according to their noise levels. Of course, you might need to preprocess the GPS and accelerometer data for them to be modelable as direct Gaussian-noise observations of the position - I'm not familiar enough with the data types to judge this off the bat. This model - or variations of it - would be easy and immediate to implement.
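To make the idea concrete, here is a minimal toy sketch (plain NumPy, not the package API, and with fixed rather than learned sensor noise): two independent noisy observations of the same random-walking position are fused by precision-weighting, which is the same update form the state node uses when both inputs attach to it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a latent position following a Gaussian random walk
n = 200
walk_sd = 0.1
position = np.cumsum(rng.normal(0.0, walk_sd, n))

# Two sensors observing the same position with different noise levels
# (hypothetical stand-ins for preprocessed GPS and accelerometer data)
gps = position + rng.normal(0.0, 1.0, n)
accel = position + rng.normal(0.0, 0.3, n)

pi_gps = 1.0 / 1.0**2    # GPS observation precision
pi_acc = 1.0 / 0.3**2    # accelerometer observation precision

mu, pi = 0.0, 1.0        # prior mean and precision of the state
estimates = []
for y_gps, y_acc in zip(gps, accel):
    # Prediction step: random-walk drift inflates the variance
    pi_pred = 1.0 / (1.0 / pi + walk_sd**2)
    # Update step: precision-weighted average of prediction and both sensors
    pi = pi_pred + pi_gps + pi_acc
    mu = (pi_pred * mu + pi_gps * y_gps + pi_acc * y_acc) / pi
    estimates.append(mu)

estimates = np.array(estimates)
```

The fused track should beat either sensor alone, since each observation is weighted by its precision; adding noise learning on top (as in the preprint example) just makes those precisions themselves adaptive.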
In the above example, we are not estimating a covariance matrix. That is possible, however: probably the most out-of-the-box solution would be to filter the sufficient statistics of an exponential family distribution (which include the mean and the covariance matrix). The Python sister package pyhgf, led by @LegrandNico, has the most developed documentation for this: https://computationalpsychiatry.github.io/pyhgf/notebooks/0.3-Generalised_filtering.html#generalised-filtering Would this be what you mean? It is not implemented in the Julia package yet, but it shouldn't be too difficult to add. You are also welcome to have a go at implementing it yourself. Or you can use the Python package, of course.
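For intuition about what filtering sufficient statistics means (this is a bare-bones sketch of the idea, not pyhgf's implementation): for a multivariate Gaussian the sufficient statistics are x and x xᵀ, so tracking exponentially weighted averages of those two quantities gives you a running mean vector and covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
dim, n, lr = 2, 500, 0.05   # dimensionality, number of samples, learning rate

# Hypothetical ground truth to filter against
true_mean = np.array([1.0, -2.0])
true_cov = np.array([[1.0, 0.6],
                     [0.6, 2.0]])
xs = rng.multivariate_normal(true_mean, true_cov, n)

# Exponentially weighted estimates of the sufficient statistics
s1 = np.zeros(dim)   # running estimate of E[x]
s2 = np.eye(dim)     # running estimate of E[x x^T]
for x in xs:
    s1 = (1 - lr) * s1 + lr * x
    s2 = (1 - lr) * s2 + lr * np.outer(x, x)

# Recover the distribution parameters from the sufficient statistics
mean_hat = s1
cov_hat = s2 - np.outer(s1, s1)   # Cov[x] = E[x x^T] - E[x] E[x]^T
```

Because each element of s2 is tracked separately, the covariance structure (including the off-diagonal terms) is learned element-wise rather than scaled uniformly, which speaks to your second question.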
Finally, there is work on a multivariate HGF, where each node is a multivariate Gaussian random walk. The math should be ready, but it isn't implemented anywhere yet - I'm also unsure whether it would differ much in practice from the above, except that the distribution being learnt would be a random walk instead of a normal Gaussian with random-walking parameters. If you are interested, I can find the mathematics for this for you.
I hope this helps! Let me know if there is anything else I can do :)