
Add support for mixture distributions

Open hidmic opened this issue 5 months ago • 3 comments

Feature description

Precisely what the title says. A form of soft clustering, where the filter works with a distribution of distributions (or clusters), with provisions to split and merge distributions. This is immediately useful to some filters (like MH-AMCL), it is relevant to earlier discussions we've had about clustering (#172), and (if properly thought through) it can pave the way for filters based on GMMs.

Implementation considerations

It'd be good to think of the cluster distributions as potentially parametric.
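One way to read "potentially parametric" is that mixture-level code should only depend on a few summary queries (mean, variance, weight), so a cluster can be backed by either a particle set or a closed-form density. A minimal sketch of that idea, using hypothetical names (`ClusterDistribution`, `ParticleCluster`, `GaussianCluster`) and a plain 1D state to stay self-contained:

```cpp
#include <cassert>
#include <cmath>
#include <numeric>
#include <vector>

// Hypothetical interface: any cluster distribution, parametric or not,
// exposes the summary statistics the mixture-level logic needs.
struct ClusterDistribution {
  virtual ~ClusterDistribution() = default;
  virtual double mean() const = 0;      // 1D state for brevity
  virtual double variance() const = 0;
  virtual double weight() const = 0;
};

// Non-parametric cluster: backed by weighted samples.
struct ParticleCluster : ClusterDistribution {
  std::vector<double> states;
  std::vector<double> weights;
  double mean() const override {
    const double wsum = std::accumulate(weights.begin(), weights.end(), 0.0);
    double m = 0.0;
    for (std::size_t i = 0; i < states.size(); ++i) m += weights[i] * states[i];
    return m / wsum;
  }
  double variance() const override {
    const double wsum = std::accumulate(weights.begin(), weights.end(), 0.0);
    const double m = mean();
    double v = 0.0;
    for (std::size_t i = 0; i < states.size(); ++i)
      v += weights[i] * (states[i] - m) * (states[i] - m);
    return v / wsum;
  }
  double weight() const override {
    return std::accumulate(weights.begin(), weights.end(), 0.0);
  }
};

// Parametric cluster: a Gaussian component, as a GMM-based filter would use.
struct GaussianCluster : ClusterDistribution {
  double mu, sigma2, w;
  GaussianCluster(double mu_, double sigma2_, double w_)
      : mu(mu_), sigma2(sigma2_), w(w_) {}
  double mean() const override { return mu; }
  double variance() const override { return sigma2; }
  double weight() const override { return w; }
};
```

Split/merge logic written against the interface would then work unchanged regardless of which representation backs each cluster.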

hidmic avatar Sep 23 '25 12:09 hidmic

FYI @JesusSilvaUtrera

hidmic avatar Sep 23 '25 12:09 hidmic

@hidmic I have been thinking about which components and steps we would need to accomplish this, as it's not trivial. To start the discussion, I will share some ideas here:

  • First, create a BaseHypothesis abstract interface for a single distribution (a cluster). The contents of this interface are not 100% clear to me, since I am not sure whether it should be treated like a ParticlesDistribution directly or whether we need a more generic approach. We would also need a generic way to compute the "quality" of each hypothesis (I use "quality" here because it's the name used in MH-AMCL, not because I mean to use that exact metric).
  • This BaseHypothesis could have a concrete default implementation, ParticleHypothesis, to be used for MH-AMCL for example (the same way we want to do with BaseMapMatcher).
  • Then, we would need something like a MultiHypothesesModel (probably not the best name 😅): a container class holding the collection of hypotheses and orchestrating their update process.
  • We would also need to add the logic from the manage_hypotheses method somewhere. In my opinion this can be done with a range action in Beluga's core. This would be responsible for splitting, merging, pruning, and selecting the best hypothesis.
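To make the division of responsibilities in the bullets above concrete, here is a rough sketch of how the pieces could fit together. The names (BaseHypothesis, ParticleHypothesis, MultiHypothesesModel, quality()) come from the proposal and are hypothetical; the state and update step are placeholders (a plain double and a toy distance-based score) so the sketch is self-contained:

```cpp
#include <algorithm>
#include <cassert>
#include <memory>
#include <vector>

// Hypothetical interface for a single hypothesis (one cluster).
struct BaseHypothesis {
  virtual ~BaseHypothesis() = default;
  virtual void update(double measurement) = 0;  // placeholder update step
  virtual double quality() const = 0;           // "quality" as in MH-AMCL
};

// Default concrete implementation backed by particles.
struct ParticleHypothesis : BaseHypothesis {
  std::vector<double> particles;  // placeholder 1D states
  double score = 0.0;
  void update(double measurement) override {
    // Toy update: quality decays with the distance of the particle mean
    // from the measurement. A real filter would importance-weight here.
    double mean = 0.0;
    for (double p : particles) mean += p;
    mean /= particles.empty() ? 1.0 : static_cast<double>(particles.size());
    const double d = mean - measurement;
    score = 1.0 / (1.0 + d * d);
  }
  double quality() const override { return score; }
};

// Container orchestrating the per-hypothesis updates, plus management.
class MultiHypothesesModel {
 public:
  void add(std::unique_ptr<BaseHypothesis> h) {
    hypotheses_.push_back(std::move(h));
  }
  void update_all(double measurement) {
    for (auto& h : hypotheses_) h->update(measurement);
  }
  // Keep only hypotheses above a quality threshold (the pruning half of
  // the manage_hypotheses logic; splitting/merging would slot in here too).
  void prune(double min_quality) {
    hypotheses_.erase(
        std::remove_if(hypotheses_.begin(), hypotheses_.end(),
                       [&](const auto& h) { return h->quality() < min_quality; }),
        hypotheses_.end());
  }
  std::size_t size() const { return hypotheses_.size(); }
  const BaseHypothesis* best() const {
    auto it = std::max_element(
        hypotheses_.begin(), hypotheses_.end(),
        [](const auto& a, const auto& b) { return a->quality() < b->quality(); });
    return it == hypotheses_.end() ? nullptr : it->get();
  }
 private:
  std::vector<std::unique_ptr<BaseHypothesis>> hypotheses_;
};
```

Whether the management step lives inside the container or as a standalone range action (as suggested above) is an open design choice; the sketch only shows the ownership and orchestration boundaries.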

Let me know what you think about this, thanks!

JesusSilvaUtrera avatar Oct 15 '25 12:10 JesusSilvaUtrera

@JesusSilvaUtrera this one's a bit trickier. We use a generic container of particles to describe a non-parametric distribution, so the dumb extension for a weighted collection of non-parametric distributions would be a generic container of weighted containers, like:

#include <vector>
#include <sophus/se2.hpp>

struct Particle {
  Sophus::SE2d state;
  double weight;
};

struct Distribution {  // or another name
  std::vector<Particle> particles;
  double weight;  // or quality
};

The interesting thing about this is that we can code operations like merging and pruning against a generic distribution concept, so that it doesn't matter what state you are tracking underneath (and in the limit, it doesn't matter how you represent your distribution either, but that's a stretch for now, ignore me).
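To illustrate that state-agnostic angle: the structs above can be templated on the state type, and merge/prune operations written once for any instantiation. The function names (merge_into, prune_by_weight) and the merge policy (concatenate particles, sum weights) are illustrative assumptions, not a committed design:

```cpp
#include <algorithm>
#include <cassert>
#include <iterator>
#include <vector>

template <class State>
struct Particle {
  State state;
  double weight;
};

template <class State>
struct Distribution {  // or another name
  std::vector<Particle<State>> particles;
  double weight;  // or quality
};

// Merge source into target: concatenate particle sets and sum weights.
// Nothing here inspects the state type, so it works for Sophus::SE2d
// or any other state.
template <class State>
void merge_into(Distribution<State>& target, Distribution<State>&& source) {
  target.particles.insert(target.particles.end(),
                          std::make_move_iterator(source.particles.begin()),
                          std::make_move_iterator(source.particles.end()));
  target.weight += source.weight;
}

// Drop distributions whose weight falls below a threshold.
template <class State>
void prune_by_weight(std::vector<Distribution<State>>& mixture,
                     double min_weight) {
  mixture.erase(std::remove_if(mixture.begin(), mixture.end(),
                               [&](const auto& d) { return d.weight < min_weight; }),
                mixture.end());
}
```

A real merge would also need a similarity test to decide *which* distributions to merge (and likely a resampling step to cap the combined particle count), but the point stands: none of that needs to know the state type.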

hidmic avatar Oct 15 '25 17:10 hidmic