Memory leak issue with InferenceEngine for Bayesian network
My current code structure is to create a static InferenceEngine instance. Then, in a for-loop, I create several variables with Variable.Bernoulli(n).Named("xxx") as priors, set soft evidence with Variable.ConstrainEqualRandom, and finally run inference with engine.Infer().
With this approach I run into a memory leak: memory usage grows over iterations. Am I using the API correctly? Is there any solution to this?
If you are using the same InferenceEngine throughout, then this is expected, since the engine thinks you are adding new variables to one giant model. If you want to create separate models with no variables in common, then create a separate InferenceEngine object for each.
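For concreteness, here is a minimal sketch of the "one engine per model" pattern (the namespaces, variable names, and probability values are placeholders, not from this thread):

```csharp
using Microsoft.ML.Probabilistic.Models;
using Microsoft.ML.Probabilistic.Distributions;

// One InferenceEngine and one set of variables per independent model,
// so each iteration's factor graph is self-contained.
for (int i = 0; i < 10; i++)
{
    var engine = new InferenceEngine();
    var x = Variable.Bernoulli(0.5).Named("x");           // prior (placeholder value)
    Variable.ConstrainEqualRandom(x, new Bernoulli(0.7)); // soft evidence (placeholder value)
    Bernoulli posterior = engine.Infer<Bernoulli>(x);
}
```

Note that this recompiles the model on every iteration, which is slow; it simply keeps the models separate.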
First of all, thank you so much for replying!
Previously, I created a new InferenceEngine instance every time I wanted to run inference, then added variables, then inferred. I saw the memory leak with that approach too, so I thought the engine was not being disposed properly and switched to a static engine instance.
The problem I am trying to solve: I need to initialize my model to a default prior, set soft evidence on it, and finally run inference on one node. Each inference should be independent of the previous ones. What is the correct way to use the framework? Here is some pseudocode of my previous approach:
for (int i = 0; i < n; i++)
{
    var engine = new InferenceEngine();
    var A = Variable.Bernoulli(priorProb).Named("A");              // prior
    Variable.ConstrainEqualRandom(A, new Bernoulli(evidenceProb)); // soft evidence
    var posterior = engine.Infer<Bernoulli>(A);
}
Please let me know if you need some real code for you to easily compile.
If you have many models with the same structure, then you should use ObservedValue to merge them into one model, as explained in the TruncatedGaussian example. This will be significantly more efficient and won't leak memory.
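A hedged sketch of that pattern (assuming the Microsoft.ML.Probabilistic namespaces; names and values are placeholders): build the model once with a Variable.New placeholder, then change only its ObservedValue inside the loop, so the compiled model is reused:

```csharp
using Microsoft.ML.Probabilistic.Models;
using Microsoft.ML.Probabilistic.Distributions;

// Build the model once, outside the loop.
var engine = new InferenceEngine();
var priorProb = Variable.New<double>().Named("priorProb");
var x = Variable.Bernoulli(priorProb).Named("x");

for (int i = 0; i < 10; i++)
{
    // Only the observed value changes per iteration; the compiled
    // model is reused instead of growing a new factor graph.
    priorProb.ObservedValue = 0.5; // placeholder; set your per-iteration prior here
    Bernoulli posterior = engine.Infer<Bernoulli>(x);
}
```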
Thank you so much for the comment! It helps a lot!
BTW, I understand that ObservedValue can be used like this. But in my project I need to set soft evidence: I don't have a definite true/false observation for my prior Variable.Bernoulli(0.n); I only know that it follows a new Variable.Bernoulli(0.y). Is there any way I can still build my model efficiently?
If you are using Variable.ConstrainEqualRandom(x, dist) to attach soft evidence, you can make dist a Variable whose ObservedValue changes. Setting dist.ObservedValue to uniform will remove the soft evidence. For examples of this, see the Click model example, the Crowdsourcing example, and the Classifier/Recommender Learners.
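A sketch of that suggestion, under the same assumptions as above (placeholder values; Variable.New&lt;Bernoulli&gt; and Variable&lt;bool&gt;.Random are used to make the distributions themselves observable):

```csharp
using Microsoft.ML.Probabilistic.Models;
using Microsoft.ML.Probabilistic.Distributions;

var engine = new InferenceEngine();

// The prior is itself an observed distribution, so it can change per run.
var priorDist = Variable.New<Bernoulli>().Named("priorDist");
var x = Variable<bool>.Random(priorDist).Named("x");

// Soft evidence: constrain x to equal a draw from an observed distribution.
var evidenceDist = Variable.New<Bernoulli>().Named("evidenceDist");
Variable.ConstrainEqualRandom(x, evidenceDist);

priorDist.ObservedValue = new Bernoulli(0.5);    // placeholder prior
evidenceDist.ObservedValue = new Bernoulli(0.7); // placeholder soft evidence
Bernoulli withEvidence = engine.Infer<Bernoulli>(x);

// Setting the evidence distribution to uniform removes the soft evidence.
evidenceDist.ObservedValue = Bernoulli.Uniform();
Bernoulli withoutEvidence = engine.Infer<Bernoulli>(x);
```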