
Memory leak issue with InferenceEngine for Bayesian network

Open MengYao-astro opened this issue 10 months ago • 6 comments

My current code structure is to create a static `InferenceEngine` instance. Then, in a for-loop, I create several variables with `Variable.Bernoulli(p).Named("xxx")` as priors, set soft evidence with `Variable.ConstrainEqualRandom`, and finally run inference with `engine.Infer()`.

With this approach I run into a memory leak: memory usage keeps growing over iterations. Am I using the engine correctly? Is there a solution to this?

MengYao-astro avatar Mar 07 '25 01:03 MengYao-astro

If you are using the same InferenceEngine throughout, then this is expected since it thinks you are adding new variables to one giant model. If you want to create separate models with no variables in common, then create separate InferenceEngine objects for each.

tminka avatar Mar 11 '25 12:03 tminka
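To illustrate the advice above, here is a minimal sketch of "one engine per independent model" (the probabilities, variable names, and soft-evidence distribution are illustrative, not taken from the thread):

```csharp
using Microsoft.ML.Probabilistic.Models;
using Microsoft.ML.Probabilistic.Distributions;

// Sketch: each iteration builds a self-contained model with its own
// engine, so variables are not accumulated into one giant model.
for (int i = 0; i < 10; i++)
{
    var engine = new InferenceEngine();          // fresh engine per model
    var x = Variable.Bernoulli(0.3).Named("x");  // prior (0.3 is illustrative)
    // Soft evidence: constrain x to equal a draw from Bernoulli(0.8)
    Variable.ConstrainEqualRandom(x, new Bernoulli(0.8));
    Bernoulli posterior = engine.Infer<Bernoulli>(x);
}
```

Note that this recompiles the model on every iteration; when the models share a structure, the `ObservedValue` approach discussed below in the thread avoids that cost.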

First of all, thank you so much for replying!

Previously, I created a new `InferenceEngine` instance every time I wanted to run inference, then added variables, then inferred. When I found the memory leak, I assumed the engine was not being disposed properly, so I switched to a static engine instance.

The problem I am trying to solve is: I need to initialize my model to its default priors, set soft evidence on it, and finally run inference on one node. Each inference should be independent. What is the correct way to use the framework? Here is some pseudocode of my previous approach:

```csharp
for (int i = 0; i < n; i++)
{
    var engine = new InferenceEngine();
    var A = Variable.Bernoulli(xxx);
    var B = Variable.New<bool>();
    using (Variable.If(A)) { B.SetTo(xxx); }
    var C = Variable.New<bool>();
    using (Variable.If(B)) { C.SetTo(xxx); }
    // Set soft evidence:
    Variable.ConstrainEqualRandom(A, xxx);
    Variable.ConstrainEqualRandom(C, xxx);
    // Infer
    engine.Infer(B);
}
```

Please let me know if you need some real code that you can compile.

MengYao-astro avatar Mar 12 '25 04:03 MengYao-astro

If you have many models with the same structure, then you should use ObservedValue to merge them into one model, as explained in the TruncatedGaussian example. This will be significantly more efficient and won't leak memory.

tminka avatar Mar 13 '25 11:03 tminka
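A minimal sketch of the `ObservedValue` pattern described above (the prior probabilities and names are made up for illustration): the model is built and compiled once, and each "separate model" becomes just a change of observed data.

```csharp
using Microsoft.ML.Probabilistic.Models;
using Microsoft.ML.Probabilistic.Distributions;

// Build the model graph once, outside any loop.
var priorDist = Variable.Observed(new Bernoulli(0.5)).Named("priorDist");
var x = Variable.Random<bool, Bernoulli>(priorDist).Named("x");
var engine = new InferenceEngine();

// Per "model", only the observed prior changes; the compiled
// inference algorithm is reused, so memory does not grow.
foreach (var p in new[] { 0.2, 0.7, 0.9 })
{
    priorDist.ObservedValue = new Bernoulli(p);
    Bernoulli posterior = engine.Infer<Bernoulli>(x);
}
```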

Thank you so much for the comment! It helps a lot!

MengYao-astro avatar Mar 14 '25 08:03 MengYao-astro

BTW, I understand that ObservedValue can be used like this. But in my project I need to set soft evidence, where I don't have a definite true/false answer for my prior Variable.Bernoulli(0.n); I only know that it is a new Variable.Bernoulli(0.y). Is there a way I can still build my model efficiently?

MengYao-astro avatar Mar 14 '25 08:03 MengYao-astro

If you are using Variable.ConstrainEqualRandom(x, dist) to attach soft evidence, you can make dist a Variable whose ObservedValue changes. Setting dist.ObservedValue to uniform will remove the soft evidence. For examples of this, see the Click model example, the Crowdsourcing example, and the Classifier/Recommender Learners.

tminka avatar Mar 14 '25 17:03 tminka
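A sketch of that pattern, assuming the `ConstrainEqualRandom` overload that accepts a `Variable<Bernoulli>` (the 0.3 prior and 0.9 evidence values are illustrative): the soft-evidence distribution is itself an observed variable, so it can be swapped per inference, and setting it to uniform switches the evidence off.

```csharp
using Microsoft.ML.Probabilistic.Models;
using Microsoft.ML.Probabilistic.Distributions;

var x = Variable.Bernoulli(0.3).Named("x");
// The soft-evidence message is an observed distribution:
var evidence = Variable.Observed(Bernoulli.Uniform()).Named("evidence");
Variable.ConstrainEqualRandom<bool, Bernoulli>(x, evidence);
var engine = new InferenceEngine();

// Apply soft evidence Bernoulli(0.9) for one inference run...
evidence.ObservedValue = new Bernoulli(0.9);
Bernoulli withEvidence = engine.Infer<Bernoulli>(x);

// ...then set it back to uniform, which removes the soft evidence.
evidence.ObservedValue = Bernoulli.Uniform();
Bernoulli withoutEvidence = engine.Infer<Bernoulli>(x);
```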