
Memory leak in tfp.math.minimize, reproducible from tutorial notebook

Open melanierbutler opened this issue 11 months ago • 0 comments

Python version: 3.11.6

```
tensorflow==2.18.0              # via -r requirements.in, tf-keras
tensorflow-probability==0.24.0  # via -r requirements.in
tf-keras==2.18.0                # via -r requirements.in
```

I am trying to perform an analysis built on the multiple changepoint detection and Bayesian model selection code found in this tutorial notebook: https://www.tensorflow.org/probability/examples/Multiple_changepoint_detection_and_Bayesian_model_selection

While running the analysis over multiple datasets, I observed that memory usage increases with each successive analysis, by about 20 MB per run in my code. I confirmed that the same growth occurs with the unmodified tutorial notebook, ruling out an error specific to my code. The easiest way to reproduce: download the notebook, execute all cells, then rerun from the "Unknown number of states" section onward. I have tried calling `tf_keras.backend.clear_session()` between runs, deleting all created objects before rerunning, and forcing garbage collection, but nothing releases the ~20 MB that is allocated with each `tfp.math.minimize` run.
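For anyone trying to quantify the growth, here is a minimal, hedged sketch of a per-run memory-growth harness. `leaky_run` is a hypothetical stand-in for the notebook's `tfp.math.minimize` call (it just retains ~1 MB per call to simulate the leak); note that `tracemalloc` only sees Python-level allocations, so for TensorFlow's C++ allocations a process-level measure such as `psutil.Process().memory_info().rss` would be needed instead.

```python
import gc
import tracemalloc

def measure_growth(run, n_runs=3):
    """Call `run()` repeatedly and return the net traced-memory growth
    (bytes) observed after each call, with gc.collect() in between,
    mirroring the manual garbage collection attempted in this report."""
    tracemalloc.start()
    gc.collect()
    baseline, _ = tracemalloc.get_traced_memory()
    growth = []
    for _ in range(n_runs):
        run()
        gc.collect()  # attempt to release anything unreferenced
        current, _ = tracemalloc.get_traced_memory()
        growth.append(current - baseline)
        baseline = current
    tracemalloc.stop()
    return growth

# Hypothetical stand-in for one tfp.math.minimize analysis run:
# it retains ~1 MB per call, so growth persists despite gc.collect().
_retained = []

def leaky_run():
    _retained.append(bytearray(1_000_000))

per_run_growth = measure_growth(leaky_run)
print(per_run_growth)  # roughly one million bytes of growth per run
```

If `tfp.math.minimize` shows similar persistent growth under a harness like this even after `clear_session()` and `gc.collect()`, that points at references retained inside the library rather than in user code.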

melanierbutler — Mar 26 '25 14:03