
Memory Leak observed in text_to_speech model demo sample

Open AishaSamaanKhan opened this issue 3 years ago • 0 comments

Summary

We are using the OpenVINO text_to_speech demo sample as an inference server that is fed input continuously. We observed that memory consumption grows with each input text, eventually causing a segmentation fault that crashes the machine. After debugging, we found that the inference network object remains in scope after each request, so it is never released and accumulates as garbage.

Question

Is there a recommended way to release the inference network before passing the next input, so that the demo fits our use case?
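As a general Python pattern (not a documented OpenVINO API), dropping every reference to the network object and forcing a garbage-collection pass does reclaim it; `InferenceNetwork` and `synthesize` below are hypothetical stand-ins for the demo's compiled networks and forward pass, used only to make the sketch self-contained:

```python
import gc
import weakref

class InferenceNetwork:
    """Hypothetical stand-in for the demo's compiled network object;
    in the real demo this would come from OpenVINO's Core APIs."""
    def __init__(self):
        # simulate the large internal buffers a compiled network holds
        self._buffers = [bytearray(1 << 16) for _ in range(8)]

def synthesize(net, text):
    # placeholder for the demo's forward pass
    return len(text)

net = InferenceNetwork()
probe = weakref.ref(net)  # lets us observe whether the object is freed

_ = synthesize(net, "hello world")

# Drop the reference before creating the next network, then force a
# collection pass so CPython reclaims reference cycles as well.
del net
gc.collect()

assert probe() is None  # the network object really was reclaimed
```

The cheaper fix, where possible, is to create the networks once before the loop and reuse the same objects for every input, so nothing needs clearing at all.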

Simple way to reproduce it

Feed input continuously in a for-loop.
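The growth pattern described above can be sketched without OpenVINO: if a fresh network object is created per input while the previous one stays reachable, traced memory rises monotonically with each iteration (`InferenceNetwork` is again a hypothetical stand-in for the demo's compiled networks):

```python
import tracemalloc

class InferenceNetwork:
    """Hypothetical stand-in for the demo's compiled network object."""
    def __init__(self):
        self._buffers = [bytearray(1 << 16) for _ in range(8)]  # ~512 KiB

tracemalloc.start()
baseline, _ = tracemalloc.get_traced_memory()

networks = []  # stale references, as when the old network stays in scope
for text in ["input one", "input two", "input three"]:
    networks.append(InferenceNetwork())  # a fresh network per input

current, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

# memory grows with each input, as reported in the issue:
# at least 3 networks x 8 buffers x 64 KiB were retained
assert current - baseline >= 3 * 8 * (1 << 16)
```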

AishaSamaanKhan · May 20 '22 06:05