ctcdecode
CPU RAM memory leak when using the beam search decoder
Hi,
I used the CTC beam decoder from this link: https://github.com/joshemorris/pytorch-ctc. However, I found that after I finish decoding one utterance, the decoder does not release RAM. After decoding more and more sentences, RAM fills up. This is especially noticeable with a large beam width such as 100, in which case RAM usage quickly blows up.
My code looks like this:
```python
import pytorch_ctc
from pytorch_ctc import Scorer

decoder = pytorch_ctc.CTCBeamDecoder(Scorer(), labels, top_paths=1, beam_width=100,
                                     blank_index=0, space_index=-1, merge_repeated=False)

for i in range(total_num_utterances):
    decoded, _, out_seq_len = decoder.decode(prob_tensor_i, seq_len_i)
```
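A quick way to confirm that the growth really comes from the decode loop is to watch the process's peak resident set size per iteration. Below is a minimal, hedged sketch using only the stdlib `resource` module (Unix-only); `decode_stub` is a placeholder standing in for the real `decoder.decode(...)` call, not part of pytorch-ctc:

```python
import gc
import resource

def rss_kb():
    # Peak resident set size of this process.
    # Units: kilobytes on Linux, bytes on macOS.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

def decode_stub(i):
    # Placeholder for decoder.decode(prob_tensor_i, seq_len_i).
    return [0] * 1000

baseline = rss_kb()
for i in range(100):
    decode_stub(i)
    if i % 20 == 0:
        gc.collect()  # rule out pending Python garbage before measuring
        print(f"iter {i}: peak RSS {rss_kb()} (delta {rss_kb() - baseline})")
```

If the delta keeps climbing even after `gc.collect()`, the memory is most likely held by the extension's native allocations rather than by Python objects.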
Does anyone have any ideas on how to fix this issue?
Thank you very much.
Are you creating a new scorer each time, by any chance? We don't do a whole lot of dynamic memory allocation.
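If the scorer (and hence the decoder) were constructed inside the loop, each iteration would allocate fresh state that the native extension may never release. A minimal sketch of the suggested fix — hoisting construction out of the loop — using stand-in classes in place of the real `pytorch_ctc.Scorer` and `CTCBeamDecoder` (the real API is taken from the question above; these stand-ins only illustrate the allocation pattern):

```python
class Scorer:
    """Stand-in for pytorch_ctc.Scorer."""
    def __init__(self):
        # Simulates memory held by the scorer; in the real extension
        # this would be native memory the GC cannot reclaim.
        self.buf = bytearray(10_000_000)

class CTCBeamDecoder:
    """Stand-in for pytorch_ctc.CTCBeamDecoder."""
    def __init__(self, scorer):
        self.scorer = scorer
    def decode(self, probs, seq_len):
        return probs, None, seq_len

def decode_all_leaky(utterances):
    # Anti-pattern: a new Scorer (and its buffers) per utterance.
    results = []
    for probs, seq_len in utterances:
        decoder = CTCBeamDecoder(Scorer())  # fresh allocation every iteration
        results.append(decoder.decode(probs, seq_len))
    return results

def decode_all_reused(utterances):
    # Fix: construct the scorer/decoder once and reuse across utterances.
    decoder = CTCBeamDecoder(Scorer())  # single allocation
    return [decoder.decode(probs, seq_len) for probs, seq_len in utterances]
```

Both functions return identical results; only the allocation behavior differs, which is what matters when the underlying object holds memory that is not freed on destruction.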