trangtv57
I am adding the log from the file log/compute_sentence_scores.1.log. Note: I have changed the original version of [compute_sentence_scores.py](https://github.com/kaldi-asr/kaldi/blob/master/egs/wsj/s5/steps/pytorchnn/compute_sentence_scores.py) to a version that computes scores on CUDA. Some additional info I print after line https://github.com/kaldi-asr/kaldi/blob/12a2092c887c49fce04360dbf48e43067992e770/egs/wsj/s5/steps/pytorchnn/compute_sentence_scores.py#L192...
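For illustration only, here is a minimal sketch of the kind of change I mean: run the language-model forward pass on CUDA instead of CPU. The toy model, shapes, and names below are hypothetical stand-ins, not the actual code in compute_sentence_scores.py:

```python
import torch
import torch.nn as nn

# Hypothetical sketch: do the LM forward pass on the GPU when available.
# The toy Embedding+Linear model only stands in for the real RNN LM.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

vocab_size = 32000
model = nn.Sequential(nn.Embedding(vocab_size, 16),
                      nn.Linear(16, vocab_size)).to(device)
model.eval()

data = torch.randint(0, vocab_size, (3, 2), device=device)  # (seq_len, batch) on the GPU
with torch.no_grad():
    logits = model(data)                      # (seq_len, batch, vocab_size)
    log_probs = torch.log_softmax(logits, dim=-1)
print(log_probs.shape, log_probs.device)
```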
I don't think so. Here is the tensor data for one sample:
```
_data shape: torch.Size([3, 2])
data value: tensor([[30893, 30893],
        [18171, 27414],
        [27414,     0]])
target shape: torch.Size([6])
target...
```
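Just to show how I read those shapes (my interpretation, not code from the script): data is laid out as (seq_len, batch), so a flattened per-token target has seq_len * batch entries, which matches the printed torch.Size([6]):

```python
import torch

# Reconstruct the printed sample: data is (seq_len=3, batch=2),
# so a flattened per-token target would have 3 * 2 = 6 entries.
data = torch.tensor([[30893, 30893],
                     [18171, 27414],
                     [27414,     0]])
print(data.shape)        # torch.Size([3, 2])
seq_len, batch = data.shape
print(seq_len * batch)   # 6, matching target shape torch.Size([6])
```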
Thanks @danpovey, with option 1 I can do it, but the best solution in my mind is that I need to limit the num-paths at the path generation stage. But...
In my issue #4719, I removed the assertion KALDI_ASSERT(clat.NumStates() > 1); as you suggested and it works. I understand your idea. I will try some fixes and report later. I...
Hi @danpovey, I have added some code to limit the size of paths_and_costs in the function ComputePathCover (https://github.com/kaldi-asr/kaldi/blob/aefbd096ec0c7f1136f669c99be66ac393afe29c/src/latbin/lattice-path-cover.cc#L174), and the problem with the size of arcpath has been resolved. But I have another...
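The real change is in C++ inside ComputePathCover, but the idea is simple; here is a hypothetical Python illustration of it (keep only the max_paths cheapest path/cost pairs before going further):

```python
# Hypothetical illustration of the limiting idea; the actual fix lives in
# ComputePathCover in src/latbin/lattice-path-cover.cc.
def limit_paths(paths_and_costs, max_paths):
    """paths_and_costs: list of (path, cost) pairs; lower cost is better."""
    if len(paths_and_costs) <= max_paths:
        return paths_and_costs
    return sorted(paths_and_costs, key=lambda pc: pc[1])[:max_paths]

# Dummy usage:
paths = [((0, 1, 3), 2.5), ((0, 2, 3), 1.2), ((0, 1, 2, 3), 4.0)]
print(limit_paths(paths, 2))  # keeps the two lowest-cost paths
```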
So, I don't really get where I can start debugging. When I run rescoring with n-best, everything is OK. I will try removing the assert to get the error, if there is one....
I attach the error message here: @danpovey
But, as I said, in my earlier experiment I already had code to limit the number of paths in lattice-path-cover, and the experiment with lattice rescoring still hit the error. Anyway, I am not sure...
Yep, thank you. So can we discuss the lattice depth-limiting idea again? I understand that I would need to change some code in lattice-expand, but I don't think this is the root...
Thank you. I will try it myself.