MagicSource

Results: 1309 comments of MagicSource

I got all 0s from argmax; is that normal?

```
[encoder_out]: torch.Size([1, 119, 512]) cpu torch.float32
[decoder_input]: torch.Size([1, 2]) cpu torch.int32
[decoder_out]: torch.Size([1, 512]) cpu torch.float32
[encoder_out]: torch.Size([1, 119, 512])...
```

I have got the right result now. May I ask when the LSTM transducer code will be open-sourced in icefall?

@csukuangfj Hi, I mean the WeNet Chinese model. I may also need the corresponding BPE model.

@csukuangfj Looking forward to it. BTW, does this function `def greedy_search_single_batch(model, encoder_out: torch.Tensor, max_sym_per_frame: int) -> List[int]:` have a suggested value for max_sym_per_frame?
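For context on what `max_sym_per_frame` controls, here is a minimal sketch of transducer greedy search. The `decoder` and `joiner` callables and the function name are illustrative stand-ins, not the icefall API; the parameter caps how many non-blank symbols may be emitted before the search must advance to the next encoder frame (a common default is 1).

```python
import torch

def greedy_search_sketch(encoder_out, decoder, joiner,
                         blank_id=0, max_sym_per_frame=1):
    """Toy transducer greedy search. encoder_out: (T, C) per-frame outputs."""
    hyp = []                   # decoded token ids so far
    dec_out = decoder(hyp)     # decoder output for the current hypothesis
    t, sym_per_frame = 0, 0
    T = encoder_out.size(0)
    while t < T:
        logits = joiner(encoder_out[t], dec_out)
        y = int(logits.argmax())
        if y == blank_id or sym_per_frame >= max_sym_per_frame:
            # Blank, or the per-frame cap is reached: move to the next frame.
            t += 1
            sym_per_frame = 0
        else:
            # Emit a symbol and stay on the same frame.
            hyp.append(y)
            dec_out = decoder(hyp)
            sym_per_frame += 1
    return hyp
```

With `max_sym_per_frame=1` each frame contributes at most one token, which keeps the decoding loop bounded at T joiner calls plus one per emitted symbol.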

@csukuangfj Thank you. BTW, how are these models exported separately? I only saw logic that exports the whole model rather than the 3 parts. The jit export here saves...
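One way to export the 3 parts separately is to script each submodule on its own. This is only a sketch, assuming the transducer exposes `encoder`, `decoder`, and `joiner` attributes (as icefall's transducer models do); `export_parts` is a hypothetical helper, not an icefall function.

```python
import os
import torch
import torch.nn as nn

def export_parts(model: nn.Module, out_dir: str) -> None:
    """Script and save each transducer part as its own TorchScript file."""
    os.makedirs(out_dir, exist_ok=True)
    for name in ("encoder", "decoder", "joiner"):
        sub = getattr(model, name)
        scripted = torch.jit.script(sub)  # script each part independently
        scripted.save(os.path.join(out_dir, f"{name}.pt"))
```

Each saved file can then be loaded with `torch.jit.load` and run independently, which is what a runtime that calls encoder, decoder, and joiner separately needs.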

@csukuangfj Thank you. I can now export successfully. However, I want to further export to ONNX for onnxruntime inference, but I got an error: ``` raise symbolic_registry.UnsupportedOperatorError( torch.onnx.symbolic_registry.UnsupportedOperatorError: Exporting the operator...

@csukuangfj Does sherpa support exporting to PNNX?

@csukuangfj Hi, how does the performance compare between the LSTM transducer and the transformer architecture?

Do any of these support export to PNNX? I just tested the default model; it performs very well.

@csukuangfj Hi, may I ask why the LSTM only does well on the streaming task and not on other tasks? Why can't the conformer beat the LSTM on this task?