MagicSource


```
self.model = RnntLstmModel(
    nn_encoder_filename,
    nn_decoder_filename,
    nn_joiner_filename,
    device=device,
)
```

I can't even tell where the token file is supposed to be passed in.
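
For context, in icefall/sherpa-style decode scripts the tokens file is usually not given to the model at all; it is loaded separately and only used to map the decoded token ids back to text. Below is a minimal sketch of that, assuming a `tokens.txt` with one `symbol id` pair per line; the file name and the `hyp` values are illustrative, not taken from the code above.

```
import sys

def load_tokens(tokens_file: str) -> dict:
    """Read tokens.txt lines like '<blk> 0', '▁the 5', ... into an id->symbol map."""
    id2sym = {}
    with open(tokens_file, encoding="utf-8") as f:
        for line in f:
            sym, idx = line.split()
            id2sym[int(idx)] = sym
    return id2sym

id2sym = load_tokens("tokens.txt")   # path is an assumption
hyp = [23, 57, 104]                  # example token ids from greedy search
# BPE models mark word boundaries with '▁'; replace it with a space.
text = "".join(id2sym[i] for i in hyp).replace("▁", " ").strip()
print(text)
```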

@csukuangfj Hi, this is the code: https://github.com/jinfagang/aural/blob/master/demo_file_ncnn.py It is essentially the same as yours, except that I removed encoder_giga and kept only the LibriSpeech part. The weights are also trained on LibriSpeech only.

@csukuangfj The inference code is copied from your ncnn-decode.py without any changes. Yes, I am using your modified ncnn; otherwise it cannot load the ncnn model at all.
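
For reference, the loading part of such a decode script looks roughly like this with ncnn's Python bindings. This is only a sketch: the file names are placeholders, the `in0`/`out0` blob names are assumed from pnnx's defaults, the 80-dim fbank shape is illustrative, and the real LSTM encoder also takes state tensors that are omitted here. The modified ncnn build is needed because, as noted above, the stock build cannot load the converted model.

```
import ncnn        # the modified ncnn build mentioned above
import numpy as np

net = ncnn.Net()
# Placeholder file names; these come from the pnnx conversion step.
net.load_param("encoder_jit_trace-pnnx.ncnn.param")
net.load_model("encoder_jit_trace-pnnx.ncnn.bin")

x = np.random.rand(100, 80).astype(np.float32)  # dummy fbank features (T, 80)

ex = net.create_extractor()
ex.input("in0", ncnn.Mat(x))    # "in0" is pnnx's default input blob name (assumed)
ret, out = ex.extract("out0")   # "out0" is assumed as well
print(ret, out.dims, out.w, out.h)
```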

The model I am using is lstm-stateless, which I then converted to ncnn.

Like this:

```
./src/pnnx /Users//weights/joiner_jit_trace-pnnx.pt
```

all without any extra parameters.

The .pt file is exported from icefall.

@csukuangfj Yes, I just copied the command line and passed the corresponding .pth file to the export script.

@csukuangfj

```
import argparse
import logging
from pathlib import Path

import sentencepiece as spm
import torch
import torch.nn as nn

from aural.utils.scaling_converter import convert_scaled_to_non_scaled
from train import add_model_arguments, get_params, get_transducer_model
...
```

The code is the same as in icefall; I didn't change anything.
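
Condensed, the export flow in that script is roughly: convert the scaled layers back to plain PyTorch modules with convert_scaled_to_non_scaled, then torch.jit.trace the encoder, decoder, and joiner separately and save each as a *_jit_trace-pnnx.pt file for pnnx. The sketch below is only an approximation of that flow: the dummy shapes, the 512-dim joiner inputs, and the get_init_states() call are assumptions that may not match the actual lstm-stateless recipe, and the real script may wrap the modules to pin extra forward arguments.

```
import torch
from aural.utils.scaling_converter import convert_scaled_to_non_scaled

# Assumes `model` already holds the trained transducer (encoder/decoder/joiner)
# with the checkpoint loaded, as in the icefall export script.
model.eval()
convert_scaled_to_non_scaled(model, inplace=True)

# Illustrative dummy inputs; the real shapes come from the model config.
x = torch.zeros(1, 100, 80, dtype=torch.float32)   # (N, T, feature_dim)
x_lens = torch.full((1,), 100, dtype=torch.int64)
states = model.encoder.get_init_states()           # assumed helper for the LSTM states

traced_encoder = torch.jit.trace(model.encoder, (x, x_lens, states))
traced_encoder.save("encoder_jit_trace-pnnx.pt")

y = torch.zeros(1, 2, dtype=torch.int64)           # (N, context_size); context_size=2 assumed
# The real script may wrap the decoder to fix extra arguments (e.g. need_pad).
traced_decoder = torch.jit.trace(model.decoder, (y,))
traced_decoder.save("decoder_jit_trace-pnnx.pt")

enc_out = torch.zeros(1, 512, dtype=torch.float32)  # joiner input dims are assumptions
dec_out = torch.zeros(1, 512, dtype=torch.float32)
traced_joiner = torch.jit.trace(model.joiner, (enc_out, dec_out))
traced_joiner.save("joiner_jit_trace-pnnx.pt")
```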