Fp16 inference, forward got NaN
Notice: In order to resolve issues more efficiently, please raise the issue following the template.
🐛 Bug
Running inference with the official example code, the forward pass produces NaN when FP16 is used.
To Reproduce
Steps to reproduce the behavior (always include the command you ran):
Inference code:

```python
from funasr import AutoModel
import torch

with torch.cuda.amp.autocast():
    model = AutoModel(model="paraformer-zh",
                      # spk_model="cam++",
                      )
    res = model.generate(input="test.wav", batch_size_s=300, hotword='魔搭')
    print(res)
```
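A variant worth trying as a quick check (a minimal sketch, assuming `AutoModel` accepts a `device` argument; drop or adjust it if your call differs): build the model outside the autocast region and wrap only the `generate` call, to see whether the NaN is tied to constructing the model under autocast.

```python
import torch
from funasr import AutoModel

# Build the model in FP32 first, outside any autocast region.
# `device="cuda:0"` is an assumption here; adjust if your setup differs.
model = AutoModel(model="paraformer-zh", device="cuda:0")

# Run only the decoding call under mixed precision.
with torch.cuda.amp.autocast():
    res = model.generate(input="test.wav", batch_size_s=300, hotword='魔搭')
print(res)
```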
During forward inference, NaN appears in the forward of sanm/encoder.py; with FP32 there is no problem.
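To narrow down which submodule inside sanm/encoder.py first produces the NaN, a forward-hook probe like the sketch below can help. It assumes the underlying torch.nn.Module is exposed as `model.model`; inspect the AutoModel object if the attribute differs in your FunASR version.

```python
import torch
from funasr import AutoModel

model = AutoModel(model="paraformer-zh")

def add_nan_probes(net: torch.nn.Module):
    """Register forward hooks that print the name of any submodule whose
    output contains NaN; the first line printed is the earliest offender."""
    def make_hook(name):
        def hook(module, inputs, output):
            outs = output if isinstance(output, (tuple, list)) else (output,)
            for t in outs:
                if torch.is_tensor(t) and t.is_floating_point() and torch.isnan(t).any():
                    print(f"NaN in output of {name} ({module.__class__.__name__})")
                    break
        return hook

    for name, module in net.named_modules():
        module.register_forward_hook(make_hook(name))

# Assumed attribute path to the torch module; adjust if needed.
add_nan_probes(model.model)

with torch.cuda.amp.autocast():
    res = model.generate(input="test.wav", batch_size_s=300, hotword='魔搭')
```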
Code sample
Expected behavior
Environment
- OS (e.g., Linux):
- FunASR Version (e.g., 1.0.0):
- ModelScope Version (e.g., 1.11.0):
- PyTorch Version (e.g., 2.0.0):
- How you installed funasr (`pip`, source):
- Python version:
- GPU (e.g., V100M32):
- CUDA/cuDNN version (e.g., cuda11.7):
- Docker version (e.g., funasr-runtime-sdk-cpu-0.4.1):
- Any other relevant information: