FunASR
How to run ONNX inference on the GPU?
Notice: In order to resolve issues more efficiently, please raise your issue following the template.
❓ Questions and Help
My system has a GPU and I installed onnxruntime-gpu, but inference runs entirely on the CPU; I need it to use the GPU. Installed versions: onnx 1.16.1, onnxruntime 1.19.0, onnxruntime-gpu 1.18.1.
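Having both the CPU wheel (`onnxruntime`) and the GPU wheel (`onnxruntime-gpu`) installed in the same environment is a common cause of this symptom: both distributions install the same `onnxruntime` import package, so one build can shadow the other. A minimal sketch for spotting the conflict (only stdlib `importlib.metadata` is used; nothing FunASR-specific is assumed):

```python
from importlib.metadata import distributions

# Collect every installed distribution whose name starts with "onnxruntime".
# Seeing more than one entry here usually means the CPU and GPU wheels are
# fighting over the same "onnxruntime" import path.
ort_dists = sorted(
    d.metadata["Name"]
    for d in distributions()
    if (d.metadata["Name"] or "").lower().startswith("onnxruntime")
)
print(ort_dists)
```

If both `onnxruntime` and `onnxruntime-gpu` appear, the usual remedy is to uninstall both and then reinstall only `onnxruntime-gpu`, in a version matching the installed CUDA/cuDNN.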
```python
from funasr_onnx import SenseVoiceSmall
from funasr.utils.postprocess_utils import rich_transcription_postprocess

model_dir = "P:/weight/funasr/hub/iic/SenseVoiceSmall"
model = SenseVoiceSmall(model_dir, device_id=0, batch_size=10, quantize=False)

# inference
wav_or_scp = ["E:/wsldata/code/FunASR/examples/output_000.wav"]
res = model(wav_or_scp, language="auto", use_itn=True)
print([rich_transcription_postprocess(i) for i in res])
```
Warning output:

```text
D:\ProgramData\anaconda3\envs\dataset\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py:69: UserWarning: Specified provider 'CUDAExecutionProvider' is not in available provider names. Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
  warnings.warn(
E:\wsldata\code\FunASR\runtime\python\onnxruntime\funasr_onnx\utils\utils.py:221: RuntimeWarning: CUDAExecutionProvider is not avaiable for current env, the inference part is automatically shifted to be executed under CPUExecutionProvider. Please ensure the installed onnxruntime-gpu version matches your cuda and cudnn version, you can check their relations from the offical web site: https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html
  warnings.warn(
```
Tracing into the code, I found that in onnxruntime's `_create_inference_session` the list of available providers contains only the CPU entries:

```python
def _create_inference_session(self, providers, provider_options, disabled_optimizers=None):
    available_providers = C.get_available_providers()
    # available_providers == ['AzureExecutionProvider', 'CPUExecutionProvider']
```
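As a quick check before constructing the model, one can ask onnxruntime directly which providers the installed build supports; a CPU-only wheel reports exactly the list the warning above shows. A small sketch (guarded so it also runs in an environment where onnxruntime is not installed):

```python
try:
    import onnxruntime as ort
    # A GPU-enabled build lists "CUDAExecutionProvider" here; the warning in
    # this issue indicates the active build only exposes the CPU providers.
    providers = ort.get_available_providers()
except ImportError:
    providers = []  # onnxruntime is not installed in this environment

print(providers)
cuda_ok = "CUDAExecutionProvider" in providers
```

If `cuda_ok` is False even though `onnxruntime-gpu` is installed, the CPU wheel is likely shadowing the GPU build, or the onnxruntime-gpu version does not match the installed CUDA/cuDNN.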
Before asking:
- search the issues.
- search the docs.
What is your question?
Code
What have you tried?
What's your environment?
- OS (e.g., Linux):
- FunASR Version (e.g., 1.0.0):
- ModelScope Version (e.g., 1.11.0):
- PyTorch Version (e.g., 2.0.0):
- How you installed funasr (pip, source):
- Python version:
- GPU (e.g., V100M32):
- CUDA/cuDNN version (e.g., cuda11.7):
- Docker version (e.g., funasr-runtime-sdk-cpu-0.4.1):
- Any other relevant information: