onnxruntime
Converting model with olive and onnxruntime 1.17 produces OnnxRuntimeException in a C# Windows app
Describe the issue
Using this guide to convert the whisper model with onnxruntime 1.17 installed, I'm unable to run the converted model in this sample as intended; I get the following OnnxRuntimeException:
However, if I convert the whisper model with onnxruntime 1.16 installed (and nothing else changed), the sample works as intended.
To reproduce
- Use this guide to convert and optimize the whisper model from Hugging Face (I used the whisper_cpu_int8.json config).
- Copy the converted ONNX model into this sample and run the sample as described in the README.
- Optional: update the NuGet packages to the latest version of onnxruntime (1.17).
- Run the sample, try to transcribe, and note the OnnxRuntimeException.
By following these steps, the latest version of onnxruntime, 1.17, is installed. If you instead install onnxruntime 1.16 (pip install onnxruntime==1.16) before doing the conversion and optimization, the model will work just fine in the sample.
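The version dependency described above can be sketched as a small preflight check run before the Olive conversion. The helper below is hypothetical and not part of the guide or the sample; it only compares version strings, and it assumes the fix lands in 1.17.1 as stated in the maintainer reply further down.

```python
# Hypothetical preflight check: refuse to convert under an onnxruntime version
# known to produce a model that fails in the C# sample. 1.16.x worked; models
# converted under 1.17.0 threw OnnxRuntimeException; the fix is expected in 1.17.1.
def version_tuple(v: str) -> tuple:
    """Parse 'major.minor.patch' into a comparable tuple, padded to 3 parts."""
    parts = [int(p) for p in v.split(".")[:3]]
    return tuple(parts + [0] * (3 - len(parts)))

def is_known_good(installed: str) -> bool:
    v = version_tuple(installed)
    return v[:2] == (1, 16) or v >= (1, 17, 1)

print(is_known_good("1.16.3"))  # True: conversion under 1.16.x works
print(is_known_good("1.17.0"))  # False: produces the failing model
```

Pinning the conversion environment this way (rather than relying on whatever `pip install onnxruntime` resolves to) is what makes the workaround reproducible.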
Urgency
No response
Platform
Windows
OS Version
22H2
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.17
ONNX Runtime API
C#
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response
Thank you for raising this issue. We think we have identified the source of the problem, which will be fixed in the upcoming 1.17.1 patch release next week.
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
This doesn't seem to be a converter error, but a runtime one.