DnnlExecutionProvider is not visible in python API
Describe the bug
DnnlExecutionProvider is not visible in the Python API after compiling onnxruntime with it.
Creating an InferenceSession with it produces the warning: Specified provider 'DnnlExecutionProvider' is not in available provider names.
System information
- OS Platform and Distribution: Ubuntu 21.04
- ONNX Runtime installed from (source or binary): Source
- ONNX Runtime version: 1.10.0
- Python version: 3.8.8
To Reproduce
./build.sh --use_dnnl --build_wheel --build_shared_lib --config Release
cd build/Linux/Release/dist/
pip install onnxruntime_dnnl-1.11.0-cp38-cp38-linux_x86_64.whl
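A common cause of this symptom (an assumption here, not confirmed for this report) is that a stock CPU-only onnxruntime wheel is already installed in the same environment and shadows the DNNL build when Python imports the package. Checking for and removing any existing install before installing the wheel rules that out:
pip list | grep onnxruntime
pip uninstall -y onnxruntime onnxruntime-dnnl
pip install onnxruntime_dnnl-1.11.0-cp38-cp38-linux_x86_64.whl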
>>> import onnxruntime as rt
>>> rt.get_available_providers()
['CPUExecutionProvider']
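When only CPUExecutionProvider is reported, a quick diagnostic (a sketch using the standard onnxruntime Python API; the version and path printed will depend on the local environment) is to confirm which installation Python is actually importing and which providers were compiled into it:
>>> import onnxruntime as rt
>>> rt.__version__                # should match the wheel built above
>>> rt.__file__                   # shows which installed copy of onnxruntime is imported
>>> rt.get_all_providers()        # every provider ONNX Runtime knows about
>>> rt.get_available_providers()  # only the providers compiled into this build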
When trying to create an inference session:
>>> s = rt.InferenceSession("model.onnx", providers=["DnnlExecutionProvider"])
UserWarning: Specified provider 'DnnlExecutionProvider' is not in available provider names.Available providers: 'CPUExecutionProvider'
  warnings.warn("Specified provider '{}' is not in available provider names."
>>> s.get_providers()
['CPUExecutionProvider']
Expected behavior
['CPUExecutionProvider', 'DnnlExecutionProvider']
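For reference, the providers argument is an ordered priority list, and listing CPUExecutionProvider as an explicit fallback is the usual pattern. If the DNNL build were picked up correctly, session creation would be expected to look like this (a sketch reusing the model.onnx from the report above):
>>> s = rt.InferenceSession("model.onnx",
...                         providers=["DnnlExecutionProvider", "CPUExecutionProvider"])
>>> s.get_providers()
['DnnlExecutionProvider', 'CPUExecutionProvider']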
I have the same problem, did you solve it? Thanks.
Same problem here
Same problem
Same
Same
Has anyone solved this issue?
@magicaltoast did you manage to resolve the issue?