ONNX Windows build relies on MS Store python, fails with official python launcher
Bug Report
Description
I'm using a Windows Arm-based PC, and the protobuf compilation stage of the pip installation fails with a fatal error. The error indicates that the mypy plugin is failing because the python command isn't found. I believe this is because I'm using the official installer from python.org, which only installs the py executable, a launcher for Python. This is listed as the recommended option at docs.python.org/3/using/windows.html, with the Microsoft Store version of Python listed as an alternative.
Unfortunately the Store version of Python is still x86-based, so I can't use it on a Windows/Arm PC (technically it can run under emulation but having both installed causes other problems). This also seems like a common setup in the Windows world, regardless of architecture, so it would be nice to support it.
System information
- Windows 11, ARM
- ONNX version: 1.18
- Python version: 3.11 (Arm)
Reproduction instructions
git clone https://github.com/onnx/onnx
cd onnx
py setup.py install
This also currently happens with plain pip install onnx.
Expected behavior
I expect the wheel to be built and installed.
Actual behavior
The build fails with the following error messages (trimmed to the most relevant):
Building Custom Rule C:/Users/pete/onnx/CMakeLists.txt
Running C++ protocol buffer compiler on C:/Users/pete/onnx/.setuptools-cmake-build/onnx/onnx-ml.proto
Python was not found; run without arguments to install from the Microsoft Store, or disable this shortcut from Settings > Manage App Execution Aliases.
--mypy_out: protoc-gen-mypy: Plugin failed with status code 9009.
C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(254,5): error MSB8066: Custom build for 'C:\Users\pete\onnx\.setuptools-cmake-build\CMakeFiles\1cc123d7a88b38a07b5d702bea4bbab7\onnx-ml.proto.rule;C:\Users\pete\onnx\.setuptools-cmake-build\CMakeFiles\1cc123d7a88b38a07b5d702bea4bbab7\onnx-ml.pb.cc.rule;C:\Users\pete\onnx\.setuptools-cmake-build\CMakeFiles\2613932fd8912cf9addf99599c963206\gen_onnx_proto.rule;C:\Users\pete\onnx\CMakeLists.txt' exited with code 1. [C:\Users\pete\onnx\.setuptools-cmake-build\gen_onnx_proto.vcxproj]
Traceback (most recent call last):
...
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['C:\\Program Files\\CMake\\bin\\cmake.EXE', '--build', '.', '--config', 'Release', '--', '/maxcpucount:12']' returned non-zero exit status 1.
Notes
From my debugging, this is caused by the tools/protoc-gen-mypy.bat batch file calling the python command directly. I will be submitting a patch that tries both forms of the command, so that:
python -u "%~dp0\protoc-gen-mypy.py"
becomes
python -u "%~dp0\protoc-gen-mypy.py" || py -u "%~dp0\protoc-gen-mypy.py"
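The same "try python, fall back to py" idea can be expressed in Python itself. Below is a minimal sketch (the find_python function and its candidate list are my own illustration, not part of the patch), probing for whichever interpreter command actually resolves on PATH:

```python
import shutil

def find_python(candidates=("python", "py", "python3")):
    """Return the path of the first interpreter command from `candidates`
    that resolves on PATH (via shutil.which), or None if none do."""
    for cmd in candidates:
        path = shutil.which(cmd)
        if path:
            return path
    return None
```

On a python.org install that provides only the launcher, "python" fails to resolve but "py" does, which is exactly the case the `|| py ...` fallback in the batch file handles.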
I will submit this as a PR linking back to this issue shortly.
I think the solution should be https://github.com/onnx/onnx/pull/6096. Could you help push that forward?
Unfortunately I'm a first-time contributor to ONNX, and #6096 looks like a fairly complex change, so I can't take that on right now. Feel free to close this if you see that as the correct solution; at least this leaves a trail of breadcrumbs for anyone else who hits the same error.
I had something similar, maybe not the same. My build also failed with status code 9009:
>>>Python was not found<<<; run without arguments to install from the Microsoft Store, or disable
this shortcut from Settings > Manage App Execution Aliases.
--mypy_out: protoc-gen-mypy: Plugin failed with status code 9009.
- This happens if you are using embedded Python and the paths are not set (or they point to the wrong Python). The solution is to set the correct paths:
set PATH=H:\ComfyUI128\python_embeded;H:\ComfyUI128\python_embeded\Scripts;%PATH%
set PY_LIBS=H:\ComfyUI128\python_embeded\Scripts\Lib;H:\ComfyUI128\python_embeded\Scripts\Lib\site-packages
set PY_PIP=H:\ComfyUI128\python_embeded\Scripts
- Then another error appeared, complaining about long paths on Windows. The following did not help:
set TMP=C:\tmp
set TEMP=C:\tmp
The --no-build-isolation option helped; it makes pip build the package in the current environment rather than in a temporary isolated build environment, so the paths and packages set up above remain visible during the build.
Solution:
pip install onnx --no-cache-dir --no-build-isolation
Result with Python 3.13:
Successfully built onnx
Installing collected packages: onnx
Successfully installed onnx-1.17.0
But not all tests passed:
Details
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_averagepool_3d_dilations_large_count_include_pad_is_1_ceil_mode_is_True_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_axis_cpu - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_axis_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_axis_opset19_cpu - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_axis_opset19_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_cpu - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_inverse_cpu - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_inverse_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_inverse_opset19_cpu - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_inverse_opset19_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_opset19_cpu - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_dft_opset19_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_roialign_aligned_false_cuda - RuntimeError: Unable to create inference session. Model is:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_roialign_aligned_true_cuda - RuntimeError: Unable to create inference session. Model is:
FAILED test_backend_onnxruntime.py::OnnxBackendNodeModelTest::test_roialign_mode_max_cuda - RuntimeError: Unable to create inference session. Model is:
FAILED test_backend_onnxruntime.py::OnnxBackendPyTorchConvertedModelTest::test_Conv3d_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendPyTorchConvertedModelTest::test_Conv3d_dilated_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendPyTorchConvertedModelTest::test_Conv3d_dilated_strided_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendPyTorchConvertedModelTest::test_Conv3d_no_bias_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendPyTorchConvertedModelTest::test_Conv3d_stride_cuda - AssertionError:
FAILED test_backend_onnxruntime.py::OnnxBackendPyTorchConvertedModelTest::test_Conv3d_stride_padding_cuda - AssertionError:
=================================================== 22 failed, 6588 passed, 3878 skipped, 27 warnings in 80.39s (0:01:20)
Details
---------------------------------------------------------------------------- Captured stderr call ----------------------------------------------------------------------------
2025-04-04 01:06:23.4333584 [W:onnxruntime:, model.cc:214 onnxruntime::Model::Model] ONNX Runtime only guarantees support for models stamped with opset version 7 or above for opset domain 'ai.onnx'. Please upgrade your model to opset 7 or higher. For now, this opset 6 model may run depending upon legacy support of some older opset version operators.
2025-04-04 01:06:23.4341017 [W:onnxruntime:, transpose_optimizer.cc:37 onnxruntime::TransposeOptimizer::ApplyImpl] Transpose optimizer failed: Unsupported ONNX opset: 6
2025-04-04 01:06:23.4343513 [W:onnxruntime:, transpose_optimizer.cc:37 onnxruntime::TransposeOptimizer::ApplyImpl] Transpose optimizer failed: Unsupported ONNX opset: 6
___________________________________________________ OnnxBackendPyTorchConvertedModelTest.test_Conv3d_stride_padding_cuda ____________________________________________________
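The PATH fix earlier in this comment amounts to prepending the intended interpreter's directories so they win the executable lookup. A small sketch of that idea (the prepend_to_path function and the example env dict are my own illustration):

```python
import os

def prepend_to_path(dirs, env=None):
    """Prepend directories to PATH so executables there are found first.
    Operates on os.environ unless an explicit env mapping is given."""
    env = os.environ if env is None else env
    parts = list(dirs) + [p for p in env.get("PATH", "").split(os.pathsep) if p]
    env["PATH"] = os.pathsep.join(parts)
    return env["PATH"]

# Mirroring the set PATH=... step above (paths are the commenter's, illustrative):
# prepend_to_path([r"H:\ComfyUI128\python_embeded",
#                  r"H:\ComfyUI128\python_embeded\Scripts"])
```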
It's not clear what he wants.
Got this working; build time was around 30 minutes on Windows 11 Pro, on an Azure Standard E8ps v6 (8 vcpus, 64 GiB memory) VM:
(.venv) C:\Users\waheedbrown\workspaces\git\onnxruntime>.\build.bat --config RelWithDebInfo --build_shared_lib --parallel --compile_no_warning_as_error --skip_submodule_sync
...
8: [ OK ] RealCAPITestsFixture.CppApiORTCXXLOGF (0 ms)
8: [----------] 4 tests from RealCAPITestsFixture (1 ms total)
8:
8: [----------] 1 test from MockCAPITestsFixture
8: [ RUN ] MockCAPITestsFixture.CppLogMacroBypassCApiCall
8: [ OK ] MockCAPITestsFixture.CppLogMacroBypassCApiCall (0 ms)
8: [----------] 1 test from MockCAPITestsFixture (0 ms total)
8:
8: [----------] Global test environment tear-down
8: [==========] 5 tests from 2 test suites ran. (1 ms total)
8: [ PASSED ] 5 tests.
8/8 Test #8: onnxruntime_logging_apis_test ........... Passed 0.67 sec
100% tests passed, 0 tests failed out of 8
Total Test time (real) = 67.10 sec
2025-04-28 16:46:27,459 build [INFO] - Build complete
(.venv) C:\Users\waheedbrown\workspaces\git\onnxruntime>python --version
Python 3.10.1
- Run "ARM64 Native Tools Command Prompt for VS 2022" AS ADMINISTRATOR
- Download Python 3.10.1 Windows embeddable package (ARM64)
- Install get-pip.py, if you have issues with pip not working with embeddable Python
- Do the above environment variables step:
set PATH=C:\Users\waheedbrown\Downloads\python-3.10.1-embed-arm64;C:\Users\waheedbrown\Downloads\python-3.10.1-embed-arm64\Scripts;%PATH%
set PY_LIBS=C:\Users\waheedbrown\Downloads\python-3.10.1-embed-arm64\Lib;C:\Users\waheedbrown\Downloads\python-3.10.1-embed-arm64\Lib\site-packages
set PY_PIP=C:\Users\waheedbrown\Downloads\python-3.10.1-embed-arm64\Scripts
mkdir C:\tmp
set TMP=C:\tmp
set TEMP=C:\tmp
- Manually allow pip to find packages by editing the file \...\python-3.10.1-embed-arm64\python310._pth: change "#import site" to "import site"
- Create a virtualenv environment for doing your build:
pip install virtualenv
python -m virtualenv .venv
.venv\Scripts\activate
NOTE: I suspect this is a crucial step
- Build (for inferencing) following the official instructions for Windows:
.\build.bat --config RelWithDebInfo --build_shared_lib --parallel --compile_no_warning_as_error --skip_submodule_sync
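The ._pth edit in the steps above (uncommenting "import site" so the embedded distribution can see pip-installed packages) can also be automated. A minimal sketch, where the enable_site function is my own illustration rather than part of any official tooling:

```python
from pathlib import Path

def enable_site(pth_file):
    """Uncomment 'import site' in an embedded Python ._pth file so that
    site-packages (and therefore pip-installed packages) are importable."""
    p = Path(pth_file)
    p.write_text(p.read_text().replace("#import site", "import site"))
```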