
Python 3.12 wheel

Open neosr-project opened this issue 1 year ago • 6 comments

Hi. It looks like onnxsim from PyPI does not install when using Python 3.12. Are there any plans to support >3.11?

neosr-project avatar Jun 07 '24 20:06 neosr-project

+1

ogencoglu avatar Aug 15 '24 16:08 ogencoglu

+1

umarbutler avatar Aug 25 '24 09:08 umarbutler

I would recommend checking out OnnxSlim. I was able to install it on Python 3.12 without any issues.

umarbutler avatar Aug 25 '24 10:08 umarbutler

I successfully installed onnxsim==0.4.36 on Python 3.12.7:

pip install onnxsim==0.4.36

Puiching-Memory avatar Nov 05 '24 13:11 Puiching-Memory

This is probably an issue with older cmake version:

      CMake Error at CMakeLists.txt:1 (cmake_minimum_required):
        CMake 3.22 or higher is required.  You are running version 3.18.4

It does install for me with a newer cmake.
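
For reference, if you can't easily update the system CMake, one option is the cmake package from PyPI (a sketch only; the <4.0 pin matches later comments in this thread about CMake 4 breaking the build):

python3 -m pip install --upgrade "cmake>=3.22,<4.0"
python3 -m pip install onnxsim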

ryanli avatar Feb 03 '25 12:02 ryanli

+1 to having pre-built wheels instead of having to build from source which requires cmake

kevalmorabia97 avatar Mar 14 '25 05:03 kevalmorabia97

I'm running Python 3.13.2, and after updating my cmake to the latest 3.31.6, I installed onnxsim-0.4.36 successfully. I solved the problem using this method, thank you @ryanli!

This is probably an issue with older cmake version:

      CMake Error at CMakeLists.txt:1 (cmake_minimum_required):
        CMake 3.22 or higher is required.  You are running version 3.18.4

It does install for me with a newer cmake.

xiaoniaogangsi avatar Mar 23 '25 16:03 xiaoniaogangsi

onnxsim with python 3.12 seems to require cmake<4.0.

cc @daquexian will you release a wheel for python 3.12?

fxmarty-amd avatar May 22 '25 10:05 fxmarty-amd

On Python 3.12 and Ubuntu 24.04, this did it for me:

CMAKE_ARGS="-DCMAKE_POLICY_VERSION_MINIMUM=3.5" pip install --use-pep517 onnxsim

iongion avatar Jun 05 '25 15:06 iongion

Not exactly what everyone will want, but if you're comfortable using containers you can build this:

FROM fedora:42
# NOTE: Python 3.13 with CMake 3.31.6
RUN dnf install -yq pipx python3-devel cmake clang && dnf clean all
RUN pipx install onnxsim --preinstall onnxruntime

WORKDIR /models
ENTRYPOINT ["/root/.local/bin/onnxsim"]
# Took about 5 minutes to build for me (1.15GB image size):
docker build --tag localhost/onnxsim .

# Run the container with local models directory containing `original.onnx`,
# After processing you'll find a `simplified.onnx` file there too
docker run --rm -it --volume ./models:/models/:Z localhost/onnxsim original.onnx simplified.onnx

The Python used there is 3.13, but it has also worked well with Python 3.12. It's a bit different if installing from the git repo, I think, but for most users who just want a CLI tool to process a model, this approach works well.


Reducing image size from 1.15GB => 50MB

If the 1.15GB image size is a concern, you can switch to a multi-stage image. Build and run commands are the same but image size can be as low as ~50MB.

Fedora (152MB)
# syntax=docker/dockerfile:1

FROM fedora:42 AS builder
RUN dnf install -yq pipx python-devel cmake clang && dnf clean all
# Install `onnx-simplifier`:
RUN pipx install onnxsim --preinstall onnxruntime
# Package into a single executable via PyInstaller (43MB):
WORKDIR /opt/pyinstaller
RUN <<HEREDOC
  pipx inject onnxsim pyinstaller
  source /root/.local/share/pipx/venvs/onnxsim/bin/activate
  pyinstaller --onefile /root/.local/bin/onnxsim
HEREDOC

# Create a minimal Fedora base image:
# (109MB, or 187MB if adding `python` - but that is redundant when using PyInstaller)
FROM builder AS root-fs
RUN dnf --installroot /root-fs --use-host-config --setopt=install_weak_deps=0 install -yq \
  libstdc++ zlib

# Runtime image (152MB):
FROM scratch
COPY --link --from=root-fs /root-fs /
COPY --link --from=builder /opt/pyinstaller/dist/onnxsim /usr/local/bin/onnxsim
WORKDIR /models
ENTRYPOINT ["onnxsim"]
Ubuntu (51MB)
# syntax=docker/dockerfile:1

FROM fedora:42 AS builder
# This example shows how to target a specific version of Python:
# (important if needing to match the version for a runtime python image)
RUN dnf install -yq python3.12-devel cmake clang which && dnf clean all
# Install `onnx-simplifier`, then package into a single executable via PyInstaller (43MB):
WORKDIR /opt/pyinstaller
RUN <<HEREDOC
  python3.12 -m venv env
  source env/bin/activate
  python -m pip install onnxsim onnxruntime pyinstaller
  pyinstaller --onefile $(which onnxsim)
HEREDOC

# Custom chisel image until an official one is released:
FROM alpine AS chisel
ARG CHISEL_VERSION=1.1.0
ARG TARGETARCH
# NOTE: `--no-same-owner` used as `chisel` release has ownership of `1001:128`
RUN <<HEREDOC
  CHISEL_RELEASE="https://github.com/canonical/chisel/releases/download/v${CHISEL_VERSION}/chisel_v${CHISEL_VERSION}_linux_${TARGETARCH}.tar.gz"
  wget -qO - "${CHISEL_RELEASE}" | tar -xz --no-same-owner -C /usr/local/bin chisel
HEREDOC

# Use Canonical's Chisel tool to create a minimal Ubuntu base image (8MB /root-fs):
FROM chisel AS root-fs
WORKDIR /root-fs
RUN chisel cut --release ubuntu-24.04 --root /root-fs base-files_base libc6_libs libstdc++6_libs zlib1g_libs

# Runtime image (51MB):
FROM scratch
COPY --link --from=root-fs /root-fs /
COPY --link --from=builder /opt/pyinstaller/dist/onnxsim /usr/local/bin/onnxsim
WORKDIR /models
ENTRYPOINT ["onnxsim"]

Base image selection notes:

  • ubuntu/python:3.12-24.04 lacked libstdc++, while bundling Python via PyInstaller was incompatible with gcr.io/distroless/cc-debian12, which lacked libz.
  • Chisel lacks an official image at the moment, so preparing the minimal base image takes a slightly more verbose stage. Chisel is much more granular/optimal for base images, while Fedora is very simple and straightforward, with some added base weight as the drawback.

PyInstaller for reduced size

There might be better alternatives to PyInstaller, but it was a fairly simple way to bring the size down. Inspecting the output of pyinstaller --onedir, it's effectively all shared libs and CPython; any waste from the Python venv's site-packages (tests, docs, etc.) is gone, and 70MB alone was .pyc + .pyi files.

PyInstaller does bundle both the libstdc++ and libz libraries, so those technically aren't required in the base image if you use --onedir instead of --onefile.

  • However, you'd then need to run with an ENV like LD_LIBRARY_PATH=/opt/dist/onnxsim/_internal and run the executable at /opt/dist/onnxsim/onnxsim (see the sketch after this list). This also skips compression, which raises the on-disk size from 43MB to 109MB (101MB for _internal/ + 8MB for the onnxsim binary). The runtime overhead from decompression is minimal (1s vs 2s).
  • Doing so only enables reducing the Chisel base image size from 8MB down to 5MB. It seemed better to keep it simple with --onefile.
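
For completeness, a rough sketch of the --onedir variant (illustrative paths; the Dockerfiles above stick with --onefile):

pyinstaller --onedir /root/.local/bin/onnxsim
# dist/onnxsim/ then contains the onnxsim executable plus an _internal/ directory of shared libs.
# In a scratch-style image you would run it roughly like this:
LD_LIBRARY_PATH=/opt/dist/onnxsim/_internal /opt/dist/onnxsim/onnxsim original.onnx simplified.onnx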

The only other portability concern is that glibc from the builder image could eventually be too new for a different runtime base image like Chisel. If that happens, just build on an older Fedora release instead.

polarathene avatar Jun 10 '25 23:06 polarathene

Unfortunately, the project has been left untouched for just over a year. At present, there are no pre-built wheels for Python 3.12 or 3.13, so installing it—especially inside Docker—requires a build toolchain and a fair bit of compile time. (related: https://github.com/daquexian/onnx-simplifier/pull/353 , https://github.com/daquexian/onnx-simplifier/pull/359) Pre-built wheels for Linux aarch64 have never been provided, either. (related: https://github.com/daquexian/onnx-simplifier/pull/328)

Because I was tired of waiting for a full compile each time I installed the dependencies, I published a fork called onnxsim-prebuilt on PyPI (https://pypi.org/project/onnxsim-prebuilt/). GitHub: https://github.com/tsukumijima/onnx-simplifier-prebuilt

Although it began as a personal solution, I’ve uploaded wheels for Python 3.10, 3.11, 3.12, and 3.13 covering Windows x64, macOS universal2, and Linux x86_64 / aarch64. Specifically, after incorporating the suggestions in https://github.com/daquexian/onnx-simplifier/pull/359, we fixed more CI errors, added a Linux aarch64 build, and improved the tests to pass on all versions except macOS x64.

Until the upstream project is updated, you might find this package useful. Aside from updating the CI scripts and bumping the onnx-optimizer and onnxruntime dependencies, it is identical to the original repository.
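
For anyone who just wants to try it, installation and usage follow the usual onnxsim flow (a quick sketch with placeholder file names; since the fork is otherwise identical to upstream, the usual onnxsim CLI should work unchanged):

pip install onnxsim-prebuilt
onnxsim input_model.onnx output_model.onnx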

tsukumijima avatar Jul 02 '25 12:07 tsukumijima

+1, I need it……

FengJungle avatar Jul 05 '25 02:07 FengJungle

Unfortunately, the project has been left untouched for just over a year. At present, there are no pre-built wheels for Python 3.12 or 3.13, so installing it—especially inside Docker—requires a build toolchain and a fair bit of compile time. (related: #353 , #359) Pre-built wheels for Linux aarch64 have never been provided, either. (related: #328)

Because I was tired of waiting for a full compile each time I installed the dependencies, I published a fork called onnxsim-prebuilt on PyPI (https://pypi.org/project/onnxsim-prebuilt/). GitHub: https://github.com/tsukumijima/onnx-simplifier-prebuilt

Although it began as a personal solution, I’ve uploaded wheels for Python 3.10, 3.11, 3.12, and 3.13 covering Windows x64, macOS universal2, and Linux x86_64 / aarch64. Specifically, after incorporating the suggestions in #359, we fixed more CI errors, added a Linux aarch64 build, and improved the tests to pass on all versions except macOS x64.

Until the upstream project is updated, you might find this package useful. Aside from updating the CI scripts and bumping the onnx-optimizer and onnxruntime dependencies, it is identical to the original repository.

Indeed, I unfortunately have no time to maintain onnxsim actively. Thanks a lot for your contribution!

Maybe we can cooperate to make these prebuilt wheels available via pip install onnxsim. May I have your e-mail address so that we can communicate seamlessly?

daquexian avatar Jul 05 '25 02:07 daquexian

@daquexian I honestly didn't expect to hear from you. Thank you.

My email address is tsukumizima<at mark>gmail.com.

My changes are mostly limited to CI adjustments and dependency updates (based on improvements made in other pull requests), and I've hardly touched the core code. In fact, I don't have a full understanding of the onnxsim codebase; I only made some CI fixes, so I'm not in a position to take over long-term maintenance.

That said, as long as the current codebase continues to work, I think I can help with keeping it compatible with new Python releases and similar tasks.

I can also create a pull request to your repository based on the changes I made in onnx-simplifier-prebuilt. What would be the best way to proceed?

tsukumijima avatar Jul 05 '25 12:07 tsukumijima

@daquexian @tsukumijima any updates on getting the changes merged to current repository?

kevalmorabia97 avatar Aug 04 '25 08:08 kevalmorabia97

@kevalmorabia97 Unfortunately, I have not received any contact from @daquexian at this time, and there has been no particular progress. (It is possible that the email did not reach me for some reason.)

tsukumijima avatar Aug 04 '25 08:08 tsukumijima

@kevalmorabia97 Unfortunately, I have not received any contact from @daquexian at this time, and there has been no particular progress. (It is possible that the email did not reach me for some reason.)

My apologies, that was my fault. I have been very busy and overlooked sending the email. I have just sent it now, so please check your inbox. I'm sorry again for the delay.

daquexian avatar Aug 04 '25 08:08 daquexian

+1

thiagocrepaldi avatar Aug 14 '25 18:08 thiagocrepaldi

Why not give onnxslim a try? onnxslim is a fully Python-based ONNX optimizer that requires no compilation, making it easier to debug. It has already been adopted by repositories such as Ultralytics, Transformers, and Optimum. @tsukumijima @kevalmorabia97 @thiagocrepaldi @inisis
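
A minimal sketch of trying it out (assuming its CLI takes input and output model paths much like onnxsim; file names here are placeholders):

pip install onnxslim
onnxslim your_model.onnx slim_model.onnx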

initialencounter avatar Aug 15 '25 03:08 initialencounter

I was able to install onnxsim==0.4.36 with python 3.12 like this:

python3 -m pip install --no-cache-dir "cmake<4.0" && python3 -m pip install --no-cache-dir "onnxsim==0.4.36"

koolvn avatar Oct 13 '25 16:10 koolvn

Thank you!

initialencounter avatar Oct 13 '25 16:10 initialencounter

Build fails because of a low CMake version in third_party/onnx-optimizer/third_party/onnx/CMakeLists.txt

Need to bump onnx-optimizer to get a newer onnx:

https://github.com/onnx/optimizer/pull/201

develOseven avatar Oct 18 '25 19:10 develOseven

@daquexian do you think you could give someone else write access to this repository so we can enable Python 3.12+ wheel builds and publishing from CI/CD? It should be a fairly simple fix that needs to be merged, followed by a new release, which will build pre-compiled 3.12+ wheels.

kevalmorabia97 avatar Oct 20 '25 17:10 kevalmorabia97

Actually, I have already received write permission to the repository from the maintainer, but I have left it alone because I've been able to get by with the onnx-simplifier-prebuilt fork I created myself. I'll try to update it when I have time.

tsukumijima avatar Oct 20 '25 22:10 tsukumijima