
If the model uses initializers, fuse_bn_into_conv is broken from v0.3.9

Open sa-kei728 opened this issue 2 years ago • 5 comments

I tried to optimize a model with "fuse_bn_into_conv", and from v0.3.9 the optimized model is generated incorrectly. I would say 807cff7 is the cause: the commit works fine when the BN parameters come from Constant nodes, but it no longer handles initializers. Why were these changes made? I'd like initializers to be supported as well, since onnxoptimizer provides "extract_constant_to_initializer", which I often use to avoid errors related to topological sorting. At the very least, I would prefer not to have this change.
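For context, the fusion itself only needs the BN parameters as concrete tensors, so whether they come from Constant nodes or initializers should not matter mathematically. Roughly, the folding looks like this (a numpy sketch for illustration only, not the optimizer's actual code):

import numpy as np

# Fold y = scale * (conv(x, W) + b - mean) / sqrt(var + eps) + bias
# into a single Conv with adjusted weights and bias.
def fold_bn_into_conv(W, b, scale, bias, mean, var, eps=1e-5):
    std = np.sqrt(var + eps)
    W_new = W * (scale / std).reshape(-1, 1, 1, 1)  # scale each output channel
    b_new = (b - mean) * scale / std + bias
    return W_new, b_new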

Confirmed environment (part of pyproject.toml)

[tool.poetry.dependencies]
python = "3.11"
onnx = "1.13.1"
onnxoptimizer = "0.3.9"

Steps to reproduce

  1. Download tiny-yolov3-11.onnx (this model uses initializers): https://github.com/onnx/models/blob/main/vision/object_detection_segmentation/tiny-yolov3/model/tiny-yolov3-11.onnx

  2. Install onnx and onnxoptimizer (versions above).

  3. Run the reproduction code below.

import onnx
from onnxoptimizer import optimize

# load model
model_path = "tiny-yolov3-11.onnx"
model = onnx.load(model_path)

# setting optimize passes
optimization_passes = ["fuse_bn_into_conv"]

# execute optimization
optimized_model = optimize(model, optimization_passes)

# save model
optimized_model_path = "tiny-yolov3-11-optimized.onnx"
onnx.save(optimized_model, optimized_model_path)
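To see the problem, it may help to confirm that the BN parameters in this model are graph initializers (not Constant nodes) and to count how many BatchNormalization nodes survive the pass. A quick check along these lines (assuming the file names above):

import onnx

original = onnx.load("tiny-yolov3-11.onnx")
optimized = onnx.load("tiny-yolov3-11-optimized.onnx")

# the BN parameter inputs (scale, B, mean, var) are initializers in this model
init_names = {init.name for init in original.graph.initializer}
for node in original.graph.node:
    if node.op_type == "BatchNormalization":
        print(node.name, all(i in init_names for i in node.input[1:]))

# after a successful fusion, far fewer (ideally no) BatchNormalization nodes remain
remaining = [n for n in optimized.graph.node if n.op_type == "BatchNormalization"]
print("BatchNormalization nodes remaining:", len(remaining))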

With v0.3.9:

[screenshot: v0_3_9]

With v0.3.8:

[screenshot: v0_3_8]

sa-kei728 avatar Apr 18 '23 10:04 sa-kei728

Thanks for your issue! We'll consider how to fix it. Could you please try using onnxsim for now?

daquexian avatar Apr 19 '23 08:04 daquexian

Thank you for the quick response! I tried onnxsim on both the original and the optimized model, and the correctly fused model is now created!
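For reference, the onnxsim call is essentially the following (a minimal sketch; the output path is arbitrary, and depending on the onnxsim version, models with dynamic input shapes may need extra arguments):

import onnx
from onnxsim import simplify

model = onnx.load("tiny-yolov3-11.onnx")
model_simplified, check = simplify(model)  # returns (model, success flag)
assert check, "simplified model could not be validated"
onnx.save(model_simplified, "tiny-yolov3-11-sim.onnx")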

sa-kei728 avatar Apr 22 '23 01:04 sa-kei728

I found another case: if the Conv has a "bias" term, fuse_bn_into_conv seems to fail from v0.3.9. This case is not resolved by onnxsim.

In general such models are rare, but the tool I was using generates an ONNX model that temporarily includes the bias term in the Conv and then runs fuse_bn_into_conv, which fails in the optimizer after v0.3.9. (I'd like to share this model and tool with you, but I cannot do so for some reasons. Sorry...)
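As a stand-in for the model I cannot share, a minimal synthetic graph with the same pattern (a Conv carrying a bias initializer followed directly by a BatchNormalization; shapes and values below are arbitrary) can be built like this, and the result inspected to see whether the BN was folded correctly:

import numpy as np
import onnx
from onnx import TensorProto, helper, numpy_helper
from onnxoptimizer import optimize

# tiny synthetic Conv(+bias) -> BatchNormalization graph, all parameters as initializers
C = 4
def init(name, arr):
    return numpy_helper.from_array(arr.astype(np.float32), name=name)

graph = helper.make_graph(
    nodes=[
        helper.make_node("Conv", ["x", "W", "b"], ["conv_out"],
                         kernel_shape=[3, 3], pads=[1, 1, 1, 1]),
        helper.make_node("BatchNormalization",
                         ["conv_out", "scale", "bn_b", "mean", "var"], ["y"]),
    ],
    name="conv_bias_bn",
    inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, C, 8, 8])],
    outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, C, 8, 8])],
    initializer=[
        init("W", np.random.randn(C, C, 3, 3)),
        init("b", np.random.randn(C)),
        init("scale", np.random.rand(C) + 0.5),
        init("bn_b", np.random.randn(C)),
        init("mean", np.random.randn(C)),
        init("var", np.random.rand(C) + 0.5),
    ],
)
model = helper.make_model(graph)
onnx.checker.check_model(model)

fused = optimize(model, ["fuse_bn_into_conv"])
print([n.op_type for n in fused.graph.node])  # check whether the BN was folded away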

There is a workaround for this case, so it's not a problem for me right now; I'm commenting just to share the information.

sa-kei728 avatar Apr 22 '23 03:04 sa-kei728

@sa-kei728 What is the workaround you used?

AndreyOrb avatar Jan 13 '24 19:01 AndreyOrb

@AndreyOrb In my case, I downgraded to v0.3.8.

I would say onnx2json could also resolve this issue: you could remove the bias term from the target Conv node and append an Add node after the Conv. (However, I haven't tried it, so I can't guarantee it works.)
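In case it's useful, here is a rough, untested sketch of that rewrite done directly on the protobuf (file names and the "_nobias"/"_chw" suffixes are just placeholders):

import onnx
from onnx import helper, numpy_helper

model = onnx.load("model_with_conv_bias.onnx")  # placeholder file name
graph = model.graph
inits = {init.name: init for init in graph.initializer}

i = 0
while i < len(graph.node):
    node = graph.node[i]
    if node.op_type == "Conv" and len(node.input) == 3 and node.input[2] in inits:
        bias_name = node.input[2]
        bias = numpy_helper.to_array(inits[bias_name])
        # reshape (C,) -> (C, 1, 1) so the Add broadcasts over the channel axis
        bias_chw = numpy_helper.from_array(bias.reshape(-1, 1, 1),
                                           name=bias_name + "_chw")
        graph.initializer.append(bias_chw)

        conv_out = node.output[0]
        node.output[0] = conv_out + "_nobias"   # Conv now produces a temporary output
        del node.input[2]                       # drop the bias input from the Conv
        # re-apply the bias with an explicit Add right after the Conv
        add_node = helper.make_node("Add", [node.output[0], bias_chw.name],
                                    [conv_out])
        graph.node.insert(i + 1, add_node)
        i += 1  # skip the Add we just inserted
    i += 1

onnx.save(model, "model_bias_split.onnx")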

sa-kei728 avatar Jan 21 '24 14:01 sa-kei728