
Changed behavior after BatchNormToAffine transformation

Open · auphelia opened this issue on Mar 29, 2023 · 0 comments

Quick summary

A standalone BatchNormalization node with certain settings (see the .onnx file in the attached bn_model.zip) changes its functional behavior when the BatchNormToAffine transformation is applied to it.

Steps to Reproduce

  1. The issue was observed inside the FINN docker container, but with the current main branch of qonnx checked out (commit hash: 12c96a3ded06beacab08e0f554e4ed014476c0aa).
  2. Run the BatchNormToAffine transformation on the attached ONNX file.
  3. Execute the model before and after the transformation with a random floating point input (x = gen_finn_dt_tensor(DataType["FLOAT32"], (1, 64, 64, 64)); inp_dict = {"global_in": x}).
  4. Compare the outputs of the model before and after the transformation (see the sketch after this list).
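For reference, a minimal reproduction sketch along the lines of steps 2–4. It assumes the qonnx Python API (ModelWrapper, execute_onnx, BatchNormToAffine, gen_finn_dt_tensor); the output tensor name is read from the graph rather than assumed.

```python
import numpy as np

from qonnx.core.datatype import DataType
from qonnx.core.modelwrapper import ModelWrapper
from qonnx.core.onnx_exec import execute_onnx
from qonnx.transformation.batchnorm_to_affine import BatchNormToAffine
from qonnx.util.basic import gen_finn_dt_tensor

# load the attached model (bn_model.onnx from bn_model.zip)
model = ModelWrapper("bn_model.onnx")
oname = model.graph.output[0].name

# random float32 input with the shape used in the report
x = gen_finn_dt_tensor(DataType["FLOAT32"], (1, 64, 64, 64))
inp_dict = {"global_in": x}

# execute the model before the transformation
out_before = execute_onnx(model, inp_dict)[oname]

# apply BatchNormToAffine and execute again
model_t = model.transform(BatchNormToAffine())
out_after = execute_onnx(model_t, inp_dict)[oname]

# in the reported case, this comparison fails
print("outputs match:", np.allclose(out_before, out_after))
```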

Expected behavior

The functional behavior should not change due to the transformation; the outputs before and after should match.

Actual behavior

The outputs before and after the transformation do not match.

Possible fix

It seems to be a rounding error coming from this calculation in the transformation: A = scale / np.sqrt(epsilon + variance)
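For illustration, a small numpy sketch of why the reordered computation can drift in float32. The affine form below (with B = bias - A * mean) is an assumption about what the transformation produces, used only to show the rounding effect; it is not the qonnx implementation itself.

```python
import numpy as np

rng = np.random.default_rng(0)
param_shape = (1, 64, 1, 1)
x = rng.standard_normal((1, 64, 64, 64)).astype(np.float32)
scale = rng.standard_normal(param_shape).astype(np.float32)
bias = rng.standard_normal(param_shape).astype(np.float32)
mean = rng.standard_normal(param_shape).astype(np.float32)
variance = rng.random(param_shape).astype(np.float32)
epsilon = np.float32(1e-5)

# BatchNormalization as the ONNX node computes it
bn_out = scale * (x - mean) / np.sqrt(variance + epsilon) + bias

# affine form: y = A*x + B with precomputed constants (assumed form)
A = scale / np.sqrt(epsilon + variance)
B = bias - A * mean
affine_out = A * x + B

# the reassociated float32 arithmetic introduces small differences
print("max abs difference:", np.max(np.abs(bn_out - affine_out)))
```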
