
Operator combinations in model structures

Open zhihaofan opened this issue 8 months ago • 0 comments

Hello, I am using AIMET for QAT, but when I call fold_all_batch_norms I see a large accuracy loss between the model before and after the fold. When I tried ./Examples/torch/quantization/qat.ipynb, fold_all_batch_norms also produced differences before and after folding, but there the results were acceptable.
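For reference, this is roughly how I am measuring the difference. A minimal sketch: the Sequential model, shapes, and variable names are stand-ins for my real network, not the actual code.

```python
import copy

import torch
import torch.nn as nn
from aimet_torch.batch_norm_fold import fold_all_batch_norms

# Minimal stand-in for the real network: Conv -> BatchNorm -> ReLU.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.BatchNorm2d(8),
    nn.ReLU(),
).eval()
reference = copy.deepcopy(model)

input_shape = (1, 3, 32, 32)
dummy_input = torch.randn(*input_shape)

# Fold each BatchNorm into the Conv/Linear layer it is matched with.
fold_all_batch_norms(model, input_shapes=input_shape)

# In eval() mode the fold should be numerically near-equivalent;
# a large gap here would point at a mismatched Conv-BN pairing
# (or a model still in train() mode, where running stats are not used).
with torch.no_grad():
    diff = (model(dummy_input) - reference(dummy_input)).abs().max()
print(f"max abs output difference after folding: {diff.item():.2e}")
```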

Both models are built from Conv, BatchNorm, and ReLU layers with residual connections. Is there a guide describing which operator patterns fold_all_batch_norms can match? I suspect that an incorrect Conv/BatchNorm pairing somewhere is causing the problem.
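If I read the API correctly, fold_all_batch_norms returns the list of (layer, BatchNorm) pairs it folded, so something like this sketch (run on a fresh, unfolded copy of the model) should show which pairings it actually found:

```python
# `model` and `input_shape` as above; the returned pairs can be
# compared against the Conv-BN pairings you expect in your network.
pairs = fold_all_batch_norms(model, input_shapes=input_shape)
for layer, bn in pairs:
    print(type(layer).__name__, "<-", type(bn).__name__)
```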

zhihaofan Jun 14 '24 10:06