Paddle
[PIR][oneDNN] Fix conv_bias_fuse_pass
PR Category
Others
PR Types
Bug fixes
Description
The current constraints of this pass are not sufficient, so it fuses some ops inappropriately (for example, fusing a multi-dimensional bias into conv, after which the fused op fails the dimension check). Hence we add a new constraint to avoid such situations.
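For context, here is a minimal sketch (not the actual pass code; the function name and shape handling are illustrative) of the kind of check the added constraint performs: the tensor being added is only treated as a fusable bias when it has at most one non-unit axis and its element count matches the conv's output-channel count.

```cpp
// Illustrative only: a standalone helper mirroring the intent of the added
// constraint. In the real pass this check would be expressed against the
// operand shapes inside the pattern's constraint callback.
#include <cstdint>
#include <vector>

bool IsFusableConvBias(const std::vector<int64_t>& bias_shape,
                       int64_t out_channels) {
  int non_unit_dims = 0;   // axes with length > 1
  int64_t numel = 1;       // total number of bias elements
  for (int64_t d : bias_shape) {
    if (d != 1) ++non_unit_dims;
    numel *= d;
  }
  // A genuinely multi-dimensional bias (more than one non-unit axis) cannot
  // be folded into conv's 1-D bias input, and the element count must equal
  // the number of output channels.
  return non_unit_dims <= 1 && numel == out_channels;
}
```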
Your PR has been submitted. Thanks for your contribution! Please wait for the CI results first. See the Paddle CI Manual for details.
Hi, I have some questions about ConvBiasFusePass. With the previous (non-PIR) pass, I ran ResNet50 and found the pattern was not fused by the conv_bias pass but was fused by the conv_elementwise_add pass. However, if I run ResNet50 with the PIR pass, it is fused by the conv_bias pass. Your fix tightens the constraint so it is no longer fused by the conv_bias pass, but I notice that after your fix it is still not fused by the conv_elementwise_add pass. Could you check this? Why do conv_bias / conv_elementwise_add behave differently from the previous non-PIR strategy? I'm not sure whether our implementation under PIR is actually different from, or wrong compared to, the previous one.
Since I only added constraints to conv_bias_fuse_pass, it should not block fusion in conv_elementwise_add_pass. Did you re-run it with the bias pass disabled? I suspect the result will still be the same (the add pass does not fuse). But yes, I will check whether there is any difference between the PIR pass and the fluid pass for conv_elementwise_add_pass in a follow-up issue, thanks~
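For reference, this is roughly how the ResNet50 experiment could be re-run with the bias fusion disabled through the C++ inference config; the model paths are placeholders, and the registered pass name and the way PIR passes are toggled are assumptions here that may differ in your setup.

```cpp
#include "paddle_inference_api.h"

int main() {
  // Placeholder model paths for an exported ResNet50 inference model.
  paddle_infer::Config config("resnet50/inference.pdmodel",
                              "resnet50/inference.pdiparams");
  config.EnableMKLDNN();
  // Drop the conv+bias fusion so we can observe whether the pattern is then
  // picked up by the elementwise-add fusion instead. The registered pass
  // name may differ between the PIR and fluid pipelines.
  config.pass_builder()->DeletePass("conv_bias_fuse_pass");
  auto predictor = paddle_infer::CreatePredictor(config);
  // ... run inference and inspect the optimized program / profiler output ...
  return 0;
}
```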