SMU_pytorch
A PyTorch implementation of SMU: Smooth Activation Function for Deep Networks Using Smoothing Maximum Technique
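For reference, a minimal sketch of the SMU activation as described in the paper (arXiv:2111.04682): an erf-based smooth maximum with a fixed alpha and a trainable mu. The defaults below (alpha = 0.25, mu initialized to 1e6) are taken from the paper and are assumptions here; they may differ from this repository's current values.

```python
import torch
import torch.nn as nn

class SMU(nn.Module):
    """Sketch of SMU(x) = ((1+a)x + (1-a)x * erf(mu*(1-a)*x)) / 2."""

    def __init__(self, alpha: float = 0.25, mu: float = 1e6):
        super().__init__()
        self.alpha = alpha
        # mu is trainable; the paper initializes it to a large value for SMU
        self.mu = nn.Parameter(torch.tensor(mu))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return ((1 + self.alpha) * x
                + (1 - self.alpha) * x * torch.erf(self.mu * (1 - self.alpha) * x)) / 2
```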
@iFe1er Hi, I am the author of the paper, and I have updated it. Please update the parameters in your repository, and people using your code should do the same. I...
I use SMU instead of SiLU in YOLOv5, but the loss comes out as NaN. Could you tell me the possible reason? Or maybe it's normal that this happened in previous...
Initial value of mu
How should the initial value of mu in SMU be determined? Does it have to be 1000000?
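For context, a hedged sketch: in the paper mu is a trainable parameter, so 1000000 is only an initialization for the erf-based SMU, not a fixed constant; the optimizer updates it along with the network weights. The SMU-1 initialization below is an illustrative placeholder, not the paper's exact constant.

```python
import torch
import torch.nn as nn

# mu as a learnable scalar: 1e6 is the paper's reported starting point for
# the erf-based SMU, and it changes during training.
smu_mu = nn.Parameter(torch.tensor(1_000_000.0))

# SMU-1 starts mu from a much smaller positive value (assumed placeholder).
smu1_mu = nn.Parameter(torch.tensor(1e-8))
```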
Using SMU
Hello! When using the SMU activation function, should I use just one of smu or smu1 on its own, or apply smu1 after smu, e.g. x = smu1(smu(x))?
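For illustration, a hedged usage sketch: the paper presents SMU and SMU-1 as two alternative activation functions, so a model would typically use one of them as a drop-in replacement for an activation such as SiLU or ReLU, rather than composing them as smu1(smu(x)). This reuses the SMU class from the sketch above.

```python
import torch
import torch.nn as nn

# Hypothetical tiny block using SMU as a drop-in activation; SMU1 would be
# used instead of SMU, not applied after it.
model = nn.Sequential(
    nn.Linear(16, 32),
    SMU(),
    nn.Linear(32, 1),
)

x = torch.randn(4, 16)
print(model(x).shape)  # torch.Size([4, 1])
```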
Hello, great work! Can SMU be converted to TensorRT? Thanks.