SMU_pytorch
use SMU
Hello! When using the SMU activation functions, should I use only one of SMU or SMU-1, or should I apply SMU-1 after SMU, e.g. x = smu1(smu(x))?
@HanAccount, First, fix a network architecture with any standard activation function (for example ReLU), then replace all the activation functions in the network with SMU or SMU-1 to see the effect on the network performance.
Use either SMU or SMU-1. Like SMU(x) or SMU-1(x).
Do not use SMU-1(SMU(x)). If you compose them this way, training may diverge or performance may be much worse than ReLU.
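The suggested workflow (fix the architecture, then swap every ReLU for SMU) can be sketched in PyTorch as follows. This is a minimal sketch, not code from this repo: the `SMU` module assumes the erf-based formula from the SMU paper, f(x) = ((1 + α)x + (1 − α)x · erf(μ(1 − α)x)) / 2, with trainable μ, and the helper `replace_relu_with_smu` is a hypothetical name introduced here for illustration.

```python
import torch
import torch.nn as nn

class SMU(nn.Module):
    # Sketch of SMU, assuming the formula from the SMU paper:
    # f(x) = ((1 + alpha) * x + (1 - alpha) * x * erf(mu * (1 - alpha) * x)) / 2
    # alpha is a fixed hyperparameter; mu is learned during training.
    def __init__(self, alpha: float = 0.25, mu: float = 1.0):
        super().__init__()
        self.alpha = alpha
        self.mu = nn.Parameter(torch.tensor(mu))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a = self.alpha
        return ((1 + a) * x + (1 - a) * x * torch.erf(self.mu * (1 - a) * x)) / 2

def replace_relu_with_smu(model: nn.Module) -> nn.Module:
    # Hypothetical helper: recursively swap every nn.ReLU for SMU,
    # leaving the rest of the architecture untouched.
    for name, child in model.named_children():
        if isinstance(child, nn.ReLU):
            setattr(model, name, SMU())
        else:
            replace_relu_with_smu(child)
    return model

# Usage: start from a ReLU network, then replace the activations.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4), nn.ReLU())
model = replace_relu_with_smu(model)
out = model(torch.randn(2, 8))
print(out.shape)  # torch.Size([2, 4])
```

Note that each SMU layer is used on its own, exactly as advised above; none of them are nested as SMU-1(SMU(x)).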
thanks!