
About using MPELU in PyTorch

yangkun-2020 opened this issue 3 years ago • 2 comments

I wonder if this Torch version of the code can run in PyTorch? I use PyTorch, and I don't know how to use the module you wrote there.

yangkun-2020 · Dec 13 '21 08:12

Sorry, the code is currently for Caffe and Torch; I haven't implemented a PyTorch version yet. MPELU can be factorized into ELU and PReLU, so it is straightforward to write a (less efficient) version in pure PyTorch. An efficient C++ implementation has turned out to be nontrivial; I will update the code when it's done.
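
For concreteness, a minimal pure-PyTorch sketch of that factorization could apply PReLU (which supplies the learnable alpha) on top of a beta-parameterized ELU, letting autograd handle the backward pass. The module name, per-channel parameter shapes, and NCHW input layout below are assumptions for illustration, not the repository's code:

```python
import torch
import torch.nn as nn

class MPELU(nn.Module):
    """MPELU sketch: f(x) = x for x > 0, alpha * (exp(beta * x) - 1) for x <= 0,
    realized as PReLU (learnable alpha) applied to a beta-parameterized ELU.
    Assumes 4D NCHW input with per-channel parameters.
    """

    def __init__(self, num_channels: int):
        super().__init__()
        # learnable exponent scale; init 1.0 recovers the plain ELU shape
        self.beta = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        # PReLU supplies the learnable alpha on the negative half
        self.prelu = nn.PReLU(num_channels, init=1.0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # clamp before exp so the unselected where-branch stays finite;
        # exp of a large positive input is a common source of NaN gradients
        neg = torch.exp(self.beta * torch.clamp(x, max=0.0)) - 1
        elu = torch.where(x > 0, x, neg)
        return self.prelu(elu)
```

Usage would be, e.g., `act = MPELU(64); y = act(torch.randn(8, 64, 32, 32))`. This version materializes several intermediate tensors, which is exactly why a fused C++/CUDA kernel would be faster and lighter on memory.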

Coldmooon · Dec 14 '21 02:12

Thank you for your reply! Yes, I have implemented similar parametric activation functions in PyTorch by inheriting from torch.autograd.Function. One obvious problem is that the computation is very slow and uses a lot of memory. Moreover, when I tested it in a network, the loss was always NaN. I suspect there is a bug in my backward-pass gradient code, which is why I wanted the original implementation.
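
For reference, a hand-written backward via torch.autograd.Function might look like the rough sketch below. The per-channel parameter shape (1, C, 1, 1) and NCHW layout are assumptions; the derivatives follow directly from the MPELU definition (on the negative side, df/dx = alpha * beta * exp(beta*x), df/dalpha = exp(beta*x) - 1, df/dbeta = alpha * x * exp(beta*x)):

```python
import torch

class MPELUFunction(torch.autograd.Function):
    """Rough MPELU sketch with an explicit backward; assumes 4D NCHW input
    and per-channel alpha/beta of shape (1, C, 1, 1)."""

    @staticmethod
    def forward(ctx, x, alpha, beta):
        ctx.save_for_backward(x, alpha, beta)
        # clamp before exp: avoids overflow for large positive x,
        # a common source of NaN losses
        exp_bx = torch.exp(beta * torch.clamp(x, max=0.0))
        return torch.where(x > 0, x, alpha * (exp_bx - 1))

    @staticmethod
    def backward(ctx, grad_out):
        x, alpha, beta = ctx.saved_tensors
        mask = x > 0
        exp_bx = torch.exp(beta * torch.clamp(x, max=0.0))
        # df/dx = 1 (x > 0), alpha * beta * exp(beta*x) (x <= 0)
        grad_x = grad_out * torch.where(
            mask, torch.ones_like(x), alpha * beta * exp_bx)
        # df/dalpha = 0 (x > 0), exp(beta*x) - 1 (x <= 0);
        # sum over all dims except channel to match the parameter shape
        grad_alpha = (grad_out * torch.where(
            mask, torch.zeros_like(x), exp_bx - 1)).sum(dim=(0, 2, 3), keepdim=True)
        # df/dbeta = 0 (x > 0), alpha * x * exp(beta*x) (x <= 0)
        grad_beta = (grad_out * torch.where(
            mask, torch.zeros_like(x), alpha * x * exp_bx)).sum(dim=(0, 2, 3), keepdim=True)
        return grad_x, grad_alpha, grad_beta
```

Wrapping this in an nn.Module that holds alpha and beta as nn.Parameter gives a drop-in layer. Checking such a backward with torch.autograd.gradcheck on small double-precision inputs (away from x = 0) is a quick way to locate the kind of gradient bug that produces NaN losses.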

yangkun-2020 · Dec 14 '21 02:12

MPELU now supports PyTorch.

Coldmooon · Oct 26 '23 13:10