
Adding Gaussian Error Linear Unit to neural network activation functions

Open ParamThakkar123 opened this issue 2 years ago • 1 comment

What would you like to share?

I would like to contribute an implementation of the Gaussian Error Linear Unit (GELU) activation function to this repository, under the neural network activation functions folder.

Additional information

No response

ParamThakkar123 · Dec 09 '23

We already have an implementation of the GELU function in maths/gaussian_error_linear_unit.py. I'm aware that the file should be placed in the neural_networks/activation_functions directory, but we don't want duplicate implementations of a single algorithm. Please consider moving the existing file instead of contributing your own implementation.
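For context, GELU weights its input by the standard normal CDF, GELU(x) = x · Φ(x); a tanh-based approximation is also common in practice. The sketch below illustrates both forms (it is not the repository's `maths/gaussian_error_linear_unit.py` implementation, just a minimal standalone version):

```python
import math


def gelu(x: float) -> float:
    """Exact GELU: x * Phi(x), where Phi is the standard normal CDF."""
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))


def gelu_tanh(x: float) -> float:
    """Widely used tanh approximation of GELU."""
    inner = math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)
    return 0.5 * x * (1.0 + math.tanh(inner))
```

The two agree closely for typical inputs, e.g. `gelu(1.0)` is roughly 0.8413 and `gelu_tanh(1.0)` differs from it by well under 0.01.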

tianyizheng02 · Dec 20 '23