Adding Gaussian Error Linear Unit to neural network activation functions
What would you like to share?
I would like to contribute a Gaussian Error Linear Unit (GELU) activation function implementation to this repository, under the neural network activation functions folder.
Additional information
No response
We already have an implementation of the GELU function in maths/gaussian_error_linear_unit.py. I'm aware that the file should be placed in the neural_networks/activation_functions directory, but we don't want duplicate implementations of a single algorithm. Please consider moving the existing file instead of contributing your own implementation.
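For context, a minimal sketch of what such a file under neural_networks/activation_functions might look like, using the common tanh approximation of GELU and assuming NumPy is available (the existing maths/gaussian_error_linear_unit.py may differ in details):

```python
import numpy as np


def gaussian_error_linear_unit(vector: np.ndarray) -> np.ndarray:
    """
    GELU(x) = x * Phi(x), where Phi is the standard normal CDF.

    This sketch uses the tanh approximation from Hendrycks & Gimpel (2016):
    0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x**3)))
    """
    return 0.5 * vector * (
        1 + np.tanh(np.sqrt(2 / np.pi) * (vector + 0.044715 * vector**3))
    )


if __name__ == "__main__":
    # Example usage on a small input vector
    print(gaussian_error_linear_unit(np.array([-1.0, 0.0, 1.0, 2.0])))
```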