ActivationFunctions
Implementing activation functions from scratch in TensorFlow.
Activation Functions using Custom Layers in Keras
Activation functions are an important area of deep learning research. Many new activation functions are being developed; these include bio-inspired activations, purely mathematical activation functions, and others. Despite such advancements, we usually find ourselves using ReLU and LeakyReLU without considering the alternatives. The following notebooks showcase how easy (or difficult) it is to port an activation function using Custom Layers in Keras and TensorFlow!
Link to main notebook --> Activations.ipynb
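For example, a stateless activation such as Swish can be ported by subclassing `tf.keras.layers.Layer` and overriding `call`. The sketch below is a minimal illustration, not the notebook's exact code (the class name and the choice of a fixed beta = 1 are assumptions):

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer

class Swish(Layer):
    """Swish activation: f(x) = x * sigmoid(x), with beta fixed to 1."""
    def call(self, inputs):
        return inputs * tf.nn.sigmoid(inputs)
```

Because the layer holds no trainable state, it can be dropped into a `Sequential` model like any built-in activation layer.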
Implemented activations:
- LeakyReLU
- ParametricReLU (the trainable case; see the sketch after this list)
- ELU
- SELU
- Swish
- GELU
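Of these, ParametricReLU is the one that genuinely needs a custom layer, since its negative-slope coefficient alpha is learned during training rather than fixed. A minimal sketch, assuming one slope per channel and a zero initializer (both illustrative choices, not necessarily the notebook's):

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer

class ParametricReLU(Layer):
    """PReLU: f(x) = max(0, x) + alpha * min(0, x), with alpha trainable."""
    def build(self, input_shape):
        # One trainable slope per feature in the last dimension.
        self.alpha = self.add_weight(
            name="alpha",
            shape=(input_shape[-1],),
            initializer="zeros",
            trainable=True,
        )

    def call(self, inputs):
        return tf.maximum(0.0, inputs) + self.alpha * tf.minimum(0.0, inputs)
```

Since `alpha` is created in `build`, its shape adapts to the layer's input, and it is updated by backpropagation like any other weight.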
Structure
```
src
|
|-- Activations.ipynb
|-- utils
|   |-- Utils.ipynb
|   |-- utils.py
references
|
|-- Ref1
|-- Refn
```
Usage
```bash
git clone https://github.com/Agrover112/ActivationFunctions.git
```
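After cloning, the main notebook can be opened directly (assuming Jupyter is installed locally):

```bash
cd ActivationFunctions
jupyter notebook src/Activations.ipynb
```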