
Implement relu

Open kasey- opened this issue 6 years ago • 4 comments

Hello,

I started to implement the relu function for the genann library on a fork under my name (https://github.com/kasey-/genann) before sending you a PR:

double inline genann_act_relu(const struct genann *ann unused, double a) {
    return (a > 0.0) ? a : 0.0;
}

But I am a bit lost in the way you compute the backpropagation of the neural network. The derivative of ReLU is trivial: (a > 0.0) ? 1.0 : 0.0. But I cannot understand where I should plug it into your formula, as I do not understand how you compute your backpropagation. Did you implement only the derivative of the sigmoid?

kasey- avatar May 23 '19 10:05 kasey-
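To make the question concrete: as later comments in this thread note, genann hardcodes the sigmoid derivative in its backpropagation, written in terms of the neuron's output o as o * (1 - o) rather than calling a separate derivative function. ReLU's derivative can be expressed in output terms the same way. A minimal sketch, with the signatures simplified (no genann struct argument) and genann_act_relu_diff being a hypothetical name, not genann API:

```c
/* ReLU forward pass, as in kasey-'s fork (signature simplified). */
static double genann_act_relu(double a) {
    return (a > 0.0) ? a : 0.0;
}

/* Hypothetical derivative helper, expressed in terms of the neuron's
 * OUTPUT o rather than its input -- the same trick genann's backprop
 * uses for sigmoid, where d/dx sigma(x) = o * (1 - o).
 * For ReLU: a positive output means the input was positive, so the
 * slope is 1; otherwise it is 0. */
static double genann_act_relu_diff(double o) {
    return (o > 0.0) ? 1.0 : 0.0;
}
```

The point is that any new activation would need its derivative added at the spot in the training loop where o * (1 - o) currently appears.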

I would like ReLU in genann too, but the backpropagation is also something I don't understand here :(

ilia3101 avatar Sep 08 '19 08:09 ilia3101

Yes, in the code only the derivative of the sigmoid, dσ(x)/dx = σ(x)(1 − σ(x)), is implemented. I think we have to write a generic derivative mechanism so that we can add other activation functions like tanh and ReLU. Check the code here: https://github.com/kasey-/genann/blob/27c4c4288728791def0c5fd175c1c3999057ad9d/genann.c#L335 If you agree, I can also work on this.

msrdinesh avatar Dec 09 '19 02:12 msrdinesh

Hi everyone, to actually implement ReLU and linear activations, what exactly should we be looking at other than the backpropagation derivative? Would they also require additional functions, the way sigmoid has genann_act_sigmoid_cached and genann_init_sigmoid_lookup? Any advice...

AnnieJohnson25 avatar Mar 05 '21 11:03 AnnieJohnson25

Hi everyone, To actually implement Relu and linear what exactly should we be looking at other than the back propagation derivative? [snip]

It's right there at the top of this bug report thread: double inline genann_act_relu.

doug65536 avatar Mar 12 '23 21:03 doug65536