M S R Dinesh

Results: 6 comments of M S R Dinesh

Actually, in the code the backpropagation algorithm is implemented only for the sigmoid activation function. We would have to change the code to support a generic activation function. If no one is...
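For context, here is a minimal Python sketch of the pattern being described; the function name and signature are hypothetical, not the project's actual code. The point is that the sigmoid derivative is baked into the backward pass, so no other activation can be swapped in without editing it:

```python
# Hypothetical illustration (not the repository's actual code): the hidden-layer
# delta hardcodes the sigmoid derivative, written in terms of the neuron's
# output o, since for sigmoid f'(x) = o * (1 - o) where o = f(x).
def hidden_delta(output, upstream_delta, weight):
    return upstream_delta * weight * output * (1.0 - output)
```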

Hey @codeplea, can I work on this issue? I would like to add backprop for the tanh and relu activation functions. If no one else is working on this, please...

Yes, in the code only the derivative of the sigmoid, dσ(x)/dx = σ(x)(1 − σ(x)), is implemented. I think we have to write a generic derivative function so that we can add other activation functions...
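A minimal sketch of what such a generic pairing could look like, assuming a simple lookup-table design; the names (`ACTIVATIONS`, `get_activation`) are illustrative, not identifiers from the repository:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Each activation is paired with its derivative, both taken as functions of the
# pre-activation x, so the backward pass stays activation-agnostic.
ACTIVATIONS = {
    "sigmoid": (sigmoid, lambda x: sigmoid(x) * (1.0 - sigmoid(x))),
    "tanh":    (np.tanh, lambda x: 1.0 - np.tanh(x) ** 2),
    "relu":    (lambda x: np.maximum(0.0, x),
                lambda x: np.where(x > 0, 1.0, 0.0)),
}

def get_activation(name):
    """Return the (f, f_prime) pair for the named activation."""
    return ACTIVATIONS[name]
```

With a table like this, the backward pass can call `f_prime` on the stored pre-activation instead of hardcoding σ(x)(1 − σ(x)), and adding tanh or relu becomes a one-line entry.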

Hi team, I am new to NocoDB and want to contribute to it. Can I take this issue if no one else is working on it?