fann
several improvements
- fix: RPROP learning
- new layer types (lecun, relu, leaky relu)
- L2/L1 norm & regularization
- possibility to define your own error function
- possibility to weight each sample
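For reference, a minimal sketch of what the three proposed activations might compute. The PR does not spell out the formulas, so these are assumptions: the lecun variant is taken to be the scaled tanh from LeCun's "Efficient BackProp" (1.7159 · tanh(2x/3)), and the leaky slope of 0.01 is likewise a guess.

```c
#include <math.h>

/* Assumed definitions -- the PR does not state the exact formulas. */

/* LeCun's scaled tanh from "Efficient BackProp": 1.7159 * tanh(2x/3). */
static double lecun_tanh(double x) { return 1.7159 * tanh((2.0 / 3.0) * x); }

/* Standard rectified linear unit: max(0, x). */
static double relu(double x) { return x > 0.0 ? x : 0.0; }

/* Leaky ReLU with an assumed slope of 0.01 for negative inputs. */
static double leaky_relu(double x) { return x > 0.0 ? x : 0.01 * x; }
```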
Before merging this, I think the three new activation functions should be documented in fann_data.h. Also, corresponding variants should be added to the C++ enum FANN::activation_function_enum in fann_data_cpp.h.
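For concreteness, the additions might look roughly like this. The constant names FANN_LECUN, FANN_RELU, and FANN_LEAKY_RELU are placeholders; the PR may use different identifiers:

```cpp
/* fann_data.h -- hypothetical new members of enum fann_activationfunc_enum */
enum fann_activationfunc_enum
{
    /* ... existing values (FANN_LINEAR, FANN_SIGMOID, ...) ... */
    FANN_LECUN,      /* scaled tanh, per LeCun's "Efficient BackProp" */
    FANN_RELU,       /* rectified linear unit */
    FANN_LEAKY_RELU  /* leaky rectified linear unit */
};

/* fann_data_cpp.h -- mirrored variants in FANN::activation_function_enum */
namespace FANN {
    enum activation_function_enum {
        // ... existing values ...
        LECUN = FANN_LECUN,
        RELU = FANN_RELU,
        LEAKY_RELU = FANN_LEAKY_RELU
    };
}
```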
@troiganto Nice review!
I agree that the docs should be updated and the C++ headers extended. It would also be good to have some tests (googletest), or at least some examples, before this gets merged.
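Something along these lines, for instance. It assumes the hypothetical FANN_RELU constant sketched above and only smoke-tests that a network using the new activation runs at all:

```cpp
#include <gtest/gtest.h>
#include "fann.h"

TEST(NewActivations, ReluNetworkRuns) {
    // 3 layers: 1 input neuron, 1 hidden neuron, 1 output neuron.
    struct fann *ann = fann_create_standard(3, 1, 1, 1);
    ASSERT_NE(ann, nullptr);

    // FANN_RELU is the placeholder name used above; the PR may differ.
    fann_set_activation_function_hidden(ann, FANN_RELU);
    fann_set_activation_function_output(ann, FANN_LINEAR);

    fann_type input[1] = {-1.0};
    fann_type *output = fann_run(ann, input);

    // With random initial weights we can only assert the call succeeds;
    // a real test would fix the weights and check exact output values.
    ASSERT_NE(output, nullptr);

    fann_destroy(ann);
}
```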
Affects issue #72
Is there anyone keeping this alive?
Hi @mrrstrat,
I'm still here! I've been writing my doctoral thesis this past year and have been so swamped with it that I didn't really have time for side projects like this one. I also haven't heard from the other maintainers in a while, unfortunately …
I'd really like to get back to FANN once this is over. (It should be soon. If you haven't heard from me in ~2 months, feel free to ping again.)
Regarding this particular PR, it seems the submitter abandoned it after I requested some changes. I'm not entirely sure what the best process is here. I guess someone else has to incorporate the changes and resubmit the diff as a new PR?
@troiganto,
I have worked with FANN since about 2004. A couple of years ago I manually put in some of the ReLU changes discussed here, but did not run many comparative regression tests against other transfer function types. It's probably wishful thinking to rely 100% on a rectified linear function to completely solve a dying-gradient problem. Handling this in the past has involved changing the network architecture, network and system parameters, sampling, scope, and training, and restructuring the problem and the desired solution. Indeterminate slope derivation can be a funny animal to corral :-).
It looks like enough time has elapsed that another PR is needed to re-introduce the changes (the main branch and this PR have likely diverged too far to merge cleanly, but I might be wrong).