This looks like it has very good potential, but I can't seem to use any activation function other than sigmoid, and my reinforcement learning problem requires more robust activation functions like ReLU/Leaky ReLU. Is there a way to change the derivatives in the code to accommodate different activation functions? Also, will this library gain a built-in way to check loss, as a convenience for programmers using it?
Genann is self-contained in two very small and simple files. I suggest you simply copy them to your project, and then make the needed change directly. That's the main advantage of small code like this.
VERY late reply, sorry. After thinking about it for a while, I believe derivative functions should be included in the genann struct, so activations can be customized out of the box rather than by editing the tiny codebase. I know what ReLU/Leaky ReLU and their derivatives are, and how to implement them in C/C++, but shipping derivative functions with the library would be a minor game changer.