
tests: assertion failures

Open · lonnietc opened this issue on May 23 '16 · 2 comments

Greetings,

I have come across your nnetcpp library and was investigating its use recently.

I was able to compile the code, and then ran "./tests all", which resulted in assertion failures:


. . . Final MSE: 0.273338

test_merge.cpp:62: Assertion
Test name: TestMerge::testSum
assertion failed

  • Expression: checkLearning(net, input, output, 0.001, 1000)
  • Learning a linear function using MergeSum

test_recurrent.cpp:110: Assertion
Test name: TestRecurrent::testLSTM
assertion failed

  • Expression: checkLearning(net, inputs[i], outputs[i], target, 1, true, true)
  • A test vector failed the parity test

Failures !!!
Run: 8   Failure total: 2   Failures: 2   Errors: 0

Any ideas on why?

Cheers, Lonnie

lonnietc · May 23 '16 18:05

Hi,

Unit testing a neural network library is quite difficult. What I do is train a small neural network (built from the different layers implemented in the library) and then check that its error falls below a threshold. I've been quite aggressive with those thresholds.
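To make that pattern concrete, here is a minimal self-contained sketch: fit y = 2x + 1 with plain SGD and assert that the final MSE falls below an aggressive threshold. This is a toy stand-in, not nnetcpp's actual checkLearning(), whose internals are not shown in this thread:

```cpp
#include <cassert>
#include <cstdio>

int main()
{
    const double xs[] = {0.0, 0.25, 0.5, 0.75, 1.0};
    double w = 0.0, b = 0.0;

    // Train: many SGD passes over a tiny, noiseless dataset.
    for (int epoch = 0; epoch < 1000; ++epoch) {
        for (double x : xs) {
            double error = (w * x + b) - (2.0 * x + 1.0);
            w -= 0.1 * error * x;   // gradient step on the weight
            b -= 0.1 * error;       // gradient step on the bias
        }
    }

    // Test: compute the final mean squared error.
    double mse = 0.0;
    for (double x : xs) {
        double error = (w * x + b) - (2.0 * x + 1.0);
        mse += error * error;
    }
    mse /= 5.0;

    std::printf("Final MSE: %f\n", mse);
    assert(mse < 0.001);            // the "aggressive threshold" check
    return 0;
}
```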

I've run the tests on my computer, and they pass when run separately ("tests recurrent", "tests merge", "tests perceptron"). If I run "tests all", some tests fail (always the same ones, but a different set than yours). This is probably because the solution found by the neural network is non-deterministic: it depends on the libstdc++ random number generator (for weight initialization, for instance), your RNG implementation may differ from mine, and the randomness consumed by one test can influence the others. The library should still be completely functional.
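A small illustration of why test order matters (this assumes nothing about nnetcpp's internals beyond its reliance on the standard library RNG): each test advances the shared generator state, so a test sees different initial weights depending on what ran before it, and the distribution's output is itself implementation-defined across standard libraries even for the same engine and seed.

```cpp
#include <cstdio>
#include <random>

int main()
{
    std::mt19937 rng(42);                              // fixed seed
    std::uniform_real_distribution<double> dist(-0.1, 0.1);

    // Two "tests" drawing from the same generator: the second sees a
    // different stream because the first advanced the RNG state.
    std::printf("test A first weight: %f\n", dist(rng));
    std::printf("test B first weight: %f\n", dist(rng));

    // Re-seeding between tests restores determinism on one platform.
    rng.seed(42);
    std::printf("test B after re-seed: %f\n", dist(rng));
    return 0;
}
```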

If you use this library and have a network that cannot learn, note that nnetcpp is quite sensitive to its parameters: a learning rate that is too high (it should be around 1e-4 or 1e-5), too many or too few neurons, or the wrong activation functions (tanh works very well). Usually, decreasing the learning rate and giving the network time is enough for it to learn (by "time", I mean hundreds of thousands of iterations in some cases).
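A toy demonstration of the learning-rate advice, using a bare quadratic loss rather than nnetcpp itself: a step size that is too large diverges, while a small one converges only when given enough iterations.

```cpp
#include <cstdio>

// Gradient descent on loss(w) = w^2, whose gradient is 2w.
double run(double lr, long iterations)
{
    double w = 5.0;                       // start far from the optimum w* = 0
    for (long i = 0; i < iterations; ++i)
        w -= lr * 2.0 * w;
    return w;
}

int main()
{
    std::printf("lr = 1.1,  1e3 steps -> w = %g (diverges)\n",  run(1.1, 1000));
    std::printf("lr = 1e-4, 1e3 steps -> w = %g (too slow)\n",  run(1e-4, 1000));
    std::printf("lr = 1e-4, 1e6 steps -> w = %g (converges)\n", run(1e-4, 1000000));
    return 0;
}
```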

Best regards, Denis

steckdenis · May 23 '16 19:05

I ran into the same errors too.

hailiang-wang · Oct 29 '17 11:10