Taygun Kekec
I think initializing weights to 0 is theoretically wrong (it fails to break symmetry). From the reference http://ufldl.stanford.edu/wiki/index.php/Backpropagation_Algorithm: "If all the parameters start off at identical values, then all the hidden layer units will end up learning the same function of the input."
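To make the point concrete, here is a minimal MATLAB sketch of symmetry-breaking random initialization. The fan-in/fan-out scaling mirrors what I recall cnnsetup doing in the toolbox; the concrete kernelsize and map counts are assumptions for illustration, not the library's fixed values.

```matlab
% Symmetry breaking: draw kernel weights from a small random interval
% instead of zeros, so different hidden units receive different gradients.
kernelsize = 5;               % assumed 5x5 convolution kernel
fan_in  = 1  * kernelsize^2;  % inputmaps  * kernel area (assumed 1 input map)
fan_out = 6  * kernelsize^2;  % outputmaps * kernel area (assumed 6 output maps)
k = (rand(kernelsize) - 0.5) * 2 * sqrt(6 / (fan_in + fan_out));

% k = zeros(kernelsize);      % <- with this, all units compute the same
%                                  function and stay identical during training
```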
Modifying the params depends on your problem and input dims. A very large input will require bigger pooling units. Well, normally at the end of the conv layers you...
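For illustration, a layer definition in the style of test_example_CNN with a larger subsampling scale for a larger input. The specific outputmaps/kernelsize/scale numbers are assumptions you should adapt to your own input size (and note the toolbox expects map sizes to divide evenly by the scale):

```matlab
% Example: bigger inputs can tolerate bigger pooling (subsampling) scales.
cnn.layers = {
    struct('type', 'i')                                     % input layer
    struct('type', 'c', 'outputmaps', 6,  'kernelsize', 5)  % convolution
    struct('type', 's', 'scale', 4)                         % larger pooling for a large input
    struct('type', 'c', 'outputmaps', 12, 'kernelsize', 5)  % convolution
    struct('type', 's', 'scale', 2)                         % subsampling
};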
Using global variables is bad practice. What you should do is inspect the input shape and derive the correct batchsize and numberOfChannels from the input data.
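A minimal sketch of what I mean, assuming the data is stored height x width x numExamples as in the toolbox's MNIST example; the batchsize value is just an illustration:

```matlab
% Infer dimensions from the data instead of hard-coding globals.
[h, w, numExamples] = size(train_x);  % e.g. 28 x 28 x 60000 for MNIST
numberOfChannels = 1;                 % grayscale here; use size(train_x, 3)
                                      % if your data is H x W x C x N
batchsize = 50;                       % must divide the number of examples
assert(mod(numExamples, batchsize) == 0, ...
    'batchsize must divide numExamples evenly');
```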
For test_example_CNN of the library: assuming you have already run the network on the test data, it will contain the activations of the last batch you fed into the network. The following code...
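Since the comment is cut off, here is a sketch of how I would pull the activations out, based on how I recall cnnff storing them (cnn.o for the output layer, cnn.fv for the flattened feature vector, cnn.layers{l}.a for intermediate feature maps); please verify the field names against your version of the toolbox:

```matlab
% Forward pass over the test set, then read the stored activations.
cnn = cnnff(cnn, test_x);       % fills the activation fields
outputActivations = cnn.o;      % final layer outputs, one column per example
featureVector     = cnn.fv;     % flattened activations feeding the output layer
hiddenMaps = cnn.layers{3}.a;   % feature maps of layer 3 (cell array, one per map)
```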
Has anyone made progress on this issue?