
Does training from randomly initialized parameters produce no precision?

Open JUSTDODoDo opened this issue 6 years ago • 1 comment

When no pre-trained model is used and squeezeDet is trained from randomly initialized parameters, with evaluation running on a second GPU at the same time, training never produces any precision: after hundreds of thousands of steps the evaluation still reports nothing, and the loss just oscillates between 2.2 and 3.0. How can I solve this? I tried batch size 32 with learning rates of 0.01, 0.0001, and 0.00001, and none of them helped. Looking forward to your answer!

JUSTDODoDo commented on Oct 06 '18, 08:10
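For reference, a minimal sketch of a more conservative from-scratch training configuration. It assumes the stock squeezeDet config object (`mc`) from `src/config/kitti_squeezeDet_config.py`; the attribute names below follow that file and should be treated as assumptions if your fork differs.

```python
def from_scratch_config(mc):
    """Adjust a squeezeDet config for training without pretrained weights.

    A sketch only: lowers the initial learning rate and relies on gradient
    clipping, since random init tends to be less stable than fine-tuning
    from the ImageNet SqueezeNet backbone.
    """
    mc.LOAD_PRETRAINED_MODEL = False   # do not load the SqueezeNet checkpoint
    mc.LEARNING_RATE = 0.001           # smaller than the default 0.01
    mc.DECAY_STEPS = 10000             # decay the LR every 10k steps
    mc.LR_DECAY_FACTOR = 0.5
    mc.MAX_GRAD_NORM = 1.0             # clip gradients to limit early spikes
    mc.BATCH_SIZE = 32
    return mc
```

With a schedule like this, the loss should at least trend downward slowly rather than bounce in a fixed band; if it still oscillates, the issue is more likely the weight initialization or the anchor/label setup than the learning rate.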

Each layer's parameters are initialized the same way as in the source code, and all loading of pre-trained models is removed, but the evaluated precision never improves. The loss drops from 28.2 to below 3 and then just jitters without converging.

JUSTDODoDo commented on Oct 06 '18, 08:10
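If the source-code initialization (a small fixed-stddev truncated normal, tuned for fine-tuning a pretrained backbone) is the problem, one option when training from scratch is a variance-scaling (He) initializer that adapts to each layer's fan-in. A minimal TF 1.x sketch, assuming the usual squeezeDet conv-layer setup; the helper name below is hypothetical.

```python
import tensorflow as tf  # squeezeDet targets TF 1.x


def conv_kernel_initializer(from_scratch=True, stddev=0.001):
    """Pick a kernel initializer depending on whether pretrained weights exist."""
    if from_scratch:
        # He-style init: variance scaled to the fan-in of each conv layer,
        # which helps keep activations from vanishing or exploding at depth
        # when there is no pretrained backbone to anchor the early layers.
        return tf.variance_scaling_initializer(scale=2.0, mode='fan_in')
    # Otherwise keep the small fixed-stddev init used when fine-tuning
    # from the pretrained SqueezeNet weights.
    return tf.truncated_normal_initializer(stddev=stddev)
```

This is only a sketch of the idea, not the repo's own fix: the initializer would still need to be wired into the conv-layer helper in `nn_skeleton.py` (or wherever your fork creates conv variables).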