
What gradient descent method is clstm using?

Open · kendemu opened this issue 8 years ago · 1 comment

What gradient descent method is clstm using? SGD? AdaGrad? NAG? RMSProp? Adam? I want to increase the speed of learning. If clstm is not using an adaptive learning-rate algorithm, I also have to ask whether this method can change the learning rate dynamically, so that an adaptive learning-rate algorithm can be implemented on top of it:

net.setLearningRate(1e-4,0.9)
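
(For illustration only: a dynamic rate could be approximated from outside the solver by re-calling this setter between epochs. The sketch below assumes the new values apply to subsequent updates; run_epoch is a hypothetical stand-in for the existing per-epoch training code.)

lr, momentum = 1e-4, 0.9
for epoch in range(20):
    net.setLearningRate(lr, momentum)  # assumed to take effect for later updates
    run_epoch(net)                     # hypothetical placeholder: one pass over the data
    if (epoch + 1) % 5 == 0:
        lr *= 0.5                      # simple step decay: halve the rate every 5 epochs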

kendemu · Mar 30 '16 04:03

Oh, I found this method in test-clstm.py. SGD + momentum.

clstm.sgd_update(net)
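
(A sketch of how this call typically slots into a training loop, under the assumption that gradients are accumulated by a forward/backward pass first; forward_backward is a hypothetical placeholder for that existing code.)

import clstm  # Python bindings, as used in test-clstm.py

def train_epoch(net, batches):
    for batch in batches:
        forward_backward(net, batch)  # hypothetical placeholder: compute gradients
        clstm.sgd_update(net)         # apply the SGD + momentum update with the current settings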

Is there AdaGrad, NAG, or a faster solver? SGD is quite slow.
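
(For comparison, the standard AdaGrad rule scales each parameter's step by the history of its squared gradients; the NumPy sketch below illustrates the algorithm itself, not a clstm API.)

import numpy as np

def adagrad_step(w, grad, cache, lr=1e-2, eps=1e-8):
    # Accumulate squared gradients per parameter, then shrink the step
    # wherever gradients have historically been large.
    cache += grad ** 2
    w -= lr * grad / (np.sqrt(cache) + eps)
    return w, cache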

kendemu · Mar 30 '16 07:03