Imitation-Learning-Dagger-Torcs

About the convergence and overfit

Open marooncn opened this issue 6 years ago • 0 comments

Hi, thanks for your work. I rewrote it in Keras as a learning exercise, using your recommended hyper-parameters, but when I run my program it tends to overfit. Later I changed the hyper-parameters, added BN, and used an explicit initialization for each layer, but it still overfits: at best the car runs 700 steps, yet it still can't complete the whole track. I have spent more than two weeks tuning it, and I'm confused: why can't the same hyper-parameters achieve the same result? Why is the network so prone to overfitting? For convenience, I've uploaded my program, imitationLearning.py. Can you give me some ideas? Thank you in advance.
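For context, a minimal sketch of the DAgger loop this repo implements may help frame the question. This is not the repo's code: the `expert_action`, `policy_fit`, and `env_rollout` callables here are hypothetical stand-ins for the TORCS expert controller, the Keras training step, and an environment rollout.

```python
import numpy as np

def dagger(expert_action, policy_fit, env_rollout, n_iters=5):
    """Minimal DAgger loop: aggregate expert-labelled states across iterations.

    expert_action: state -> action   (the expert / oracle controller)
    policy_fit:    (states, actions) -> policy function (e.g. a Keras fit step)
    env_rollout:   policy -> array of states visited when running that policy
    """
    # Iteration 0: roll out the expert and train on its state-action pairs.
    states = env_rollout(expert_action)
    actions = np.array([expert_action(s) for s in states])
    policy = policy_fit(states, actions)

    for _ in range(n_iters - 1):
        # Roll out the *current* policy, then relabel the visited states
        # with the expert's actions (the key step of DAgger).
        new_states = env_rollout(policy)
        new_actions = np.array([expert_action(s) for s in new_states])

        # Aggregate into the growing dataset and retrain from it.
        states = np.concatenate([states, new_states])
        actions = np.concatenate([actions, new_actions])
        policy = policy_fit(states, actions)
    return policy
```

Because each iteration retrains on the full aggregated dataset, overfitting in the learner compounds: a policy that memorizes early rollouts visits unrepresentative states later, which is one reason identical hyper-parameters can behave differently across reimplementations.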

marooncn avatar Jun 07 '18 07:06 marooncn