Coursera-Ng-Improving-Deep-Neural-Networks-Hyperparameter-tuning-Regularization-and-Optimization
Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization
The course can be found on Coursera.
Quizzes and answers are collected in my blog SSQ for quick reference.
- Week 1 Practical aspects of Deep Learning
- Recall that different types of initializations lead to different results
- Recognize the importance of initialization in complex neural networks
- Recognize the difference between train/dev/test sets
- Diagnose the bias and variance issues in your model
  - Learn when and how to use regularization methods such as dropout or L2 regularization
  - Understand experimental issues in deep learning, such as vanishing or exploding gradients, and learn how to deal with them
- Use gradient checking to verify the correctness of your backpropagation implementation
- [x] Initialization
- [x] Regularization
- [x] Gradient Checking
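Two of the Week 1 techniques above can be sketched in a few lines of NumPy. This is a minimal illustration, not the course's assignment code; the function names and signatures are my own assumptions.

```python
import numpy as np

def he_initialize(layer_dims, seed=0):
    """He initialization: scale weights by sqrt(2 / fan_in), which keeps
    activation variance roughly stable across ReLU layers."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = rng.standard_normal(
            (layer_dims[l], layer_dims[l - 1])
        ) * np.sqrt(2.0 / layer_dims[l - 1])
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

def l2_cost(cross_entropy_cost, params, lambd, m):
    """L2 regularization: add (lambda / 2m) * sum of squared weights
    to the unregularized cross-entropy cost."""
    l2 = sum(np.sum(np.square(params[k]))
             for k in params if k.startswith("W"))
    return cross_entropy_cost + (lambd / (2 * m)) * l2
```

Gradient checking then compares analytic gradients of this regularized cost against numerical estimates, as practiced in the third assignment.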
- Week 2 Optimization algorithms
- Remember different optimization methods such as (Stochastic) Gradient Descent, Momentum, RMSProp and Adam
  - Use random mini-batches to accelerate convergence and improve optimization
- Know the benefits of learning rate decay and apply it to your optimization
- [x] Optimization
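The Week 2 ideas above can be sketched as random mini-batch partitioning plus one Adam step (momentum and RMSProp running averages with bias correction). A minimal NumPy sketch; names and signatures are illustrative assumptions, not the assignment's API.

```python
import numpy as np

def random_mini_batches(X, Y, batch_size=64, seed=0):
    """Shuffle the m columns (examples), then slice into batches."""
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    perm = rng.permutation(m)
    X_s, Y_s = X[:, perm], Y[:, perm]
    return [(X_s[:, k:k + batch_size], Y_s[:, k:k + batch_size])
            for k in range(0, m, batch_size)]

def adam_update(params, grads, state, t,
                lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step at iteration t (t >= 1): keep exponentially weighted
    averages of the gradient (v) and its square (s), bias-correct both,
    then scale the step by v_hat / sqrt(s_hat)."""
    for k in params:
        v = state.setdefault(f"v_{k}", np.zeros_like(params[k]))
        s = state.setdefault(f"s_{k}", np.zeros_like(params[k]))
        v[:] = beta1 * v + (1 - beta1) * grads[k]
        s[:] = beta2 * s + (1 - beta2) * grads[k] ** 2
        v_hat = v / (1 - beta1 ** t)
        s_hat = s / (1 - beta2 ** t)
        params[k] -= lr * v_hat / (np.sqrt(s_hat) + eps)
    return params, state
```

Setting `beta2 = 0` recovers plain momentum with bias correction, which is one way to see how Adam combines the two methods.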
- Week 3 Hyperparameter tuning, Batch Normalization and Programming Frameworks
- Master the process of hyperparameter tuning
  - Master the process of Batch Normalization
- [x] Tensorflow
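The two Week 3 techniques can be sketched briefly: sampling a hyperparameter such as the learning rate on a log scale (so the search spends equal effort per order of magnitude), and the Batch Normalization forward step. A minimal NumPy sketch with illustrative names; the course assignment implements this in TensorFlow instead.

```python
import numpy as np

def sample_learning_rate(rng, low_exp=-4, high_exp=0):
    """Log-scale random search: draw r ~ U[low_exp, high_exp],
    then use alpha = 10**r, so alpha spans 1e-4 .. 1 evenly in log space."""
    r = rng.uniform(low_exp, high_exp)
    return 10.0 ** r

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    """Batch Normalization: normalize pre-activations to zero mean and
    unit variance across the batch, then scale and shift with the
    learnable parameters gamma and beta."""
    mu = Z.mean(axis=1, keepdims=True)
    var = Z.var(axis=1, keepdims=True)
    Z_norm = (Z - mu) / np.sqrt(var + eps)
    return gamma * Z_norm + beta
```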