Topics
The following topics are covered in this module:
- Gradient Descent
- Transfer Learning
- Underfitting and Overfitting
- Fine-Tuning
Objectives
By the end of this module, students will be able to:
- Describe the purpose and process of gradient descent.
- Explain the role of the loss function in quantifying model error.
- Describe optimizers.
- Recognize signs of underfitting and overfitting.
- Experiment with hyperparameter tuning.
Watch
Video: How to Tune Your Models
Hands-on Exercise
Through this exercise, you'll gain hands-on experience applying deep learning to another image recognition task. As discussed above, choices such as the loss function, the optimizer, and the learning rate are all hyperparameters that can be adjusted in an attempt to train a better model. You will get some practice with hyperparameter optimization; a sketch of the kind of adjustments involved follows below.
Work on notebook 03_bees_vs_wasps.ipynb.
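The notebook defines the actual model and data pipeline; as a hedged illustration of the kind of adjustments involved, the sketch below (a hypothetical stand-in architecture, assuming a Keras workflow) shows where the loss function, optimizer, and learning rate are chosen:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical stand-in model; the real architecture and data loading
# are defined in 03_bees_vs_wasps.ipynb.
model = keras.Sequential([
    keras.Input(shape=(80, 80, 3)),          # assumed image size
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),   # binary output: bee vs. wasp
])

# The loss, optimizer, and learning rate are hyperparameters to tune:
# swapping any of them changes how (and how well) the model trains.
model.compile(
    loss="binary_crossentropy",                            # the loss function
    optimizer=keras.optimizers.Adam(learning_rate=1e-3),   # try SGD, or another rate
    metrics=["accuracy"],
)
```

Re-running training after each such change, and comparing the resulting learning curves, is the essence of hyperparameter optimization.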
Conclusion
In this module, we explored essential concepts and techniques in machine learning optimization:
- Basics of Gradient Descent: Introduced the algorithm as a method for minimizing loss functions in machine learning models (see the worked sketch after this list).
- Loss Functions: Discussed common choices such as Mean Squared Error and Cross-Entropy, highlighting their role in quantifying model error.
- Optimizers: Covered optimizers such as Stochastic Gradient Descent (SGD) and Adam, focusing on how they refine the learning process.
- Transfer Learning & Fine-Tuning: Touched on these important, advanced deep learning techniques (sketched after this list).
- Deep Learning Implementation: Explored hyperparameter optimization as a way to train better models.
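To make the first two points concrete, here is a minimal, self-contained sketch (NumPy only; the data and step size are invented for illustration) of gradient descent minimizing a Mean Squared Error loss for a one-parameter linear model:

```python
import numpy as np

# Toy data for a one-parameter model y = w * x (values are illustrative).
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])   # roughly y = 2x

w = 0.0              # initial parameter guess
learning_rate = 0.01

for step in range(200):
    y_pred = w * x
    # Mean Squared Error: average squared difference between prediction and target.
    loss = np.mean((y_pred - y) ** 2)
    # Gradient of the MSE with respect to w: mean(2 * x * (w*x - y)).
    grad = np.mean(2 * x * (y_pred - y))
    # Gradient descent update: step in the direction that decreases the loss.
    w -= learning_rate * grad

print(f"learned w = {w:.3f}, final loss = {loss:.4f}")
```

Each iteration computes the loss, computes its gradient, and nudges the parameter against the gradient; optimizers such as SGD and Adam elaborate on exactly this update rule.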
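And as a hedged sketch of the transfer-learning and fine-tuning pattern (the pretrained base, input size, and learning rates here are illustrative choices, not the module's prescribed setup):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Transfer learning: reuse a network pretrained on ImageNet as a feature extractor.
base = keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights="imagenet"
)
base.trainable = False   # freeze the pretrained weights

model = keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation="sigmoid"),   # new task-specific head
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
              loss="binary_crossentropy", metrics=["accuracy"])
# ... train only the new head on the target dataset, then fine-tune:

base.trainable = True    # unfreeze the pretrained layers
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-5),  # much smaller rate
              loss="binary_crossentropy", metrics=["accuracy"])
# ... continue training briefly so the pretrained weights adapt without being destroyed.
```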
This module provided a foundational understanding of how gradient descent drives the learning process in machine learning and deep learning models.