Gradient Descent is an optimization algorithm that finds the weights that minimize prediction error in a Neural Network. It works reliably when the cost function is convex (i.e. it has a single global minimum); on a non-convex cost function it can get stuck in a local minimum.
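A minimal sketch of the idea, using batch gradient descent on a linear model with mean squared error (a convex cost). The data, learning rate, and iteration count are illustrative, not from the course.

```python
import numpy as np

# Illustrative data: 100 samples, 2 features, known true weights
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([3.0, -2.0])
y = X @ true_w + rng.normal(scale=0.1, size=100)

w = np.zeros(2)   # initial weights
lr = 0.1          # learning rate

for _ in range(500):
    error = X @ w - y                 # prediction error
    grad = 2 * X.T @ error / len(y)   # gradient of MSE w.r.t. w
    w -= lr * grad                    # step against the gradient

print(w)  # converges close to [3.0, -2.0]
```

Because the MSE cost here is convex, repeatedly stepping against the gradient drives the weights to the global minimum; with a non-convex cost (as in deep networks) the same loop can settle in a local minimum instead.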
Machine Learning A-Z
Thoughts and flashcards derived from Udemy’s course Machine Learning A-Z.