The Basics of Machine Learning: Loss Iteration, Gradient Descent, and Learning Rate
In this article, we provide a basic overview of loss iteration, gradient descent, and learning rates. Concisely: loss iteration is the process of repeatedly computing the loss and updating a model's weights to make them increasingly accurate. Gradient descent is the idea that we can find better weights by computing the gradient of the loss at our current guess and stepping in the downhill direction, repeating until we reach the "bottom" of a convex loss curve. The learning rate is the size of each of those steps: how much we adjust the weights on every iteration as we try to converge on the best values for the model.
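To make these three ideas concrete, here is a minimal sketch of gradient descent in Python. It assumes an illustrative one-weight model with a squared-error loss (the data and function names are made up for this example, not taken from any particular library):

```python
# Minimal gradient descent sketch: fit a single weight w so that
# predictions w * x approximate the targets y, using squared-error loss.

# Toy data: targets follow y = 3x, so the ideal weight is w = 3.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]

def loss(w):
    """Mean squared error of predictions w * x against the targets y."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def gradient(w):
    """Derivative of the loss with respect to w: mean of 2*(w*x - y)*x."""
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

learning_rate = 0.01   # the size of each step down the loss curve
w = 0.0                # an initial guess for the weight

for step in range(100):               # loss iteration: repeat the update
    w -= learning_rate * gradient(w)  # step opposite the gradient (downhill)

print(f"learned w = {w:.4f}, loss = {loss(w):.6f}")  # w approaches 3.0
```

The learning rate here is a trade-off: too small and convergence takes many iterations, too large and each step can overshoot the bottom of the curve and the loss can diverge instead of shrinking.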