Gradient Descent
Gradient descent is one of the most popular algorithms to perform optimization and by far the most common way to optimize neural networks.
Gradient descent is a technique for minimizing loss by computing the gradient of the loss with respect to the model's parameters, conditioned on the training data. Informally, gradient descent iteratively adjusts the parameters, gradually finding the combination of weights and biases that minimizes the loss.
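To make the iterative update concrete, here is a minimal sketch of gradient descent on a one-dimensional linear regression problem with mean squared error as the loss. The toy data, the learning rate of 0.1, and the step count are illustrative assumptions, not values from the text above.

import numpy as np

# Toy data roughly following y = 2x + 1 (illustrative values).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])

# Initial guesses for the weight and bias.
w, b = 0.0, 0.0
learning_rate = 0.1  # assumed step size; tuning it is problem-specific

for step in range(200):
    # Forward pass: current predictions and their error.
    y_pred = w * x + b
    error = y_pred - y

    # Gradients of the mean squared error loss with respect to w and b.
    grad_w = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)

    # Update step: move each parameter against its gradient.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(w, b)  # converges toward roughly w = 2, b = 1

Each pass through the loop is one gradient descent step: compute the gradient of the loss at the current parameters, then move the parameters a small distance in the opposite direction. Too large a learning rate can make the updates overshoot and diverge; too small a rate makes convergence slow.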