What is Gradient Descent? — AI Encyclopedia | XLUXX

Gradient Descent: the optimization algorithm used to train neural networks. It computes the gradient (the multi-dimensional slope) of the loss function with respect to the model's parameters, then updates the parameters by a small step in the opposite direction, repeatedly moving "downhill" toward a minimum of the loss. Stochastic Gradient Descent (SGD) estimates the gradient from small random batches of data for efficiency. Adam, an adaptive variant, is among the most widely used optimizers today.
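The update rule above can be sketched in a few lines of Python. This is a minimal illustrative example on a one-dimensional quadratic loss (the function, learning rate, and step count are chosen for illustration, not taken from the article):

```python
# Minimize f(w) = (w - 3)^2, whose gradient is f'(w) = 2 * (w - 3).
# The minimum is at w = 3.

def gradient(w):
    # Gradient of the loss at the current parameter value.
    return 2.0 * (w - 3.0)

def gradient_descent(w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        # Step a small amount opposite the gradient ("downhill").
        w -= lr * gradient(w)
    return w

w_min = gradient_descent(w0=0.0)
print(w_min)  # converges toward 3.0
```

In full-batch gradient descent the gradient is computed over the entire dataset each step; SGD replaces it with an estimate from a random mini-batch, trading a little noise for much cheaper updates.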

Part of the XLUXX AI Encyclopedia — the most comprehensive AI reference on the web.
