Stochastic Gradient Descent Tricks

Léon Bottou

in Neural Networks, Tricks of the Trade, Reloaded

Published by Springer | 2012 | Vol. 7700


The first chapter of Neural Networks, Tricks of the Trade strongly advocates the stochastic back-propagation method for training neural networks. This is in fact an instance of a more general technique called stochastic gradient descent (SGD). This chapter provides background material, explains why SGD is a good learning algorithm when the training set is large, and gives useful recommendations. It appears in the “reloaded” edition of the tricks book (Springer) and complements the material presented in the original chapter “Efficient BackProp”.
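To make the idea concrete, here is a minimal, illustrative sketch of stochastic gradient descent (not code from the chapter): each update uses the gradient of the loss on a single randomly drawn training example rather than the full training set, which is what makes the method attractive when the training set is large. The one-parameter least-squares model and the hyperparameter values below are assumptions chosen purely for illustration.

```python
import random

def sgd(data, lr=0.1, steps=100, seed=0):
    """Stochastic gradient descent for 1-D least squares: fit y ~ w * x.

    Unlike batch gradient descent, each step uses the gradient of the
    loss on ONE randomly drawn example, so the cost per step does not
    grow with the size of the training set.
    """
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        x, y = rng.choice(data)           # draw a single example at random
        grad = 2.0 * (w * x - y) * x      # gradient of (w*x - y)^2 w.r.t. w
        w -= lr * grad                    # noisy single-example update
    return w

# Toy data generated from y = 3x; SGD should recover w close to 3.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
w = sgd(data)
```

Because each step sees only one example, the trajectory of `w` is noisy, but on average it follows the true gradient and converges to the least-squares solution.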