Author: Okewu, Emmanuel; Misra, Sanjay; Fernandez-Sanz, Luis
Title: Parameter Tuning Using Adaptive Moment Estimation in Deep Learning Neural Networks
Cord-id: l28nouv8
Document date: 2020_8_24
ID: l28nouv8
Document: The twin issues of loss quality (accuracy) and training time are critical in choosing a stochastic optimizer for training deep neural networks. Optimization methods for machine learning include gradient descent, simulated annealing, genetic algorithms, and second-order techniques like Newton's method. However, the popular method for optimizing neural networks is gradient descent. Over time, researchers have made gradient descent more responsive to the requirements of improved loss quality (accuracy) and reduced training time by progressing from using a simple learning rate to using the adaptive moment estimation technique for parameter tuning. In this work, we investigate the performance of established stochastic gradient descent algorithms like Adam, RMSProp, Adagrad, and Adadelta in terms of training time and loss quality. We show practically, using a series of stochastic experiments, that adaptive moment estimation has improved the gradient descent optimization method. Based on the empirical outcomes, we recommend further improvement of the method by using higher moments of the gradient for parameter tuning (weight update). The outputs of our experiments also indicate that a neural network is a stochastic algorithm.
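For context on the adaptive moment estimation (Adam) update that the abstract refers to, the following is a minimal NumPy sketch of the standard Adam rule applied to a toy quadratic loss. The hyperparameter values (lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8) are the commonly used defaults and are assumptions for illustration, not values taken from the paper's experiments.

```python
# Sketch of the Adam (adaptive moment estimation) weight update.
# Hyperparameters are the usual defaults, assumed here for illustration only.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: first/second moment estimates with bias correction."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy example: minimize f(theta) = (theta - 3)^2, whose gradient is 2*(theta - 3).
theta = np.array([0.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * (theta - 3.0)
    theta, m, v = adam_step(theta, grad, m, v, t)
print(theta)  # converges toward 3.0
```

The per-parameter scaling by the second-moment estimate is what distinguishes Adam (and RMSProp/Adagrad/Adadelta) from plain gradient descent with a single fixed learning rate, which is the progression the abstract describes.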