Neural Networks: FNN Training Algorithms: Simultaneous perturbation, Backpropagation and Tunneling methods - Softcover

Kathirvalavakumar, Thangairulappan

 
9783639300765: Neural Networks: FNN Training Algorithms: Simultaneous perturbation, Backpropagation and Tunneling methods

Synopsis

Artificial neural networks (ANNs) are popular machine learning tools widely used for problems such as function approximation, time series prediction, medical diagnosis, character recognition, and a range of optimization tasks across science and engineering. When training an ANN, an effective optimization method is essential for fast convergence. Simultaneous Perturbation Stochastic Approximation (SPSA) is one such method. SPSA owes its power and relative ease of use in difficult multivariate optimization problems to its underlying gradient approximation, which requires only two objective function measurements per iteration regardless of the dimension of the problem. This book discusses neural network learning algorithms that solve classification and non-linear function approximation problems by combining simultaneous perturbation, dynamic tunneling techniques, modified backpropagation, and a neighborhood approach with adaptive learning parameters. The efficiency of these algorithms is demonstrated with detailed simulation results for a variety of problems.
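To illustrate the two-measurement property mentioned above, the following is a minimal sketch of SPSA in Python (not taken from the book; the function names, gain constants, and decay exponents are assumptions, with the decay exponents set to the values commonly recommended in the SPSA literature):

```python
import numpy as np

def spsa_gradient(f, theta, c, rng):
    """Estimate the gradient of f at theta using exactly two evaluations of f,
    regardless of the dimension of theta."""
    # Rademacher (+/-1) simultaneous perturbation of every coordinate.
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    y_plus = f(theta + c * delta)
    y_minus = f(theta - c * delta)
    # Component-wise estimate: same scalar difference, divided by each delta_i.
    return (y_plus - y_minus) / (2.0 * c * delta)

def spsa_minimize(f, theta0, a=0.1, c=0.1, iters=500, seed=42):
    """Basic SPSA descent loop with decaying gain sequences."""
    theta = np.array(theta0, dtype=float)
    rng = np.random.default_rng(seed)
    for k in range(iters):
        ak = a / (k + 1) ** 0.602  # step-size decay (standard SPSA choice)
        ck = c / (k + 1) ** 0.101  # perturbation-size decay
        g = spsa_gradient(f, theta, ck, rng)
        theta -= ak * g
    return theta

# Usage: minimize a simple quadratic with optimum at [1, -2].
f = lambda x: float(np.sum((x - np.array([1.0, -2.0])) ** 2))
theta = spsa_minimize(f, [0.0, 0.0])
```

Note that each iteration calls `f` only twice, whereas a finite-difference gradient would need two calls per parameter; this is the property the synopsis highlights.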

The information provided in the "Synopsis" section may refer to another edition of this title.

Biographie de l'auteur

The author has served as Associate Professor and Head of the Department of Computer Science, V.H.N.Senthikumara Nadar College, Virudhunagar, Tamil Nadu, India, for the past 23 years. He was awarded a Ph.D. degree by the University of Madras, Chennai, in 2004. His areas of interest include neural networks and pattern recognition.

The information provided in the "About the book" section may refer to another edition of this title.