In this book, we propose novel deterministic RNN training algorithms that adopt a nonmonotone approach: learning behaviour is allowed to deteriorate in some iterations, yet overall learning performance improves over time. The nonmonotone RNN training methods, which take their theoretical basis from the theory of deterministic nonlinear optimisation, aim to explore the search space more thoroughly and to enhance the convergence behaviour of gradient-based methods. They generate nonmonotone behaviour by incorporating conditions that employ forcing functions, used to measure the sufficiency of error reduction, and an adaptive window, whose size is set by locally estimating the morphology of the error surface. The thesis develops nonmonotone first- and second-order methods and discusses their convergence properties. The proposed algorithms are applied to training RNNs of various sizes and architectures, namely Feed-Forward Time-Delay networks, Elman networks and Nonlinear Autoregressive with Exogenous Inputs (NARX) networks, on symbolic sequence processing problems. Numerical results show that the proposed nonmonotone learning algorithms train more effectively.
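The core idea described above can be illustrated with a minimal sketch of a nonmonotone acceptance test in the style of Grippo, Lampariello and Lucidi: a step is accepted when the new error falls sufficiently below the *maximum* error over a recent window of iterations, rather than below the previous error alone. The function names, the fixed window size, and the simple backtracking loop below are illustrative assumptions, not the book's actual algorithms (which adapt the window size from the local morphology of the error surface).

```python
import numpy as np

def nonmonotone_armijo_step(f, grad, x, recent_errors, window=10, c1=1e-4,
                            alpha0=1.0, shrink=0.5, max_backtracks=30):
    """One steepest-descent step accepted under a nonmonotone Armijo test.

    Acceptance is measured against the maximum error over the last `window`
    iterations, so individual iterations may get worse while the overall
    error envelope still decreases. The term c1*alpha*g.dot(d) plays the
    role of a simple forcing function measuring sufficient reduction.
    """
    g = grad(x)
    d = -g                                # steepest-descent direction
    ref = max(recent_errors[-window:])    # nonmonotone reference value
    alpha = alpha0
    for _ in range(max_backtracks):
        x_new = x + alpha * d
        if f(x_new) <= ref + c1 * alpha * g.dot(d):
            return x_new
        alpha *= shrink                   # backtrack on the step length
    return x + alpha * d                  # fall back to the smallest step tried

# Usage: minimise a toy quadratic error surface.
f = lambda x: float(x.dot(x))
grad = lambda x: 2.0 * x
x = np.array([3.0, -4.0])
errors = [f(x)]
for _ in range(50):
    x = nonmonotone_armijo_step(f, grad, x, errors)
    errors.append(f(x))
```

In RNN training, `f` would be the network error over the training sequences and `grad` its gradient with respect to the weights (e.g. via backpropagation through time); the quadratic here merely keeps the sketch self-contained.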
The information provided in the "Synopsis" section may refer to another edition of this title.
After being awarded his PhD (Neural Networks Learning) by the Computer Science and Information Systems Department of Birkbeck College, University of London, England, in 2011, Chun-Cheng Peng is currently a Postdoctoral Research Associate at National Chin-Yi University of Technology, Taiwan. His research interests centre on nonmonotone learning.
The information provided in the "About the Book" section may refer to another edition of this title.