The recent re-emergence of network-based approaches to artificial intelligence has been accompanied by a virtual explosion of research. This research spans a range of disciplines — cognitive science, computer science, biology, neuroscience, electrical engineering, psychology, econometrics, philosophy, and more — that is perhaps wider than in any other contemporary endeavour. Of all the contributing disciplines, the relatively universal language of mathematics provides some of the most powerful tools for answering fundamental questions about the capabilities and limitations of these "artificial neural networks". In this collection, Halbert White and his colleagues present a rigorous mathematical analysis of the approximation and learning capabilities of the leading class of single hidden layer feedforward networks. Drawing together work previously scattered in space and time, the book gives a unified view of network learning not available in any other single location, and forges fundamental links between network learning and modern mathematical statistics.
The information provided in the "Synopsis" section may refer to another edition of this title.
Seller: Zubal-Books, Since 1961, Cleveland, OH, United States
Condition: Fine. *Price HAS BEEN REDUCED by 10% until Monday, March 23 (weekend SALE item)* 329 pp., hardcover, fine in a very good dust jacket. If you are reading this, this item is actually (physically) in our stock and ready for shipment once ordered. We are not bookjackers. The buyer is responsible for any additional duties, taxes, or fees required by the recipient's country. Seller reference number: ZB1314837
Quantity available: 1