The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. Written in a readable and concise style and devoted to key learning problems, the book is intended for statisticians, mathematicians, physicists, and computer scientists.
The information provided in the "Synopsis" section may refer to a different edition of this title.
EUR 17.51 shipping from United Kingdom to France
EUR 9.70 shipping from Germany to France
Seller: moluna, Greven, Germany
Paperback. Condition: New. This is a print-on-demand item and is printed for you after your order. (Synopsis as above.) Seller reference no. 4173616
Quantity available: more than 20
Seller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. Seller reference no. ria9781441931603_new
Quantity available: more than 20
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand; allow 3-4 extra days. New stock. The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include:
* the setting of learning problems based on the model of minimizing the risk functional from empirical data
* a comprehensive analysis of the empirical risk minimization principle, including necessary and sufficient conditions for its consistency
* non-asymptotic bounds for the risk achieved using the empirical risk minimization principle
* principles for controlling the generalization ability of learning machines using small sample sizes, based on these bounds
* the Support Vector methods that control the generalization ability when estimating functions using small sample sizes.
The second edition of the book contains three new chapters devoted to further development of learning theory and SVM techniques. These include:
* the theory of direct methods of learning based on solving multidimensional integral equations for density, conditional probability, and conditional density estimation
* a new inductive principle of learning.
Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists. Vladimir N. Vapnik is Technology Leader at AT&T Labs-Research and Professor at London University. He is one of the founders of statistical learning theory. 336 pp. English. Seller reference no. 9781441931603
Quantity available: 2
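As an aside for readers of the description above: the "risk functional", "empirical risk minimization" and "non-asymptotic bounds" it mentions can be stated compactly in Vapnik's standard notation. The following is a summary sketch of the theory, not text from any listing; here \ell denotes the sample size, h the VC dimension, and L a bounded loss.

% Risk functional over the unknown distribution F(x, y):
\[ R(\alpha) = \int L\bigl(y, f(x, \alpha)\bigr)\, dF(x, y) \]
% Empirical risk over \ell observed examples; the ERM principle
% selects the parameter \alpha that minimizes this average:
\[ R_{\mathrm{emp}}(\alpha) = \frac{1}{\ell} \sum_{i=1}^{\ell} L\bigl(y_i, f(x_i, \alpha)\bigr) \]
% A representative non-asymptotic bound (bounded loss), holding
% with probability at least 1 - \eta:
\[ R(\alpha) \le R_{\mathrm{emp}}(\alpha) + \sqrt{\frac{h\,(\ln(2\ell/h) + 1) - \ln(\eta/4)}{\ell}} \]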
Seller: GreatBookPricesUK, Woodford Green, United Kingdom
Condition: New. Seller reference no. 14401940-n
Quantity available: more than 20
Seller: GreatBookPrices, Columbia, MD, United States
Condition: New. Seller reference no. 14401940-n
Quantity available: more than 20
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Printed after ordering; new stock. (Synopsis as above.) Seller reference no. 9781441931603
Quantity available: 1
Seller: buchversandmimpf2000, Emtmannsberg, Bavaria, Germany
Paperback. Condition: New. This item is a print-on-demand title; new stock. (Synopsis as above.) Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg. 336 pp. English. Seller reference no. 9781441931603
Quantity available: 1
Seller: California Books, Miami, FL, United States
Condition: New. Seller reference no. I-9781441931603
Quantity available: more than 20
Seller: Lucky's Textbooks, Dallas, TX, United States
Condition: New. Seller reference no. ABLIING23Mar2411530294748
Quantity available: more than 20
Seller: GreatBookPricesUK, Woodford Green, United Kingdom
Condition: As New. Unread book in perfect condition. Seller reference no. 14401940
Quantity available: more than 20