Many theoretical and experimental studies have shown that a multiple classifier system is an effective technique for reducing prediction errors [9,10,11,19,20]. These studies identify mainly three elements that characterize a set of classifiers:

- The representation of the input (what each individual classifier receives by way of input).
- The architecture of the individual classifiers (algorithms and parametrization).
- The way these classifiers are made to reach a decision together.

It can be assumed that a combination method is efficient if each individual classifier makes errors 'in a different way', so that most of the classifiers can be expected to correct the mistakes that any individual one makes [1,19]. The term 'weak classifiers' refers to classifiers whose capacity has been reduced in some way so as to increase their prediction diversity: either their internal architecture is simple (e.g., they use single-layer perceptrons instead of more sophisticated neural networks), or they are prevented from using all the information available. Since each classifier sees different sections of the learning set, the error correlation among them is reduced. It has been shown that the majority vote is the best strategy if the errors among the classifiers are not correlated. Moreover, in real applications, the majority vote also appears to be as efficient as more sophisticated decision rules [2,13]. One method of generating a diverse set of classifiers is to perturb some aspect of the training input with respect to which the classifier is unstable. In the present paper, we study two distinct ways to create such weakened classifiers: learning set resampling (using the 'Bagging' approach [5]) and random feature subset selection (using 'MFS', a Multiple Feature Subsets approach [3]). Other recent, similar techniques are not discussed here, but they are also based on modifications to the training set and/or the feature set [7,8,12,21].
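The two weakening strategies above can be sketched in a few lines. The following is a minimal illustration, not the paper's exact experimental setup: decision trees stand in for the base classifiers, the ensemble sizes and the feature-subset size are arbitrary choices, and both ensembles are combined by plain majority vote as described.

```python
# Two ways to build "weak" classifiers, combined by majority vote:
#   1. Bagging: each classifier is trained on a bootstrap resample of the learning set.
#   2. MFS:     each classifier is trained on a random subset of the features.
# Illustrative sketch only; base learner and sizes are assumptions.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)

def bagging_ensemble(X, y, n_classifiers=11):
    """Each member sees a bootstrap resample (sampling with replacement)."""
    members = []
    for _ in range(n_classifiers):
        idx = rng.integers(0, len(X), size=len(X))
        members.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return members

def mfs_ensemble(X, y, n_classifiers=11, subset_size=2):
    """Each member sees a random subset of the features (MFS-style)."""
    members = []
    for _ in range(n_classifiers):
        feats = rng.choice(X.shape[1], size=subset_size, replace=False)
        members.append((feats, DecisionTreeClassifier().fit(X[:, feats], y)))
    return members

def majority_vote(predictions):
    """Per-sample majority over the members' predictions (ties -> lowest label)."""
    votes = np.stack(predictions)  # shape: (n_classifiers, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

bag = bagging_ensemble(X, y)
y_bag = majority_vote([m.predict(X) for m in bag])

mfs = mfs_ensemble(X, y)
y_mfs = majority_vote([m.predict(X[:, feats]) for feats, m in mfs])
```

Note how diversity arises differently in the two cases: Bagging varies which *examples* a member sees, while MFS varies which *features* it sees, so their errors decorrelate for different reasons.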