For single-view unsupervised feature selection, we propose two novel methods, RUFS and AUFS. RUFS accounts for outliers in both label learning and feature selection, making it more robust than state-of-the-art methods. AUFS is designed to satisfy three desirable properties: (1) it is sparsity-inducing; (2) large weights and small weights are penalized equally; (3) it strikes a good balance between small loss on normal data examples and large loss on outliers. For multi-view unsupervised feature selection, we propose to directly use the raw features of the main view to learn pseudo cluster labels that also have maximal consensus with the other views; meanwhile, the discriminative features win out in the feature selection process and contribute more to label learning. For multi-view topic discovery, we propose a regularized, nonnegativity-constrained $l_{2,1}$-norm minimization framework as a systematic solution that integrates information propagation and mutual enhancement between data of different types, without supervision, in a principled way.
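The $l_{2,1}$-norm that the synopsis refers to underlies the sparsity-inducing property mentioned above. The following toy Python/NumPy sketch (illustrative function names, not the book's actual RUFS/AUFS algorithms) shows why penalizing this norm performs feature selection: it sums the $l_2$ norms of the rows of a weight matrix, so minimizing it drives entire rows to zero, and a zero row means the corresponding feature is discarded.

```python
import numpy as np

def l21_norm(W):
    """l2,1 norm: the sum of the l2 norms of the rows of W.

    Penalizing this norm pushes whole rows of W toward zero, which is
    the sparsity-inducing property: a zero row means the corresponding
    feature contributes nothing and can be dropped.
    """
    return np.sum(np.linalg.norm(W, axis=1))

def select_features(W, k):
    """Rank features by the l2 norm of their rows in W and keep the top k.

    W is an (n_features x n_clusters) weight matrix, e.g. learned by an
    l2,1-regularized model. This ranking step is common to this family
    of methods; the actual joint optimization in the book's methods is
    more involved.
    """
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]

# Toy example: row 1 has the largest norm, row 0 the smallest,
# so features 1 and 2 are selected.
W = np.array([[0.0, 0.1],
              [2.0, 1.0],
              [0.5, 0.5]])
print(select_features(W, 2))  # -> [1 2]
```

The row-wise grouping is the design choice that distinguishes the $l_{2,1}$-norm from a plain $l_1$ penalty: $l_1$ zeroes individual entries, while $l_{2,1}$ zeroes all weights of a feature at once, so selection decisions are made per feature rather than per entry.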
The information provided in the "Synopsis" section may refer to another edition of this title.
Dr. Qian received his bachelor's and master's degrees in control science from Tsinghua University, and his Ph.D. in computer science from the University of Illinois at Urbana-Champaign. He is a research scientist at Yahoo Labs in Sunnyvale, CA, and an editorial board member of Information Processing and Management (Elsevier).
The information provided in the "About the Book" section may refer to another edition of this title.
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand, which takes 3-4 days longer. New item. 144 pp. English. Seller reference 9783659805158
Quantity available: 2
Seller: moluna, Greven, Germany
Condition: New. This is a print-on-demand item and will be printed for you after ordering. Author: Mingjie Qian. Seller reference 158876809
Quantity available: More than 20
Seller: Revaluation Books, Exeter, United Kingdom
Paperback. Condition: Brand New. 144 pages. 8.66 x 5.91 x 0.33 inches. In stock. Seller reference 3659805157
Quantity available: 1
Seller: buchversandmimpf2000, Emtmannsberg, Bavaria, Germany
Paperback. Condition: New. This item is printed on demand. New item. VDM Verlag, Dudweiler Landstraße 99, 66123 Saarbrücken. 144 pp. English. Seller reference 9783659805158
Quantity available: 1
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Printed after ordering. New item. Seller reference 9783659805158
Quantity available: 1
Seller: preigu, Osnabrück, Germany
Paperback. Condition: New. Unsupervised feature analysis for high dimensional big data | Learning without teachers, an exploration of the world in unsupervised data | Mingjie Qian | Paperback | 144 pp. | English | 2016 | LAP LAMBERT Academic Publishing | EAN 9783659805158 | Responsible person for the EU: BoD - Books on Demand, In de Tarpen 42, 22848 Norderstedt, info[at]bod[dot]de | Seller: preigu. Seller reference 103936152
Quantity available: 5