Nonsmooth optimization problems are generally considered more difficult than smooth problems. Yet there is an important class of nonsmooth problems that lies in between. In this book, we consider the problem of minimizing the sum of a smooth function and a (block-separable) convex function, with or without linear constraints. This problem includes as special cases bound-constrained optimization, smooth optimization with L_1-regularization, and linearly constrained smooth optimization, such as the large-scale quadratic programming problems arising in the training of support vector machines. We propose a block coordinate gradient descent method for solving this class of structured nonsmooth problems. The method is simple, highly parallelizable, and suited to large-scale applications in signal/image denoising, regression, and data mining/classification. We establish global convergence and, under a local Lipschitzian error bound assumption, a local linear rate of convergence for this method. Our numerical experience suggests that the method is effective in practice. This book will be useful to readers interested in solving large-scale optimization problems.
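To make the method concrete, here is a minimal sketch of a coordinate gradient descent iteration for the special case min_x 0.5*||Ax - b||^2 + lam*||x||_1, where the separable L_1 term makes each coordinate update a closed-form soft-thresholding step. The function names, the cyclic update order, and the exact coordinate minimization are illustrative assumptions for this sketch, not the book's exact algorithm, which handles general blocks, linear constraints, and a line search.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal map of t*|.|: argmin_u 0.5*(u - z)**2 + t*|u|."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def coordinate_gradient_descent(A, b, lam, n_iters=100):
    """Cyclic coordinate descent for 0.5*||A x - b||^2 + lam*||x||_1."""
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                    # residual A x - b, kept up to date
    col_sq = (A ** 2).sum(axis=0)    # ||A_j||^2: curvature of f along coordinate j
    for _ in range(n_iters):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            g = A[:, j] @ r                        # partial derivative of f w.r.t. x_j
            z = x[j] - g / col_sq[j]               # minimizer of the quadratic model of f
            x_new = soft_threshold(z, lam / col_sq[j])
            r += A[:, j] * (x_new - x[j])          # incremental residual update
            x[j] = x_new
    return x

# Example on hypothetical random data: x_hat is a sparse estimate.
A = np.random.randn(50, 100)
b = np.random.randn(50)
x_hat = coordinate_gradient_descent(A, b, lam=0.1)
```

Each full pass costs O(mn), the same order as one gradient evaluation, and coordinates can be grouped into blocks and updated independently, which is the parallelism the synopsis emphasizes.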
Sangwoon Yun: PhD in Mathematics from the University of Washington. Research interests: convex and nonsmooth optimization, variational analysis. Research Fellow at the National University of Singapore.
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand; allow 3-4 extra days. New stock. Synopsis as above. 112 pp. English. Seller reference 9783836478601
Quantity available: 2
Seller: moluna, Greven, Germany
Paperback. Condition: New. This is a print-on-demand item and is printed after you order. Author: Sangwoon Yun; bio and synopsis as above (truncated in the listing). Seller reference 5388240
Quantity available: More than 20
Seller: Revaluation Books, Exeter, United Kingdom
Paperback. Condition: Brand New. 112 pages. 8.66 x 5.91 x 0.26 inches. In stock. Seller reference 3836478609
Quantity available: 1
Seller: buchversandmimpf2000, Emtmannsberg, Bavaria, Germany
Paperback. Condition: New. This item is a print-on-demand title. New stock. Synopsis as above. Books on Demand GmbH, Überseering 33, 22297 Hamburg. 112 pp. English. Seller reference 9783836478601
Quantity available: 1
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Printed after ordering; new stock. Synopsis as above. Seller reference 9783836478601
Quantity available: 1
Seller: preigu, Osnabrück, Germany
Paperback. Condition: New. A Coordinate Gradient Descent Method for Structured Nonsmooth Optimization | Theory and Applications | Sangwoon Yun | Paperback | 112 pp. | English | 2010 | VDM Verlag Dr. Müller | EAN 9783836478601 | Responsible person for the EU: preigu GmbH & Co. KG, Lengericher Landstr. 19, 49078 Osnabrück, mail[at]preigu[dot]de | Supplier: preigu. Print on demand. Seller reference 106171326
Quantity available: 5