Please note that the content of this book primarily consists of articles available from Wikipedia or other free sources online. Vector quantization is a classical quantization technique from signal processing which allows the modeling of probability density functions by the distribution of prototype vectors. It was originally used for data compression. It works by dividing a large set of points (vectors) into groups having approximately the same number of points closest to them. Each group is represented by its centroid point, as in k-means and some other clustering algorithms. The density matching property of vector quantization is powerful, especially for identifying the density of large and high-dimensional data. Since data points are represented by the index of their closest centroid, commonly occurring data have low error, and rare data have high error. This is why VQ is suitable for lossy data compression. It can also be used for lossy data correction and density estimation. Vector quantization is based on the competitive learning paradigm, so it is closely related to the self-organizing map model.
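Since the synopsis describes the mechanism only in prose, the following is a minimal illustrative sketch in Python (using only NumPy; the function names and parameters are hypothetical, not from the book) of how a VQ codebook can be built with k-means-style iterations and how data are then represented by the indices of their nearest codewords:

```python
import numpy as np

def train_codebook(data, n_codewords, n_iters=20, seed=0):
    """Illustrative Lloyd/k-means iterations to build a VQ codebook."""
    rng = np.random.default_rng(seed)
    # Initialize codewords by sampling distinct training vectors.
    codebook = data[rng.choice(len(data), size=n_codewords, replace=False)]
    for _ in range(n_iters):
        # Assignment step: index of the nearest codeword for every vector.
        dists = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each codeword to the centroid of its group.
        for k in range(n_codewords):
            members = data[labels == k]
            if len(members) > 0:
                codebook[k] = members.mean(axis=0)
    return codebook

def encode(data, codebook):
    """Quantize: replace each vector by the index of its closest codeword."""
    dists = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1)

def decode(indices, codebook):
    """Reconstruct (lossily) by looking up the codeword for each index."""
    return codebook[indices]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    vectors = rng.normal(size=(1000, 2))        # toy 2-D data (assumed example)
    codebook = train_codebook(vectors, n_codewords=16)
    indices = encode(vectors, codebook)         # compressed representation
    reconstructed = decode(indices, codebook)
    mse = np.mean((vectors - reconstructed) ** 2)
    print(f"mean squared quantization error: {mse:.4f}")
```

In this sketch the compressed form is just the index array, which is why frequently occurring vectors (well covered by nearby codewords) incur low reconstruction error while rare outliers incur higher error, matching the density-matching behaviour described above.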
The information provided in the "Synopsis" section may refer to a different edition of this title.
Seller: preigu, Osnabrück, Germany
Paperback. Condition: New. Vector Quantization | Lambert M. Surhone (et al.) | Paperback | English | Betascript Publishing | EAN 9786130354572 | Responsible party for the EU: preigu GmbH & Co. KG, Lengericher Landstr. 19, 49078 Osnabrück, mail[at]preigu[dot]de | Supplier: preigu. Seller reference no. 113214623
Quantity available: 5