Lee / Verleysen | Nonlinear Dimensionality Reduction | E-Book | www2.sack.de

E-book, English, 309 pages

Series: Information Science and Statistics

Lee / Verleysen Nonlinear Dimensionality Reduction


1st edition, 2007
ISBN: 978-0-387-39351-3
Publisher: Springer US
Format: PDF
Copy protection: PDF watermark




This book describes established and advanced methods for reducing the dimensionality of numerical databases. Each description starts from intuitive ideas, develops the necessary mathematical details, and ends by outlining the algorithmic implementation. The text provides a lucid summary of facts and concepts relating to well-known methods as well as recent developments in nonlinear dimensionality reduction. All methods are described from a unifying point of view, which highlights their respective strengths and shortcomings. The presentation will appeal to statisticians, computer scientists, data analysts, and other practitioners with a basic background in statistics or computational learning.


Further Information & Material


1 Notations
2 Acronyms
3 High-Dimensional Data
3.1 Practical motivations
3.1.1 Fields of application
3.1.2 The goals to be reached
3.2 Theoretical motivations
3.2.1 How can we visualize high-dimensional spaces?
3.2.2 Curse of dimensionality and empty space phenomenon
3.3 Some directions to be explored
3.3.1 Relevance of the variables
3.3.2 Dependencies between the variables
3.4 About topology, spaces, and manifolds
3.5 Two benchmark manifolds
3.6 Overview of the next chapters
4 Characteristics of an Analysis Method
4.1 Purpose
4.2 Expected functionalities
4.2.1 Estimation of the number of latent variables
4.2.2 Embedding for dimensionality reduction
4.2.3 Embedding for latent variable separation
4.3 Internal characteristics
4.3.1 Underlying model
4.3.2 Algorithm
4.3.3 Criterion
4.4 Example: Principal component analysis
4.4.1 Data model of PCA
4.4.2 Criteria leading to PCA
4.4.3 Functionalities of PCA
4.4.4 Algorithms
4.4.5 Examples and limitations of PCA
4.5 Toward a categorization of DR methods
4.5.1 Hard vs. soft dimensionality reduction
4.5.2 Traditional vs. generative model
4.5.3 Linear vs. nonlinear model
4.5.4 Continuous vs. discrete model
4.5.5 Implicit vs. explicit mapping
4.5.6 Integrated vs. external estimation of the dimensionality
4.5.7 Layered vs. standalone embeddings
4.5.8 Single vs. multiple coordinate systems
4.5.9 Optional vs. mandatory vector quantization
4.5.10 Batch vs. online algorithm
4.5.11 Exact vs. approximate optimization
4.5.12 The type of criterion to be optimized
5 Estimation of the Intrinsic Dimension
5.1 Definition of the intrinsic dimension
5.2 Fractal dimensions
5.2.1 The q-dimension
5.2.2 Capacity dimension
5.2.3 Information dimension
5.2.4 Correlation dimension
5.2.5 Some inequalities
5.2.6 Practical estimation
5.3 Other dimension estimators
5.3.1 Local methods
5.3.2 Trial and error
5.4 Comparisons
5.4.1 Data sets
5.4.2 PCA estimator
5.4.3 Correlation dimension
5.4.4 Local PCA estimator
5.4.5 Trial and error
5.4.6 Concluding remarks
6 Distance Preservation
6.1 State of the art
6.2 Spatial distances
6.2.1 Metric space, distances, norms, and scalar product
6.2.2 Multidimensional scaling
6.2.3 Sammon's nonlinear mapping
6.2.4 Curvilinear component analysis
6.3 Graph distances
6.3.1 Geodesic distance and graph distance
6.3.2 Isomap
6.3.3 Geodesic NLM
6.3.4 Curvilinear distance analysis
6.4 Other distances
6.4.1 Kernel PCA
6.4.2 Semidefinite embedding
7 Topology Preservation
7.1 State of the art
7.2 Predefined lattice
7.2.1 Self-Organizing Maps
7.2.2 Generative Topographic Mapping
7.3 Data-driven lattice
7.3.1 Locally linear embedding
7.3.2 Laplacian eigenmaps
7.3.3 Isotop
8 Method Comparisons
8.1 Toy examples
8.1.1 The Swiss roll
8.1.2 Manifolds having essential loops or spheres
8.2 Cortex unfolding
8.3 Image processing
8.3.1 Artificial faces
8.3.2 Real faces
9 Conclusions
9.1 Summary of the book
9.1.1 The problem
9.1.2 A basic solution
9.1.3 Dimensionality reduction
9.1.4 Latent variable separation
9.1.5 Intrinsic dimensionality estimation
9.2 Data flow
9.2.1 Variable selection
9.2.2 Calibration
9.2.3 Linear dimensionality reduction
9.2.4 Nonlinear dimensionality reduction
9.2.5 Latent variable separation
9.2.6 Further processing
9.3 Model complexity
9.4 Taxonomy
9.4.1 Distance preservation
9.4.2 Topology preservation
9.5 Spectral methods
9.6 Nonspectral methods
9.7 Tentative methodology
9.8 Perspectives
10 Matrix Calculus
10.1 Singular value decomposition
10.2 Eigenvalue decomposition
10.3 Square root of a square matrix
11 Gaussian Variables
11.1 One-dimensional Gaussian distribution
11.2 Multidimensional Gaussian distribution
11.2.1 Uncorrelated Gaussian variables
11.2.2 Isotropic multivariate Gaussian distribution
11.2.3 Linearly mixed Gaussian variables
12 Optimization
12.1 Newton's method
12.1.1 Finding extrema
12.1.2 Multivariate version
12.2 Gradient ascent/descent
12.2.1 Stochastic gradient descent
13 Vector Quantization
13.1 Classical techniques
13.2 Competitive learning
13.3 Taxonomy
13.4 Initialization and "dead units"
14 Graph Building
14.1 Without vector quantization
14.1.1 K-rule
14.1.2 ε-rule
14.1.3 τ-rule
14.2 With vector quantization
14.2.1 Data rule
14.2.2 Histogram rule
15 Implementation Issues
15.1 Dimension estimation
15.1.1 Capacity dimension
15.1.2 Correlation dimension
15.2 Computation of the closest point(s)
15.3 Graph distances
16 References
17 Index


