Kong / Hu / Duan | Principal Component Analysis Networks and Algorithms | E-Book | www2.sack.de

E-book, English, 339 pages

Kong / Hu / Duan Principal Component Analysis Networks and Algorithms


1st edition, 2017
ISBN: 978-981-10-2915-8
Publisher: Springer Nature Singapore
Format: PDF
Copy protection: PDF watermark




This book not only provides a comprehensive introduction to neural-based PCA methods in control science, but also presents many novel PCA algorithms together with their extensions and generalizations, e.g., dual-purpose algorithms, coupled PCA, generalized eigen decomposition (GED), and neural-based SVD algorithms. It also discusses in detail various methods for analyzing the convergence, stability, and self-stabilizing properties of these algorithms, and introduces the deterministic discrete-time (DDT) system method for analyzing the convergence of PCA/MCA algorithms. Readers should be familiar with numerical analysis and the fundamentals of statistics, such as the basics of least squares and stochastic algorithms. Although the book focuses on neural networks, it presents only their learning laws, which are simply iterative algorithms; therefore, no prior knowledge of neural networks is required. The book will be of interest, and serve as a reference source, to researchers and students in applied mathematics, statistics, engineering, and other related fields.
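To illustrate what a neural PCA "learning law" looks like as an iterative algorithm, here is a minimal sketch of Oja's rule (covered in Section 3.3.2 of the book). The synthetic data, learning rate, and iteration count below are illustrative assumptions, not taken from the book itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data whose principal component is known in advance.
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])                      # true covariance matrix
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

w = rng.normal(size=2)                          # random initial weight vector
eta = 0.01                                      # learning rate (assumed)
for x in X:
    y = w @ x                                   # neuron output y = w^T x
    w += eta * y * (x - y * w)                  # Oja's learning rule

# w converges (up to sign) to the unit-norm dominant eigenvector of C.
v = np.linalg.eigh(C)[1][:, -1]                 # reference eigenvector
print(abs(w @ v))                               # close to 1.0
```

The single update line is the entire "network": no layers or activation functions are needed, which is why the book can treat such algorithms without assuming prior neural-network knowledge.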

Xiangyu Kong received the B.S. degree in optical engineering from the Beijing Institute of Technology, China, in 1990, and the Ph.D. degree in control engineering from Xi'an Jiaotong University, China, in 2005. He is currently an associate professor in the Department of Control Engineering at Xi'an Institute of Hi-Tech. His research interests include adaptive signal processing, neural networks, and feature extraction. He has published two monographs (both as first author) and more than 60 papers, of which nearly 20 appeared in premier journals including IEEE Transactions on Signal Processing, IEEE Transactions on Neural Networks and Learning Systems, IEEE Signal Processing Letters, and Neural Networks. He has led two projects funded by the National Natural Science Foundation of China.

Changhua Hu is currently a professor in the Department of Control Engineering at Xi'an Institute of Hi-Tech. His research interests include fault diagnosis in control systems, fault prognostics, and predictive maintenance. He has published three monographs and more than 200 papers in premier journals, including IEEE Transactions and EJOR. In 2010, he received the National Science Fund for Distinguished Young Scholars. He was selected as a national-level candidate of the 'New Century BaiQianWan Talents Program' and named a National Middle-aged and Young Expert with Outstanding Contributions in 2012. In 2013, he was appointed a Cheung Kong Scholar.
Zhansheng Duan received the B.S. and Ph.D. degrees from Xi'an Jiaotong University, China, in 1999 and 2005, respectively, both in electrical engineering. He also received a Ph.D. degree in electrical engineering from the University of New Orleans in 2010. From January 2010 to April 2010, he worked as an assistant research professor in the Department of Computer Science, University of New Orleans. In July 2010, he joined the Center for Information Engineering Science Research, Xi'an Jiaotong University, as an associate professor. His research interests include estimation and detection theory, target tracking, information fusion, nonlinear filtering, and performance evaluation. Dr. Duan has co-authored one monograph, Multisource Information Fusion (Tsinghua University Press, 2006), and 50 journal and conference papers. He is a member of the International Society of Information Fusion (ISIF) and the Eta Kappa Nu honor society, and is listed in Who's Who in America 2015.


Further Information & Material


Preface (p. 6)
  Aim of This Book (p. 6)
  Novel Algorithms and Extensions (p. 7)
  Prerequisites (p. 8)
  Outline of the Book (p. 8)
  Suggested Sequence of Reading (p. 10)
Acknowledgments (p. 11)
Contents (p. 12)
About the Authors (p. 18)
1 Introduction (p. 20)
  1.1 Feature Extraction (p. 20)
    1.1.1 PCA and Subspace Tracking (p. 20)
    1.1.2 PCA Neural Networks (p. 21)
    1.1.3 Extension or Generalization of PCA (p. 23)
  1.2 Basis for Subspace Tracking (p. 24)
    1.2.1 Concept of Subspace (p. 24)
    1.2.2 Subspace Tracking Method (p. 27)
  1.3 Main Features of This Book (p. 29)
  1.4 Organization of This Book (p. 30)
  References (p. 31)
2 Matrix Analysis Basics (p. 36)
  2.1 Introduction (p. 36)
  2.2 Singular Value Decomposition (p. 37)
    2.2.1 Theorem and Uniqueness of SVD (p. 37)
    2.2.2 Properties of SVD (p. 39)
  2.3 Eigenvalue Decomposition (p. 41)
    2.3.1 Eigenvalue Problem and Eigen Equation (p. 41)
    2.3.2 Eigenvalue and Eigenvector (p. 42)
    2.3.3 Eigenvalue Decomposition of Hermitian Matrix (p. 47)
    2.3.4 Generalized Eigenvalue Decomposition (p. 49)
  2.4 Rayleigh Quotient and Its Characteristics (p. 53)
    2.4.1 Rayleigh Quotient (p. 53)
    2.4.2 Gradient and Conjugate Gradient Algorithm for RQ (p. 54)
    2.4.3 Generalized Rayleigh Quotient (p. 55)
  2.5 Matrix Analysis (p. 57)
    2.5.1 Differential and Integral of Matrix with Respect to Scalar (p. 57)
    2.5.2 Gradient of Real Function with Respect to Real Vector (p. 58)
    2.5.3 Gradient Matrix of Real Function (p. 59)
    2.5.4 Gradient Matrix of Trace Function (p. 61)
    2.5.5 Gradient Matrix of Determinant (p. 62)
    2.5.6 Hessian Matrix (p. 64)
  2.6 Summary (p. 64)
  References (p. 65)
3 Neural Networks for Principal Component Analysis (p. 66)
  3.1 Introduction (p. 66)
  3.2 Review of Neural-Based PCA Algorithms (p. 67)
  3.3 Neural-Based PCA Algorithms Foundation (p. 67)
    3.3.1 Hebbian Learning Rule (p. 67)
    3.3.2 Oja's Learning Rule (p. 69)
  3.4 Hebbian/Anti-Hebbian Rule-Based Principal Component Analysis (p. 70)
    3.4.1 Subspace Learning Algorithms (p. 71)
      3.4.1.1 Symmetrical Subspace Learning Algorithm (p. 71)
      3.4.1.2 Weighted Subspace Learning Algorithm (p. 72)
    3.4.2 Generalized Hebbian Algorithm (p. 72)
    3.4.3 Learning Machine for Adaptive Feature Extraction via PCA (p. 73)
    3.4.4 The Dot-Product-Decorrelation Algorithm (DPD) (p. 74)
    3.4.5 Anti-Hebbian Rule-Based Principal Component Analysis (p. 74)
      3.4.5.1 Rubner-Tavan PCA Algorithm (p. 75)
      3.4.5.2 APEX Algorithm (p. 75)
  3.5 Least Mean Squared Error-Based Principal Component Analysis (p. 76)
    3.5.1 Least Mean Square Error Reconstruction Algorithm (LMSER) (p. 77)
    3.5.2 Projection Approximation Subspace Tracking Algorithm (PAST) (p. 78)
    3.5.3 Robust RLS Algorithm (RRLSA) (p. 79)
  3.6 Optimization-Based Principal Component Analysis (p. 80)
    3.6.1 Novel Information Criterion (NIC) Algorithm (p. 80)
    3.6.2 Coupled Principal Component Analysis (p. 81)
  3.7 Nonlinear Principal Component Analysis (p. 82)
    3.7.1 Kernel Principal Component Analysis (p. 83)
    3.7.2 Robust/Nonlinear Principal Component Analysis (p. 85)
    3.7.3 Autoassociative Network-Based Nonlinear PCA (p. 87)
  3.8 Other PCA or Extensions of PCA (p. 87)
  3.9 Summary (p. 89)
  References (p. 90)
4 Neural Networks for Minor Component Analysis (p. 93)
  4.1 Introduction (p. 93)
  4.2 Review of Neural Network-Based MCA Algorithms (p. 94)
    4.2.1 Extracting the First Minor Component (p. 95)
    4.2.2 Oja's Minor Subspace Analysis (p. 97)
    4.2.3 Self-stabilizing MCA (p. 97)
    4.2.4 Orthogonal Oja Algorithm (p. 98)
    4.2.5 Other MCA Algorithms (p. 98)
  4.3 MCA EXIN Linear Neuron (p. 99)
    4.3.1 The Sudden Divergence (p. 99)
    4.3.2 The Instability Divergence (p. 101)
    4.3.3 The Numerical Divergence (p. 101)
  4.4 Novel Self-stabilizing MCA Linear Neurons (p. 102)
    4.4.1 A Self-stabilizing Algorithm for Tracking One MC (p. 102)
    4.4.2 MS Tracking Algorithm (p. 108)
    4.4.3 Computer Simulations (p. 112)
  4.5 Total Least Squares Problem Application (p. 116)
    4.5.1 A Novel Neural Algorithm for Total Least Squares Filtering (p. 116)
    4.5.2 Computer Simulations (p. 123)
  4.6 Summary (p. 124)
  References (p. 125)
5 Dual Purpose for Principal and Minor Component Analysis (p. 129)
  5.1 Introduction (p. 129)
  5.2 Review of Neural Network-Based Dual-Purpose Methods (p. 131)
    5.2.1 Chen's Unified Stabilization Approach (p. 131)
    5.2.2 Hasan's Self-normalizing Dual Systems (p. 132)
    5.2.3 Peng's Unified Learning Algorithm to Extract Principal and Minor Components (p. 135)
    5.2.4 Manton's Dual-Purpose Principal and Minor Component Flow (p. 136)
  5.3 A Novel Dual-Purpose Method for Principal and Minor Subspace Tracking (p. 137)
    5.3.1 Preliminaries (p. 137)
      5.3.1.1 Definitions and Properties (p. 137)
      5.3.1.2 Conventional Formulation for PSA or MSA (p. 138)
    5.3.2 A Novel Information Criterion and Its Landscape (p. 140)
      5.3.2.1 A Novel Criterion for PSA and MSA (p. 140)
      5.3.2.2 Landscape of Nonquadratic Criteria (p. 140)
    5.3.3 Dual-Purpose Subspace Gradient Flow (p. 145)
      5.3.3.1 Dual-Purpose Gradient Flow (p. 145)
      5.3.3.2 Convergence Analysis (p. 146)
      5.3.3.3 Self-stability Property Analysis (p. 148)
    5.3.4 Global Convergence Analysis (p. 149)
    5.3.5 Numerical Simulations (p. 150)
      5.3.5.1 Self-stabilizing Property and Convergence (p. 150)
      5.3.5.2 The Contrasts with Other Algorithms (p. 152)
      5.3.5.3 Examples from Practical Applications of Our Unified Algorithm (p. 153)
  5.4 Another Novel Dual-Purpose Algorithm for Principal and Minor Subspace Analysis (p. 156)
    5.4.1 The Criterion for PSA and MSA and Its Landscape (p. 157)
    5.4.2 Dual-Purpose Algorithm for PSA and MSA (p. 159)
    5.4.3 Experimental Results (p. 160)
      5.4.3.1 Simulation Experiment (p. 160)
      5.4.3.2 Real Application Experiment (p. 161)
  5.5 Summary (p. 164)
  References (p. 164)
6 Deterministic Discrete-Time System for the Analysis of Iterative Algorithms (p. 167)
  6.1 Introduction (p. 167)
  6.2 Review of Performance Analysis Methods for Neural Network-Based PCA Algorithms (p. 168)
    6.2.1 Deterministic Continuous-Time System Method (p. 168)
    6.2.2 Stochastic Discrete-Time System Method (p. 169)
    6.2.3 Lyapunov Function Approach (p. 172)
    6.2.4 Deterministic Discrete-Time System Method (p. 173)
  6.3 DDT System of a Novel MCA Algorithm (p. 173)
    6.3.1 Self-stabilizing MCA Extraction Algorithms (p. 173)
    6.3.2 Convergence Analysis via DDT System (p. 174)
    6.3.3 Computer Simulations (p. 183)
  6.4 DDT System of a Unified PCA and MCA Algorithm (p. 185)
    6.4.1 Introduction (p. 186)
    6.4.2 A Unified Self-stabilizing Algorithm for PCA and MCA (p. 186)
    6.4.3 Convergence Analysis (p. 187)
    6.4.4 Computer Simulations (p. 198)
  6.5 Summary (p. 200)
  References (p. 201)
7 Generalized Principal Component Analysis (p. 203)
  7.1 Introduction (p. 203)
  7.2 Review of Generalized Feature Extraction Algorithms (p. 205)
    7.2.1 Mathew's Quasi-Newton Algorithm for the Generalized Symmetric Eigenvalue Problem (p. 205)
    7.2.2 Self-organizing Algorithms for Generalized Eigen Decomposition (p. 207)
    7.2.3 Fast RLS-like Algorithm for Generalized Eigen Decomposition (p. 208)
    7.2.4 Generalized Eigenvector Extraction Algorithm Based on the RLS Method (p. 209)
    7.2.5 Fast Adaptive Algorithm for the Generalized Symmetric Eigenvalue Problem (p. 212)
    7.2.6 Fast Generalized Eigenvector Tracking Based on the Power Method (p. 214)
    7.2.7 Generalized Eigenvector Extraction Algorithm Based on the Newton Method (p. 216)
    7.2.8 Online Algorithms for Extracting Minor Generalized Eigenvectors (p. 218)
  7.3 A Novel Minor Generalized Eigenvector Extraction Algorithm (p. 220)
    7.3.1 Algorithm Description (p. 221)
    7.3.2 Self-stabilizing Analysis (p. 222)
    7.3.3 Convergence Analysis (p. 223)
    7.3.4 Computer Simulations (p. 231)
  7.4 Novel Multiple GMC Extraction Algorithms (p. 235)
    7.4.1 An Inflation Algorithm for Multiple GMC Extraction (p. 235)
    7.4.2 A Weighted Information Criterion and Corresponding Multiple GMC Extraction (p. 238)
    7.4.3 Simulations and Application Experiments (p. 246)
  7.5 Summary (p. 248)
  References (p. 249)
8 Coupled Principal Component Analysis (p. 252)
  8.1 Introduction (p. 252)
  8.2 Review of Coupled Principal Component Analysis (p. 254)
    8.2.1 Moller's Coupled PCA Algorithm (p. 254)
    8.2.2 Nguyen's Coupled Generalized Eigenpair Extraction Algorithm (p. 255)
    8.2.3 Coupled Singular Value Decomposition of a Cross-Covariance Matrix (p. 260)
  8.3 Unified and Coupled Algorithm for Minor and Principal Eigenpair Extraction (p. 260)
    8.3.1 Coupled Dynamical System (p. 261)
    8.3.2 The Unified and Coupled Learning Algorithms (p. 263)
      8.3.2.1 Coupled MCA Algorithms (p. 263)
      8.3.2.2 Coupled PCA Algorithms (p. 264)
      8.3.2.3 Multiple Eigenpair Estimation (p. 265)
    8.3.3 Analysis of Convergence and Self-stabilizing Property (p. 267)
    8.3.4 Simulation Experiments (p. 269)
  8.4 Adaptive Coupled Generalized Eigenpair Extraction Algorithms (p. 274)
    8.4.1 A Coupled Generalized System for GMCA and GPCA (p. 274)
    8.4.2 Adaptive Implementation of Coupled Generalized Systems (p. 279)
    8.4.3 Convergence Analysis (p. 282)
    8.4.4 Numerical Examples (p. 288)
  8.5 Summary (p. 295)
  References (p. 295)
9 Singular Feature Extraction and Its Neural Networks (p. 297)
  9.1 Introduction (p. 297)
  9.2 Review of Cross-Correlation Feature Methods (p. 299)
    9.2.1 Cross-Correlation Neural Network Model and Deflation Method (p. 299)
    9.2.2 Parallel SVD Learning Algorithms on the Double Stiefel Manifold (p. 302)
    9.2.3 Double Generalized Hebbian Algorithm (DGHA) for SVD (p. 304)
    9.2.4 Cross-Associative Neural Network for SVD (CANN) (p. 305)
    9.2.5 Coupled SVD of a Cross-Covariance Matrix (p. 307)
      9.2.5.1 Single-Component Learning Rules (p. 308)
      9.2.5.2 Multiple-Component Learning Rules (p. 309)
  9.3 An Effective Neural Learning Algorithm for Extracting Cross-Correlation Features (p. 310)
    9.3.1 Preliminaries (p. 311)
      9.3.1.1 Definitions and Properties (p. 311)
      9.3.1.2 Some Formulations Relative to PSS (p. 311)
    9.3.2 Novel Information Criterion Formulation for PSS (p. 313)
      9.3.2.1 Novel Information Criterion Formulation for PSS (p. 313)
      9.3.2.2 Landscape of Nonquadratic Criterion (p. 314)
      9.3.2.3 Remarks and Comparisons (p. 320)
    9.3.3 Adaptive Learning Algorithm and Performance Analysis (p. 321)
      9.3.3.1 Adaptive Learning Algorithm (p. 321)
      9.3.3.2 Convergence Analysis (p. 322)
      9.3.3.3 Self-Stability Property Analysis (p. 323)
    9.3.4 Computer Simulations (p. 324)
  9.4 Coupled Cross-Correlation Neural Network Algorithm for Principal Singular Triplet Extraction of a Cross-Covariance Matrix (p. 327)
    9.4.1 A Novel Information Criterion and a Coupled System (p. 328)
    9.4.2 Online Implementation and Stability Analysis (p. 331)
    9.4.3 Simulation Experiments (p. 332)
      9.4.3.1 Experiment 1 (p. 332)
      9.4.3.2 Experiment 2 (p. 334)
  9.5 Summary (p. 336)


