Kohonen | Self-Organizing Maps | E-Book | www2.sack.de

E-book, English, Volume 30, 426 pages, eBook

Series: Springer Series in Information Sciences

Kohonen Self-Organizing Maps


2nd edition, 1997
ISBN: 978-3-642-97966-8
Publisher: Springer
Format: PDF
Copy protection: 1 - PDF Watermark




The second, revised edition of this book was suggested by the impressive sales of the first edition. Fortunately, this enabled us to incorporate important new results that had just been obtained. The ASSOM (Adaptive-Subspace SOM) is a new architecture in which invariant-feature detectors emerge in an unsupervised learning process. Its basic principle was already introduced in the first edition, but the motivation and theoretical discussion in the second edition are more thorough and consistent. New material has been added to Sect. 5.9, and this section has been totally rewritten. Correspondingly, Sect. 1.4, which deals with adaptive subspace classifiers in general and constitutes the prerequisite for the ASSOM principle, has also been extended and totally rewritten. Another new SOM development is the WEBSOM, a two-layer architecture intended for the organization of very large collections of full-text documents such as those found on the Internet and World Wide Web. This architecture was published after the first edition came out. The idea and results seemed so important that the new Sect. 7.8 has been added to the second edition. Another addition that contains new results is Sect. 3.15, which describes the acceleration of the computation of very large SOMs. It was also felt that Chap. 7, which deals with SOM applications, had to be extended.


Target Audience


Research


Authors/Editors


Further Information & Material


1. Mathematical Preliminaries.- 1.1 Mathematical Concepts and Notations.- 1.1.1 Vector Space Concepts.- 1.1.2 Matrix Notations.- 1.1.3 Further Properties of Matrices.- 1.1.4 On Matrix Differential Calculus.- 1.2 Distance Measures for Patterns.- 1.2.1 Measures of Similarity and Distance in Vector Spaces.- 1.2.2 Measures of Similarity and Distance Between Symbol Strings.- 1.3 Statistical Pattern Recognition.- 1.3.1 Supervised Classification.- 1.3.2 Unsupervised Classification.- 1.4 The Subspace Methods of Classification.- 1.4.1 The Basic Subspace Method.- 1.4.2 Adaptation of a Subspace to Input Statistics.- 1.4.3 The Learning Subspace Method (LSM).- 1.5 The Robbins-Monro Stochastic Approximation.- 1.5.1 The Adaptive Linear Element.- 1.5.2 Vector Quantization.- 1.6 Dynamically Expanding Context.- 1.6.1 Setting Up the Problem.- 1.6.2 Automatic Determination of Context-Independent Productions.- 1.6.3 Conflict Bit.- 1.6.4 Construction of Memory for the Context-Dependent Productions.- 1.6.5 The Algorithm for the Correction of New Strings.- 1.6.6 Estimation Procedure for Unsuccessful Searches.- 1.6.7 Practical Experiments.
2. Justification of Neural Modeling.- 2.1 Models, Paradigms, and Methods.- 2.2 On the Complexity of Biological Nervous Systems.- 2.3 Relation Between Biological and Artificial Neural Networks.- 2.4 What Functions of the Brain Are Usually Modeled?.- 2.5 When Do We Have to Use Neural Computing?.- 2.6 Transformation, Relaxation, and Decoder.- 2.7 Categories of ANNs.- 2.8 Competitive-Learning Networks.- 2.9 Three Phases of Development of Neural Models.- 2.10 A Simple Nonlinear Dynamic Model of the Neuron.- 2.11 Learning Laws.- 2.11.1 Hebb’s Law.- 2.11.2 The Riccati-Type Learning Law.- 2.11.3 The PCA-Type Learning Law.- 2.12 Brain Maps.
3. The Basic SOM.- 3.1 The SOM Algorithm in the Euclidean Space.- 3.2 The “Dot-Product SOM”.- 3.3 Preliminary Demonstrations of Topology-Preserving Mappings.- 3.3.1 Ordering of Reference Vectors in the Input Space.- 3.3.2 Demonstrations of Ordering of Responses in the Output Plane.- 3.4 Basic Mathematical Approaches to Self-Organization.- 3.4.1 One-Dimensional Case.- 3.4.2 Constructive Proof of Ordering of Another One-Dimensional SOM.- 3.4.3 An Attempt to Justify the SOM Algorithm for General Dimensionalities.- 3.5 Initialization of the SOM Algorithms.- 3.6 On the “Optimal” Learning-Rate Factor.- 3.7 Effect of the Form of the Neighborhood Function.- 3.8 Magnification Factor.- 3.9 Practical Advice for the Construction of Good Maps.- 3.10 Examples of Data Analyses Implemented by the SOM.- 3.10.1 Attribute Maps with Full Data Matrix.- 3.10.2 Case Example of Attribute Maps Based on Incomplete Data Matrices (Missing Data): “Poverty Map”.- 3.11 Using Gray Levels to Indicate Clusters in the SOM.- 3.12 Derivation of the SOM Algorithm in the General Metric.- 3.13 What Kind of SOM Actually Ensues from the Distortion Measure?.- 3.14 Batch Computation of the SOM (“Batch Map”).- 3.15 Further Speedup of SOM Computation.- 3.15.1 Shortcut Winner Search.- 3.15.2 Increasing the Number of Units in the SOM.- 3.15.3 Smoothing.- 3.15.4 Combination of Smoothing, Lattice Growing, and SOM Algorithm.
4. Physiological Interpretation of SOM.- 4.1 Two Different Lateral Control Mechanisms.- 4.1.1 The WTA Function, Based on Lateral Activity Control.- 4.1.2 Lateral Control of Plasticity.- 4.2 Learning Equation.- 4.3 System Models of SOM and Their Simulations.- 4.4 Recapitulation of the Features of the Physiological SOM Model.
5. Variants of SOM.- 5.1 Overview of Ideas to Modify the Basic SOM.- 5.2 Adaptive Tensorial Weights.- 5.3 Tree-Structured SOM in Searching.- 5.4 Different Definitions of the Neighborhood.- 5.5 Neighborhoods in the Signal Space.- 5.6 Dynamical Elements Added to the SOM.- 5.7 Operator Maps.- 5.8 Supervised SOM.- 5.9 The Adaptive-Subspace SOM (ASSOM).- 5.9.1 The Problem of Invariant Features.- 5.9.2 Relation Between Invariant Features and Linear Subspaces.- 5.9.3 The ASSOM Algorithm.- 5.9.4 Derivation of the ASSOM Algorithm by Stochastic Approximation.- 5.9.5 ASSOM Experiments.- 5.10 Feedback-Controlled Adaptive-Subspace SOM (FASSOM).
6. Learning Vector Quantization.- 6.1 Optimal Decision.- 6.2 The LVQ1.- 6.3 The Optimized-Learning-Rate LVQ1 (OLVQ1).- 6.4 The LVQ2 (LVQ2.1).- 6.5 The LVQ3.- 6.6 Differences Between LVQ1, LVQ2 and LVQ3.- 6.7 General Considerations.- 6.8 The Hypermap-Type LVQ.- 6.9 The “LVQ-SOM”.
7. Applications.- 7.1 Preprocessing of Optic Patterns.- 7.1.1 Blurring.- 7.1.2 Expansion in Terms of Global Features.- 7.1.3 Spectral Analysis.- 7.1.4 Expansion in Terms of Local Features (Wavelets).- 7.1.5 Recapitulation of Features of Optic Patterns.- 7.2 Acoustic Preprocessing.- 7.3 Process and Machine Monitoring.- 7.3.1 Selection of Input Variables and Their Scaling.- 7.3.2 Analysis of Large Systems.- 7.4 Diagnosis of Speech Voicing.- 7.5 Transcription of Continuous Speech.- 7.6 Texture Analysis.- 7.7 Contextual Maps.- 7.7.1 Role-Based Semantic Map.- 7.7.2 Unsupervised Categorization of Phonemic Classes from Text.- 7.8 Organization of Large Document Files.- 7.9 Robot-Arm Control.- 7.9.1 Simultaneous Learning of Input and Output Parameters.- 7.9.2 Another Simple Robot-Arm Control.- 7.10 Telecommunications.- 7.10.1 Adaptive Detector for Quantized Signals.- 7.10.2 Channel Equalization in the Adaptive QAM.- 7.10.3 Error-Tolerant Transmission of Images by a Pair of SOMs.- 7.11 The SOM as an Estimator.- 7.11.1 Symmetric (Autoassociative) Mapping.- 7.11.2 Asymmetric (Heteroassociative) Mapping.
8. Hardware for SOM.- 8.1 An Analog Classifier Circuit.- 8.2 Fast Digital Classifier Circuits.- 8.3 SIMD Implementation of SOM.- 8.4 Transputer Implementation of SOM.- 8.5 Systolic-Array Implementation of SOM.- 8.6 The COKOS Chip.- 8.7 The TInMANN Chip.
9. An Overview of SOM Literature.- 9.1 General.- 9.2 Early Works on Competitive Learning.- 9.3 Status of the Mathematical Analyses.- 9.4 Survey of General Aspects of the SOM.- 9.4.1 General.- 9.4.2 Mathematical Derivations, Analyses, and Modifications of the SOM.- 9.5 Modifications and Analyses of LVQ.- 9.6 Survey of Diverse Applications of SOM.- 9.6.1 Machine Vision and Image Analysis.- 9.6.2 Optical Character and Script Reading.- 9.6.3 Speech Analysis and Recognition.- 9.6.4 Acoustic and Musical Studies.- 9.6.5 Signal Processing and Radar Measurements.- 9.6.6 Telecommunications.- 9.6.7 Industrial and Other Real-World Measurements.- 9.6.8 Process Control.- 9.6.9 Robotics.- 9.6.10 Chemistry.- 9.6.11 Physics.- 9.6.12 Electronic-Circuit Design.- 9.6.13 Medical Applications Without Image Processing.- 9.6.14 Data Processing.- 9.6.15 Linguistic and AI Problems.- 9.6.16 Mathematical Problems.- 9.6.17 Neurophysiological Research.- 9.7 Applications of LVQ.- 9.8 Survey of SOM and LVQ Implementations.- 9.9 New References in the Second Edition.- 9.9.1 Theory of the SOM.- 9.9.2 Hybridization of the SOM with Other Neural Networks.- 9.9.3 Learning Vector Quantization.- 9.9.4 Practical Applications.
10. Glossary of “Neural” Terms.
References.
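As a rough illustration of the basic SOM algorithm covered in Chap. 3 (the Euclidean-space formulation of Sect. 3.1), the following is a minimal sketch, not the book's reference implementation: it assumes a rectangular lattice, a Gaussian neighborhood function, and linearly decaying learning-rate and radius schedules, all of which are illustrative choices.

```python
import numpy as np

def train_som(data, grid_h=10, grid_w=10, n_iter=1000, seed=0):
    """Minimal SOM sketch: w_i <- w_i + alpha(t) * h(c, i; t) * (x - w_i)."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    # One reference vector per lattice node, randomly initialized.
    weights = rng.random((grid_h, grid_w, dim))
    # Lattice coordinates, used for neighborhood distances on the map.
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]
    coords = np.stack([ys, xs], axis=-1).astype(float)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        # Linearly decaying learning rate and neighborhood radius (assumed schedules).
        frac = t / n_iter
        alpha = 0.5 * (1.0 - frac)
        sigma = max(1.0, 0.5 * max(grid_h, grid_w) * (1.0 - frac))
        # Winner c: node whose reference vector is nearest to x in Euclidean distance.
        d = np.linalg.norm(weights - x, axis=-1)
        c = np.unravel_index(np.argmin(d), d.shape)
        # Gaussian neighborhood around the winner, measured on the lattice.
        lat = np.linalg.norm(coords - coords[c], axis=-1)
        h = np.exp(-(lat ** 2) / (2.0 * sigma ** 2))
        # Move all reference vectors toward x, weighted by the neighborhood.
        weights += alpha * h[..., None] * (x - weights)
    return weights
```

After training, nearby lattice nodes hold similar reference vectors, which is the topology-preserving property the book's demonstrations in Sect. 3.3 illustrate.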


