
E-book, English, Volume 11, 271 pages, Web PDF

Series: Machine Intelligence and Pattern Recognition

Sethi / Jain: Artificial Neural Networks and Statistical Pattern Recognition

Old and New Connections
1st edition, 2014
ISBN: 978-1-4832-9787-3
Publisher: Elsevier Science & Techn.
Format: PDF
Copy protection: PDF watermark




With the growing complexity of pattern recognition problems being solved using artificial neural networks (ANNs), many ANN researchers are grappling with design issues such as the size of the network, the number of training patterns, and performance assessment and bounds. These researchers are continually rediscovering that many learning procedures lack the scaling property: the procedures simply fail, or yield unsatisfactory results, when applied to larger problems. Phenomena like these are very familiar to researchers in statistical pattern recognition (SPR), where the curse of dimensionality is a well-known dilemma. Issues related to training and test sample sizes, feature space dimensionality, and the discriminatory power of different classifier types have all been studied extensively in the SPR literature. It appears, however, that many ANN researchers working on pattern recognition problems are not aware of the ties between their field and SPR, and are therefore unable to exploit work that has already been done in SPR. Similarly, many pattern recognition and computer vision researchers do not realize the potential of the ANN approach for problems such as feature extraction, segmentation, and object recognition. The present volume is designed as a contribution to greater interaction between the ANN and SPR research communities.
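The finite-sample and dimensionality effects mentioned above (studied formally in Chapter 3 of the contents below) can be seen even in a toy experiment. The following short Python sketch is our own illustration, not taken from the book: it trains a simple nearest-class-mean classifier on a small, fixed training set drawn from two Gaussian classes and reports the test error as uninformative features are added. All names and parameter values here (n_train, sep, the list of dimensionalities) are arbitrary choices made for this illustration.

    # Toy illustration (ours, not from the book): with a fixed, small training set,
    # adding uninformative features degrades a simple classifier -- one face of the
    # curse of dimensionality discussed above. Only numpy is required.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_data(n_per_class, dim, n_informative=2, sep=1.5):
        """Two unit-variance Gaussian classes; only the first n_informative
        features separate them, the rest are pure noise."""
        shift = np.zeros(dim)
        shift[:n_informative] = sep
        x0 = rng.normal(size=(n_per_class, dim))
        x1 = rng.normal(size=(n_per_class, dim)) + shift
        X = np.vstack([x0, x1])
        y = np.array([0] * n_per_class + [1] * n_per_class)
        return X, y

    def nearest_mean_error(X_tr, y_tr, X_te, y_te):
        """Test error of a nearest-class-mean classifier (a minimal stand-in
        for any classifier trained on the small sample)."""
        means = np.stack([X_tr[y_tr == c].mean(axis=0) for c in (0, 1)])
        dist2 = ((X_te[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        return float((dist2.argmin(axis=1) != y_te).mean())

    n_train = 10            # deliberately small training set per class
    for dim in (2, 5, 10, 20, 50, 100):
        errs = [
            nearest_mean_error(*make_data(n_train, dim), *make_data(1000, dim))
            for _ in range(50)   # average over random training draws
        ]
        print(f"dim={dim:4d}   mean test error = {np.mean(errs):.3f}")

On a typical run the error is lowest with only the two informative features and rises steadily as noise features are added, even though the underlying class separation never changes; with a training set large relative to the dimensionality the effect largely disappears. This sample-size versus dimensionality trade-off is exactly what the SPR literature surveyed in this volume quantifies.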


Further information & material


Front Cover (p. 1)
Artificial Neural Networks and Statistical Pattern Recognition: Old and New Connections (p. 4)
Copyright Page (p. 5)
Table of Contents (p. 14)
FOREWORD (p. 6)
PREFACE (p. 10)

PART 1: ANN AND SPR RELATIONSHIP (p. 16)

CHAPTER 1. EVALUATION OF A CLASS OF PATTERN-RECOGNITION NETWORKS (p. 16)
    INTRODUCTION (p. 16)
    1. A CLASS OF PATTERN-RECOGNITION NETWORKS (p. 16)
    2. A REPRESENTATION OF THE JOINT DISTRIBUTION (p. 18)
    3. A CLASS OF CLASSIFICATION FUNCTIONS (p. 19)
    4. DETERMINATION OF COEFFICIENTS FROM SAMPLES (p. 23)
    5. SOME COMMENTS ON COMPARING DESIGN PROCEDURES (p. 23)
    6. SOME COMMENTS ON THE CHOICE OF OBSERVABLES, AND ON INVARIANCE PROPERTIES (p. 24)
    ACKNOWLEDGMENT (p. 24)
    REFERENCES (p. 25)

CHAPTER 2. LINKS BETWEEN ARTIFICIAL NEURAL NETWORKS (ANN) AND STATISTICAL PATTERN RECOGNITION (p. 26)
    1. Overview (p. 26)
    2. Neural Networks and Pattern Recognition – Generalities (p. 26)
    3. Some Examples of ANN Paradigms (p. 30)
    4. Dynamic Systems and Control (p. 41)
    5. Conclusions (p. 42)
    REFERENCES (p. 43)

CHAPTER 3. Small sample size problems in designing artificial neural networks (p. 48)
    Abstract (p. 48)
    1. INTRODUCTION (p. 48)
    2. FINITE SAMPLE PROBLEMS IN STATISTICAL PATTERN RECOGNITION (p. 50)
    3. THE CLASSIFICATION ACCURACY AND TRAINING TIME OF ARTIFICIAL NEURAL NETWORKS (p. 55)
    4. ESTIMATION OF THE CLASSIFICATION ERROR (p. 58)
    5. PEAKING IN THE CLASSIFICATION PERFORMANCE WITH INCREASE IN DIMENSIONALITY (p. 59)
    6. EFFECT OF THE NUMBER OF NEURONS IN THE HIDDEN LAYER ON THE PERFORMANCE OF ANN CLASSIFIERS (p. 61)
    7. DISCUSSION (p. 61)
    References (p. 62)

CHAPTER 4. On Tree Structured Classifiers (p. 66)
    Abstract (p. 66)
    1. INTRODUCTION (p. 66)
    2. DECISION RULES AND CLASSIFICATION TREES (p. 72)
    3. CLASSIFICATION TREE CONSTRUCTION AND ERROR RATE ESTIMATION (p. 74)
    4. TREE PRUNING ALGORITHMS (p. 79)
    5. EXPERIMENTAL RESULTS (p. 81)
    6. CONCLUSION (p. 84)
    REFERENCES (p. 84)

CHAPTER 5. Decision tree performance enhancement using an artificial neural network implementation (p. 86)
    Abstract (p. 86)
    1. INTRODUCTION (p. 86)
    2. DECISION TREE CLASSIFIER ISSUES (p. 87)
    3. MULTILAYER PERCEPTRON NETWORKS (p. 92)
    4. AN MLP IMPLEMENTATION OF TREE CLASSIFIERS (p. 94)
    5. TRAINING THE TREE MAPPED NETWORK (p. 95)
    6. PERFORMANCE EVALUATION (p. 96)
    7. CONCLUSIONS (p. 100)
    REFERENCES (p. 101)

PART 2: APPLICATIONS (p. 104)

CHAPTER 6. Bayesian and neural network pattern recognition: a theoretical connection and empirical results with handwritten characters (p. 104)
    Abstract (p. 104)
    1. Introduction (p. 104)
    2. Bayes Classifier (p. 105)
    3. Artificial Neural Networks and Back Propagation (p. 109)
    4. Relationship (p. 110)
    5. Experimental Results (p. 114)
    6. Discussion (p. 116)
    7. Conclusion (p. 118)
    8. Acknowledgements (p. 118)
    References (p. 118)

CHAPTER 7. Shape and Texture Recognition by a Neural Network (p. 124)
    1. INTRODUCTION (p. 124)
    2. ZERNIKE MOMENT FEATURES FOR SHAPE RECOGNITION (p. 126)
    3. RANDOM FIELD FEATURES FOR TEXTURE RECOGNITION (p. 129)
    4. MULTI-LAYER PERCEPTRON CLASSIFIER (p. 132)
    5. CONVENTIONAL STATISTICAL CLASSIFIERS (p. 134)
    6. EXPERIMENTAL STUDY ON SHAPE CLASSIFICATION (p. 135)
    7. EXPERIMENTAL STUDY ON TEXTURE CLASSIFICATION (p. 141)
    8. DISCUSSIONS AND CONCLUSIONS (p. 143)
    9. REFERENCES (p. 145)

CHAPTER 8. Neural Networks for Textured Image Processing (p. 148)
    Abstract (p. 148)
    1. INTRODUCTION (p. 148)
    2. DETECTION OF EDGES IN COMPUTER AND HUMAN VISION (p. 151)
    3. TEXTURE ANALYSIS USING MULTIPLE CHANNEL FILTERS (p. 153)
    4. NEURAL NETWORK APPROACHES (p. 158)
    5. CONCLUDING REMARKS (p. 167)
    6. References (p. 167)

CHAPTER 9. Markov Random Fields and Neural Networks with Applications to Early Vision Problems (p. 170)
    Abstract (p. 170)
    1. INTRODUCTION (p. 170)
    2. RELATIONSHIP BETWEEN THE TWO FIELDS (p. 172)
    3. APPLICATIONS (p. 175)
    4. CONCLUSIONS (p. 187)
    References (p. 188)

CHAPTER 10. Connectionist Models and their Application to Automatic Speech Recognition (p. 190)
    Abstract (p. 190)
    1. INTRODUCTION (p. 190)
    2. USE OF A-PRIORI KNOWLEDGE (p. 191)
    3. RECURRENT BACK-PROPAGATION NETWORKS (p. 193)
    4. FAST IMPLEMENTATION OF BP (p. 197)
    5. RADIAL BASIS FUNCTIONS MODELS (p. 199)
    6. COMBINING LOCAL AND DISTRIBUTED REPRESENTATIONS (p. 205)
    7. CONCLUSION (p. 206)
    REFERENCES (p. 207)

PART 3: IMPLEMENTATION ASPECTS (p. 210)

CHAPTER 11. DYNAMIC ASSOCIATIVE MEMORIES (p. 210)
    1. INTRODUCTION (p. 210)
    2. DAM ARCHITECTURES AND GENERAL MEMORY DYNAMICS (p. 211)
    3. CHARACTERISTICS OF A HIGH-PERFORMANCE DAM (p. 215)
    4. ASSOCIATIVE LEARNING IN A DAM (p. 217)
    5. RECORDING STRATEGIES (p. 225)
    6. DAM CAPACITY AND PERFORMANCE (p. 226)
    REFERENCES (p. 231)

CHAPTER 12. Optical Associative Memories (p. 234)
    Abstract (p. 234)
    1. INTRODUCTION (p. 234)
    2. BASICS OF ASSOCIATIVE MEMORIES (p. 235)
    3. FOUR ASSOCIATIVE MEMORY MODELS (p. 236)
    4. OUTER PRODUCT ASSOCIATIVE MEMORIES (p. 239)
    5. INNER PRODUCT ASSOCIATIVE MEMORIES (p. 241)
    6. HOLOGRAPHIC ASSOCIATIVE MEMORIES (p. 242)
    7. A COMPARISON OF FOUR ASSOCIATIVE MEMORY MODELS (p. 244)
    8. CONCLUSIONS (p. 251)
    Acknowledgements (p. 251)
    REFERENCES (p. 251)

CHAPTER 13. ARTIFICIAL NEURAL NETS IN MOS SILICON (p. 258)
    1. INTRODUCTION (p. 258)
    2. A DIGITAL CMOS VLSI IMPLEMENTATION OF A 4-NEURON CIRCUIT (p. 261)
    3. ANNs USING ANALOG VECTOR MULTIPLIER BASIC CELLS (p. 264)
    4. ANALOG VLSI IMPLEMENTATION OF SYNAPTIC WEIGHTS VIA SIMPLE MOSFETs (p. 276)
    5. CONCLUSIONS (p. 280)
    REFERENCES (p. 282)

AUTHOR INDEX (p. 286)


