
E-book, English, 836 pages, Web PDF

Mäkisara / Simula / Kangas Artificial Neural Networks


1st edition 2014
ISBN: 978-1-4832-9800-9
Publisher: Elsevier Science & Techn.
Format: PDF
Copy protection: 1 - PDF Watermark




This two-volume proceedings compiles a selection of the research papers presented at ICANN-91. The scope of the volumes is interdisciplinary, ranging from mathematics and engineering to cognitive sciences and biology. European research is well represented. Volume 1 contains all the orally presented papers, including both invited talks and submitted papers. Volume 2 contains the plenary talks and the poster presentations.


Further information & material


1;Front Cover;1
2;Artificial Neural Networks;4
3;Copyright Page;5
4;Table of Contents;8
5;Part 1: Plenary Talks;22
5.1;CHAPTER 1. SELF-ORGANIZING MAPS: OPTIMIZATION APPROACHES;24
5.1.1;1. Introduction;24
5.1.2;2. Relation of the Self-Organizing Map Algorithm to Stochastic Approximation;26
5.1.3;3. Special Case: Optimal Recursive Expression for the Classical Vector Quantization;27
5.1.4;4. An Attempt to Derive the Self-Organizing Map Algorithm from an Error Functional;27
5.1.5;5. Numerical Simulations;29
5.1.6;6. Conclusions;30
5.1.7;References;31
5.1.8;Appendix;32
5.1.9;Acknowledgement;33
5.2;CHAPTER 2. CONNECTIONISM OR WEIGHTLESS NEUROCOMPUTING?;34
5.2.1;1. INTRODUCTION;34
5.2.2;2. CONNECTIONIST AND WEIGHTLESS FRAMEWORKS;34
5.2.3;3. THE GENERALISING RAM MODEL (G-RAM);37
5.2.4;4. THE GENERAL NEURAL UNIT (GNU);38
5.2.5;5. WEIGHTLESS NEUROCOMPUTING FUTURE RESEARCH ISSUES;40
5.2.6;6. SUMMARY;42
5.2.7;REFERENCES;43
6;Part 2: Mathematical Theories of Networks and Dynamical Systems;44
6.1;CHAPTER 3. PROBABILISTIC APPROACH FOR MULTICLASS CLASSIFICATION WITH NEURAL NETWORKS;46
6.1.1;1 Introduction;46
6.1.2;2 Proposed approach;47
6.1.3;3 Practical Reconstruction;48
6.1.4;4 Conclusions;49
6.1.5;References;49
6.2;CHAPTER 4. ON THE FEEDBACK PERCEPTRON;50
6.2.1;1. INTRODUCTION;50
6.2.2;2. STABILITY;51
6.2.3;3. SIMULATION EXAMPLE AND CONCLUSIONS;52
6.2.4;REFERENCES;53
6.3;CHAPTER 5. MATHEMATICAL ASPECTS OF NEURO-DYNAMICS FOR COMBINATORIAL OPTIMIZATION;54
6.3.1;1. INTRODUCTION;54
6.3.2;2. OBJECTIVE FUNCTION OF QUADRATIC FORM;54
6.3.3;3. DYNAMICAL SYSTEM FOR OPTIMIZATION PROBLEMS;55
6.3.4;4. INITIAL STATE SELECTION PROBLEM;56
6.3.5;5. CONCLUDING REMARKS;57
6.3.6;References;57
6.4;CHAPTER 6. THE QUANTITATIVE DESCRIPTION OF PLN NETWORK'S BEHAVIOR;58
6.4.1;1. Introduction;58
6.4.2;2. PLN Network;58
6.4.3;3. The Convergence Theorem of PLN Networks;59
6.4.4;References;61
6.5;CHAPTER 7. PREDICTING THE ANNEALING RANGE BY COMPUTING CRITICAL TEMPERATURE IN MEAN FIELD ANNEALING FOR THE TRAVELING SALESMAN PROBLEM;62
6.5.1;Abstract;62
6.5.2;Introduction;62
6.5.3;Experimental Results and Conclusions;65
6.5.4;References;67
6.6;CHAPTER 8. ORDERS OF APPROXIMATION OF NEURAL NETWORKS TO BRAIN STRUCTURE: LEVELS, MODULES AND COMPUTING POWER;68
6.6.1;1. INTRODUCTION;68
6.6.2;2. ELEMENTS AND STRUCTURES;68
6.6.3;3. COMPUTATIONAL POWER;70
6.6.4;4. CONCLUSION;71
6.6.5;5. REFERENCES;71
6.7;CHAPTER 9. A Provably Convergent Perceptron-Like Algorithm for Learning Hypercubic Decision Regions;72
6.7.1;1 Introduction;72
6.7.2;2 Notation;72
6.7.3;3 Algorithm Statement;73
6.7.4;4 Stability Proof;73
6.7.5;5 Implications of the stability;74
6.7.6;6 Conclusions;75
6.7.7;Acknowledgement;75
6.7.8;7 References;75
6.8;CHAPTER 10. Perceptron Learning with Reasonable Distributions of Examples;76
6.8.1;1 Introduction;76
6.8.2;2 PAC-Learnability;76
6.8.3;3 Learning with reasonable distributions;77
6.8.4;4 Perceptron learning;77
6.8.5;5 Directed search perceptron algorithms;78
6.8.6;Acknowledgements;79
6.8.7;References;79
6.9;CHAPTER 11. GLOBAL OPTIMIZATION BY REACTION-DIFFUSION SYSTEMS;80
6.10;CHAPTER 12. SELF-ORGANIZING MAP TRAINING USING DYNAMIC K-D TREES;84
6.10.1;1 Introduction;84
6.10.2;2 Training of Self-Organizing Feature Maps;84
6.10.3;3 The Dynamic K-d Tree;85
6.10.4;4 Experimental Results;85
6.10.5;5 Conclusions;87
6.10.6;References;87
6.11;CHAPTER 13. NETWORK CONFIGURATION AND INITIALIZATION USING MATHEMATICAL MORPHOLOGY: THEORETICAL STUDY OF MEASUREMENT FUNCTIONS;88
6.11.1;1 Introduction;88
6.11.2;2 Network equivalence;88
6.11.3;3 Dimension of the space of transformed images;89
6.11.4;4 Optimal size of neighborhood;90
6.11.5;5 Results on morphological measurements;90
6.11.6;6 Conclusion;91
6.11.7;7 Notations;91
6.11.8;References;91
6.12;CHAPTER 14. A GLOBAL APPROACH TO CLASSIFICATION: PROBABILISTIC ASPECTS;92
6.12.1;1 CLASSIFICATION;92
6.12.2;2 PROBABILISTIC APPROACH;92
6.12.3;3 PROPOSED METHODOLOGY;94
6.12.4;References;95
6.13;CHAPTER 15. A NETWORK FOR DISCRIMINANT ANALYSIS;96
6.13.1;ABSTRACT;96
6.13.2;1. Introduction;96
6.13.3;2. Network Model and Training;97
6.13.4;3. Discussion;99
6.13.5;ACKNOWLEDGEMENTS;99
6.13.6;REFERENCES;99
6.14;CHAPTER 16. ON THE NEURAL NETS DISCRETENESS ROLE BY THE TREATMENT OF IMAGES;100
6.14.1;REFERENCES;102
6.15;CHAPTER 17. AN INFORMATION THEORETICAL INTERPRETATION OF NEURONAL ACTIVITIES;104
6.15.1;1. Introduction;104
6.15.2;2. The definitions of Eó and Et;104
6.15.3;References;107
7;Part 3: Pattern Recognition and Signal Processing I;108
7.1;CHAPTER 18. CLASSIFICATION OF REMOTELY-SENSED SATELLITE IMAGES USING MULTI-LAYER PERCEPTRON NETWORKS;110
7.1.1;1. INTRODUCTION;110
7.1.2;2. SATELLITE IMAGERY AND GROUND TRUTH TRAINING DATA;110
7.1.3;3. IMPLEMENTATION OF NEURAL NETWORK CLASSIFIER;111
7.1.4;4. NETWORK ARCHITECTURES AND PERFORMANCE;111
7.1.5;5. DISCUSSION;113
7.1.6;REFERENCES;113
7.2;CHAPTER 19. NEURAL NETWORK FOR DIGITAL IMAGE ENHANCEMENT;114
7.2.1;1. THEORETICAL ASPECTS;114
7.2.2;2. REALTIME HISTOGRAM MODIFICATION;115
7.2.3;3. CONCLUSION;117
7.2.4;REFERENCES;117
7.3;CHAPTER 20. IMAGE PROCESSING BY NEURAL NETWORK IMPLEMENTATION OF 2-D;118
7.3.1;1. Introduction;118
7.3.2;2. Problem statement;118
7.3.3;3. Problem solution;119
7.3.4;4. Neural network solving Lyapunov equation;119
7.3.5;5. Conclusions;121
7.3.6;References;121
7.4;CHAPTER 21. GAP-REMOVAL IMAGE TRANSFORMATION FOR ALIAS;122
7.4.1;1. INTRODUCTION TO ALIAS;122
7.4.2;2. IMAGE TRANSFORMATION;123
7.4.3;3. PRELIMINARY RESULTS AND CONCLUSIONS;124
7.4.4;REFERENCES;125
7.5;CHAPTER 22. IMAGE SEGMENTATION USING 4 DIRECTION LINE-PROCESSES AND WINNER-TAKE-ALL;126
7.5.1;1 Introduction;126
7.5.2;2 Mean field techniques and winner-take-all;126
7.5.3;3 Four line-process model;127
7.5.4;4 Simulation;128
7.5.5;References;128
7.6;CHAPTER 23. CONSTRAINT SATISFACTION NEURAL NETWORKS FOR IMAGE SEGMENTATION;130
7.6.1;1. INTRODUCTION;130
7.6.2;2. CONSTRAINT SATISFACTION NEURAL NETWORKS;130
7.6.3;3. EXPERIMENTAL RESULTS;132
7.6.4;4. CONCLUDING REMARKS;132
7.6.5;REFERENCES;132
7.7;CHAPTER 24. A CUT-POINT RECOGNITION ALGORITHM USING PLN NODE;134
7.7.1;1. INTRODUCTION;134
7.7.2;2. THE CUT-POINT RECOGNITION METHOD;135
7.7.3;3. DISCUSSIONS AND CONCLUSIONS;136
7.7.4;REFERENCES;137
7.8;CHAPTER 25. A NOVEL CHARACTER RECOGNITION SYSTEM USING A CONTEXTUAL FEEDBACK CONNECTIONIST MODULE TO ENHANCE SYSTEM PERFORMANCE;138
7.8.1;1. INTRODUCTION;138
7.8.2;2. SYSTEM DESIGN;138
7.8.3;3. RESULTS AND DISCUSSION;140
7.8.4;4. CONCLUSION;141
7.8.5;Acknowledgements:;141
7.8.6;References;141
7.9;CHAPTER 26. NN AND HEURISTIC APPROACH TO CHARACTER RECOGNITION;142
7.9.1;1. INTRODUCTION;142
7.9.2;2. THE NEURAL NETWORK;142
7.9.3;3. THE COOPERATION BETWEEN THE NN AND AN HEURISTIC APPROACH;143
7.9.4;4. CONCLUSION;145
7.9.5;REFERENCES;145
7.10;CHAPTER 27. SIMILARITY-INVARIANT RECOGNITION OF VISUAL IMAGES WITH HELP OF KOHONEN'S MAPPING FORMATION ALGORITHM;146
7.10.1;1. INTRODUCTION;146
7.10.2;2. THE MAIN THEOREM;147
7.10.3;3. RECOGNITION CONSIDERATIONS;148
7.10.4;4. GENERALIZATIONS;148
7.10.5;REFERENCES;149
7.11;CHAPTER 28. ACQUIRED STRUCTURE, ADAPTED PARAMETERS: MODIFICATIONS OF THE NEOCOGNITRON;150
7.11.1;1. INTRODUCTION;150
7.11.2;2. OUTLINE OF THE NEOCOGNITRON;150
7.11.3;3. OPTIMIZATION;151
7.11.4;4. SELF ORGANISATION AND CONSTRUCTION PROCESS;151
7.11.5;5. SIMULATIONS;152
7.11.6;6. THRESHOLD CONTROL;152
7.11.7;7. CONCLUSION;153
7.12;CHAPTER 29. MODEL-BASED OBJECT RECOGNITION USING ARTIFICIAL NEURAL NETWORKS;154
7.12.1;1. INTRODUCTION;154
7.12.2;2. INVARIANT BOUNDARY REPRESENTATION;154
7.12.3;3. THE PREDICT BACK-PROPAGATION ALGORITHM;155
7.12.4;4. THE KOHONEN ALGORITHM;155
7.12.5;5. IMPLEMENTATION ISSUES AND SIMULATION RESULTS;155
7.12.6;6. COMPARISONS WITH CLASSICAL METHODS;157
7.12.7;7. CONCLUSIONS;157
7.12.8;REFERENCES;158
7.13;CHAPTER 30. AUTOMATIC CLASSIFICATION OF VISUAL EVOKED POTENTIALS BY FEEDFORWARD NEURAL NETWORKS;160
7.13.1;INTRODUCTION;160
7.13.2;SIGNAL RECORDING;161
7.13.3;VEP CLASSIFICATION BY HUMAN EXPERTS;161
7.13.4;VEP CLASSIFICATION BY NEURAL NETWORKS;162
7.13.5;DISCUSSION;163
7.13.6;ACKNOWLEDGMENTS;163
7.13.7;REFERENCES;163
7.14;CHAPTER 31. A NEURAL NETWORK MODEL FOR CONTROL AND STABILIZATION OF REVERBERATING PATTERN SEQUENCES;164
7.14.1;Introduction;164
7.14.2;The model approach of the dynamic recurrent filter;164
7.14.3;Weak context dependent on-line modulation of the processing structure of the recurrent dynamic filter;165
7.14.4;Simulation results;166
7.15;CHAPTER 32. RESULTS OBTAINED WITH THE AUTOGENERATIVE NODAL MEMORY (ANM) MODEL NEURAL NETWORK;168
7.15.1;1. ABSTRACT;168
7.15.2;2. ILLUSORY CONTOURS (THE EFFECT);168
7.15.3;3. THE IMPLEMENTATION WITH ANM;169
7.15.4;4. ANM EXPERIMENT RESULT;170
7.15.5;5. CONCLUSIONS;171
7.15.6;6. REFERENCES;171
8;Part 4: Physics Connection;172
8.1;CHAPTER 33. LEARNING TOPOLOGICAL MAPPINGS FOR SKELETAL REPRESENTATION;174
8.1.1;1. INTRODUCTION: TOPOLOGICAL MAPPINGS;174
8.1.2;2. LEARNING TOPOLOGICAL MAPPINGS FOR SKELETAL REPRESENTATION;175
8.1.3;3. PATH PLANNING;175
8.1.4;4. SIMULATION RESULTS;175
8.1.5;5. CONCLUSION;175
8.1.6;ACKNOWLEDGEMENTS;177
8.1.7;REFERENCES;177
8.2;CHAPTER 34. Basins of Attraction in Neural Network Models Trained with External Fields;178
8.2.1;Introduction;178
8.2.2;The Model;178
8.2.3;Fixed-Points of the Dynamics;179
8.2.4;Training Field Only;180
8.2.5;Retrieval Field Only;180
8.2.6;Equal Training and Retrieval Fields;180
8.2.7;Concluding Remarks;181
8.2.8;Acknowledgements;181
8.2.9;References;181
9;Part 5: Neural Network Architectures and Algorithms I;182
9.1;CHAPTER 35. FAST LEARNING ALGORITHMS FOR NEURAL NETWORKS;184
9.1.1;1. Introduction;184
9.1.2;2. A Generalized Criterion for the Training of Neural Networks;184
9.1.3;3. Fast Learning Algorithms for Single-Layered Neural Networks;184
9.1.4;4. Fast Learning Algorithms for Multi-layered Neural Networks;185
9.1.5;5. Experimental Results;186
9.1.6;6. Conclusions;187
9.1.7;REFERENCES;187
9.2;CHAPTER 36. A NEURAL NETWORK ALGORITHM FOR GRAPH MATCHING;188
9.2.1;References;191
9.3;CHAPTER 37. Recurrence with Delayed Links in Multilayer Networks for Processing Sequential Data;192
9.3.1;1 Introduction;192
9.3.2;2 Approaches to sequential data processing;193
9.3.3;3 Networks described with delayed links;193
9.3.4;4 The learning algorithm;194
9.3.5;5 Simulation experiments;194
9.3.6;6 Discussion;195
9.3.7;7 References;195
9.4;CHAPTER 38. DYNAMICALLY CAPACITY ALLOCATING NETWORK MODELS FOR CONTINUOUS LEARNING;196
9.4.1;1. INTRODUCTION;196
9.4.2;2. LEARNING ALGORITHM;197
9.4.3;3. EXAMPLE;199
9.4.4;4. CONCLUSIONS;199
9.4.5;REFERENCES;199
9.5;CHAPTER 39. TOWARDS OPTIMAL ARCHITECTURES FOR LOGICAL NEURAL NETS;200
9.5.1;1. Introduction;200
9.5.2;2. The PLN Model;201
9.5.3;3. Training and Construction;201
9.5.4;4. Computer Simulations;202
9.5.5;5. Conclusions;203
9.5.6;Acknowledgements;203
9.5.7;References;203
9.6;CHAPTER 40. PREDICTION AND GENERALIZATION IN LOGICAL NEURAL NETS;204
9.6.1;I. Introduction;204
9.6.2;II. Probability Transfer Model of G-RAM;204
9.6.3;III. Prediction and Generalization;205
9.6.4;IV. Computer Simulations;206
9.6.5;V. Conclusions;207
9.6.6;Acknowledgements;207
9.6.7;References;207
9.7;CHAPTER 41. FUSION-TECHNOLOGY AND THE DESIGN OF EVOLUTIONARY MACHINES FOR NEURAL NETWORKS;208
9.7.1;Motivation and Background;208
9.7.2;Central Features of an Evolutionary Machine;209
9.7.3;Recombination;210
9.7.4;Summary;211
9.7.5;References;211
9.8;CHAPTER 42. Synaptic Growth As A Learning Model;212
9.8.1;1. INTRODUCTION;212
9.8.2;2. SGN NETWORK BASICS;213
9.8.3;3. SGN MODELS I, II AND III;213
9.8.4;4. CONCLUSION;215
9.8.5;REFERENCES;215
9.9;CHAPTER 43. LEARNING AND GENERALIZATION IN ADAPTIVE LOGIC NETWORKS;216
9.9.1;1 Introduction;216
9.9.2;2 Learning and generalization in logic networks;216
9.9.3;3 Dealing with continuous inputs;218
9.9.4;4 What advantage is there to using logic networks?;218
9.9.5;5 Conclusions;219
9.9.6;6 References;219
9.10;CHAPTER 44. A COMPARISON BETWEEN REAL AND COMPLEX VALUED NEURAL NETWORKS IN COMMUNICATION APPLICATIONS;220
9.10.1;Abstract;220
9.10.2;1. Introduction;220
9.10.3;2. Complex-Valued Multi-Layer Perceptron;221
9.10.4;3. Experimental Results;222
9.10.5;4. References;222
9.11;CHAPTER 45. SDNN: AN O(1) PARALLEL PROCESSING WITH STRICTLY DIGITAL NEURAL NETWORKS FOR COMBINATORIAL OPTIMIZATION IN LARGE SCALE N-QUEEN PROBLEM;224
9.11.1;1. Introduction;224
9.11.2;2. The Systematic Design of Hopfield Neural Networks using the "k-out-of-n" Design Rule;224
9.11.3;3. Computation Model of Strictly Digital Neural Network;225
9.11.4;4. Performance evaluation of SDNN in Large-Scale Problems;226
9.11.5;References;227
9.12;CHAPTER 46. TOLERANCE OF A BINARY ASSOCIATIVE MEMORY TOWARDS STUCK-AT-FAULTS;238
9.12.1;1. Introduction;238
9.12.2;2. The Associative Matrix Concept;238
9.12.3;3. Stuck-at-1;239
9.12.4;4. Stuck-at-0;240
9.12.5;5. Conclusion;241
9.12.6;Acknowledgements;241
9.12.7;Literature;241
9.13;CHAPTER 47. MULTI-FONT CHINESE CHARACTER RECOGNITION WITH ASSOCIATIVE MEMORY NETWORK;242
9.13.1;Abstract;242
9.13.2;1 INTRODUCTION;242
9.13.3;2 THE MECHANISM OF ASSOCIATIVE MEMORY;242
9.13.4;3 INNER CODE SELECTION;243
9.13.5;4 EXPERIMENTS AND RESULTS;244
9.13.6;5 CONCLUSIONS;245
9.13.7;ACKNOWLEDGEMENTS;245
9.13.8;REFERENCES;245
9.14;CHAPTER 48. IMAGE RECOGNITION IN HYPERCOLUMNAR SCALE SPACE BY SPARSELY CODED ASSOCIATIVE MEMORY;246
9.14.1;1 Introduction;246
9.14.2;2 A Dynamic Approach to Hypercolumnar Interactions;246
9.14.3;3 Recognition by Sparsely Coded Associative Memory;247
9.14.4;4 Scale Space Searching for Translational Invariance;248
9.14.5;5 Implementation and Results;249
9.14.6;References;249
9.15;CHAPTER 49. DESIGN IMPROVEMENTS IN ASSOCIATIVE MEMORIES FOR CEREBELLAR MODEL ARTICULATION CONTROLLERS (CMAC);250
9.15.1;Abstract;250
9.15.2;1. INTRODUCTION;250
9.15.3;2. THE RECEPTIVE CENTER PLACEMENT PROBLEM;251
9.15.4;3. EXPERIMENTAL EVALUATION OF RECEPTIVE FIELD SHAPES;252
9.15.5;4. USE OF SUPERSPHERES;252
9.15.6;5. SPEEDING UP CONVERGENCE OF THE LEARNING ALGORITHM;253
9.15.7;6. REFERENCES;253
9.16;CHAPTER 50. Implementing a "Sense of Time" via Entropy in Associative Memories;254
9.16.1;Abstract;254
9.16.2;Introduction;254
9.16.3;Acknowledgements;255
9.16.4;The Temporal Model;255
9.16.5;Obtaining the Desired Properties;256
9.16.6;Implementations;257
9.16.7;Further Work;257
9.16.8;Summary;257
9.16.9;Bibliography;257
9.17;CHAPTER 51. Paging Associative Memories;258
9.17.1;1 Introduction;258
9.17.2;2 The "Paging" viewpoint;258
9.17.3;3 The "Serial Processing" viewpoint;259
9.17.4;4 The "Merged" viewpoint;259
9.17.5;5 The Context-Sensitive Paradigm;260
9.17.6;6 Conclusion;261
9.18;CHAPTER 52. On Finding Approximate Solutions to Hard Problems by Neural Networks;262
9.18.1;Abstract;262
9.18.2;1 Introduction;262
9.18.3;2 The Neural Network Model;262
9.18.4;3 Preliminaries;263
9.18.5;4 Finding Approximate Solutions by Neural Networks;264
9.18.6;5 Concluding Remarks;265
9.18.7;References;265
10;Part 6: Artificial Associative Memories;228
10.1;CHAPTER 53. STABILITY RESULTS OF A CLASS OF CONTINUOUS ASSOCIATIVE MEMORIES WITH HIGH-CAPACITY;230
10.2;1. Introduction;230
10.3;2. Continuous Recurrent Correlation Associative Memories;230
10.4;3. Stability Properties of the CRCAM;232
10.5;References;233
10.6;CHAPTER 54. ON THE RETRIEVAL IN HOPFIELD NETS WITH A FINITE RANGE OF CONNECTIONS;234
10.7;1. INTRODUCTION;234
10.8;2. NUMERICAL TREATMENT AND RESULTS;235
10.9;REFERENCES;237
11;Part 7: Robotics and Control;266
11.1;CHAPTER 55. MULTI-LAYER PERCEPTRON LEARNING FOR DESIGN PROBLEM SOLVING;268
11.1.1;1. A TWO-LAYER NEURAL NETWORK MODEL;268
11.1.2;2. APPLICATION;270
11.1.3;ACKNOWLEDGEMENT;271
11.1.4;REFERENCE;271
11.2;CHAPTER 56. PROCESS ERROR DETECTION USING SELF-ORGANIZING FEATURE MAPS;272
11.2.1;1 Introduction;272
11.2.2;2 Experimental Set-up;272
11.2.3;3 Monitoring the Process with Self-organizing Feature Maps;273
11.2.4;4 Results;274
11.2.5;5 Conclusion and Further Work;275
11.2.6;References;275
11.3;CHAPTER 57. Using Inverse Perspective Mapping as a Basis for two Concurrent Obstacle Avoidance Schemes;276
11.3.1;1 INTRODUCTION;276
11.3.2;2 MONOCULAR APPROACH: OBSTACLE DETECTION BY CORRELATION-BASED OPTICAL FLOW ANALYSIS;277
11.3.3;3 STEREOPTICAL APPROACH: COMPARING AN IMAGE PAIR WITHIN A COMMON VIEW BY SIMPLE CORRELATION;278
11.3.4;4 CONCLUSIONS;279
11.3.5;References;279
11.4;CHAPTER 58. A BIOLOGICALLY MOTIVATED SYSTEM TO TRACK MOVING OBJECTS BY ACTIVE CAMERA CONTROL;280
11.4.1;1 Introduction and Motivation;280
11.4.2;2 Biological Model;280
11.4.3;3 The Behavioural Module "Tracking";281
11.4.4;4 Experimental Results;282
11.4.5;5 Discussion and Future Work;282
11.4.6;References;283
11.5;CHAPTER 59. NEURALLY INSPIRED ASSOCIATIVE MEMORIES FOR LEARNING CONTROL. A COMPARISON;284
11.5.1;1 Introduction;284
11.5.2;2 Classical Methods;284
11.5.3;3 Neurally Inspired Methods;285
11.5.4;4 Comparison of AMS and a Backpropagation Network;286
11.5.5;5 Conclusion;287
11.5.6;Acknowledgements;287
11.5.7;References;287
11.6;CHAPTER 60. FUZZY ASSOCIATIVE MEMORY APPLICATION TO A PLANT MODELING;288
11.6.1;1. INTRODUCTION;288
11.6.2;2. FUZZY COGNITIVE MODEL CONFIGURATION;288
11.6.3;3. APPLICATION;290
11.6.4;4. CONCLUSION;291
11.6.5;REFERENCES;291
11.7;CHAPTER 61. FUZZY ASSOCIATIVE MEMORY APPLICATIONS TO CONTROL;292
11.7.1;1. INTRODUCTION;292
11.7.2;2. FUZZY ASSOCIATIVE MEMORY SYSTEM CONSTRUCTION;292
11.7.3;3. APPLICATIONS;294
11.7.4;4. CONCLUSION;295
11.7.5;REFERENCES;295
11.8;CHAPTER 62. A FAST EDGE DETECTION METHOD FOR NAVIGATION;296
11.8.1;1. Introduction;296
11.8.2;2. Detection and classification of road edges;296
11.8.3;3. Results;298
11.8.4;References:;299
11.9;CHAPTER 63. QUADRUPEDAL WALKING USING TRAINED AND UNTRAINED NEURAL MODELS;300
11.9.1;1 Introduction;300
11.9.2;3 The hybrid search / learning system;301
11.9.3;5 Extension of the trials;303
11.9.4;References;303
11.10;CHAPTER 65. THE BLIND NEURAL NETWORK MAKER: CAN WE USE CONSTRAINED EMBRYOLOGIES TO DESIGN ANIMAT NERVOUS SYSTEMS?;304
11.10.1;1. INTRODUCTION;304
11.10.2;2. SYMMETRY;305
11.10.3;3. SEGMENTATION;306
11.10.4;4. RECURSION;307
11.10.5;ACKNOWLEDGMENTS;307
11.10.6;REFERENCES;307
11.11;CHAPTER 66. Path Finding with Nonlinear Waves;308
11.11.1;Introduction;308
11.11.2;Wave propagation in a network of oscillators;309
11.11.3;Path finding with nonlinear waves;309
11.11.4;Discussion;311
11.11.5;References;311
11.12;CHAPTER 67. LIZZY: The Genetic Programming of an Artificial Nervous System (Hugo de Garis);312
11.12.1;Abstract;312
11.12.2;1. Introduction;312
11.12.3;2. Genetic Programming of GenNets;313
11.12.4;3. Building an Artificial Nervous System: The LIZZY Project;313
11.12.5;4. The LIZZY Circuit;314
11.13;CHAPTER 68. GENETICALLY PROGRAMMED NEURAL NETWORK FOR SOLVING POLE-BALANCING PROBLEM;316
11.13.1;1. INTRODUCTION;316
11.13.2;2. PROBLEM;316
11.13.3;3. METHOD;317
11.13.4;4. RESULTS;318
11.13.5;5. COMMENT;319
11.13.6;REFERENCES;319
11.14;CHAPTER 69. NEURAL NET BASED CONTROL OF THE HEATING PROCESS;320
11.14.1;1. Introduction;320
11.14.2;2. One-step-ahead Controller;320
11.14.3;3. Long Range Predictive Controller (LRPC);321
11.14.4;4. Experimental Setup;322
11.14.5;5. Experiments;322
11.14.6;7. Conclusion;323
11.14.7;REFERENCES;323
11.15;CHAPTER 70. A NEURAL IMPLEMENTATION OF ANALOGIC PLANNING METHODS;324
11.15.1;1. Analogic Planning;324
11.15.2;2. Neural implementation;325
11.15.3;References;326
11.16;CHAPTER 71. USE OF CMAC NEURAL NETWORKS IN REINFORCEMENT SELF-LEARNING CONTROL;328
11.16.1;ABSTRACT;328
11.16.2;I. INTRODUCTION;328
11.16.3;II. BOX-BASED ADAPTIVE CRITIC LEARNING;328
11.16.4;III. CMAC-BASED ADAPTIVE CRITIC LEARNING;329
11.16.5;IV. SIMULATION RESULTS AND CONCLUSION;330
11.16.6;REFERENCES;331
12;Part 8: Self-Organization and Vector Quantization;332
12.1;CHAPTER 72. A SELF-ORGANIZING ALGORITHM FOR THE COMPUTATIONAL LOAD BALANCE OF A CONCURRENT COMPUTER;334
12.1.1;1. INTRODUCTION;334
12.1.2;2. THE MULTIPROCESSOR LOAD BALANCING PROBLEM;334
12.1.3;3. THE MULTIPROCESSOR LOAD BALANCING ALGORITHM;335
12.1.4;4. THE TASK PRESENTATION ORDER;336
12.1.5;5. SIMULATION RESULTS;337
12.1.6;6. CONCLUSIONS;337
12.1.7;7. REFERENCES;337
12.2;CHAPTER 73. AN UNSUPERVISED HYPERSPHERIC MULTI-LAYER FEEDFORWARD NEURAL NETWORK CLASSIFIER;338
12.2.1;ABSTRACT;338
12.2.2;I. INTRODUCTION;338
12.2.3;II. THE CASE OF STRONGLY SEPARABLE CATEGORIES;339
12.2.4;III. THE CASE OF NON-SEPARABLE CATEGORIES;341
12.2.5;IV. SIMULATION RESULTS;341
12.2.6;V. SUMMARY;342
12.2.7;Acknowledgements;342
12.2.8;References;342
12.3;CHAPTER 74. A NOVEL FEATURE MAP ARCHITECTURE FOR THE REPRESENTATION OF CLASSIFIER CONDITION SETS;344
12.3.1;Abstract;344
12.3.2;1. Introduction;344
12.3.3;2. Basic Architecture;344
12.3.4;3. HLS Feature Maps;344
12.3.5;4. Taet Regime;345
12.3.6;5. Conclusions;346
12.3.7;6. References;346
12.4;CHAPTER 75. SELF ORGANIZING FEATURE MAPS FOR CONTOUR DETECTION IN VIDEOPHONE IMAGES;348
12.4.1;1. INTRODUCTION;348
12.4.2;2. PROBLEM DESCRIPTION;348
12.4.3;3. IMAGE PRE-PROCESSING;349
12.4.4;4. PREVIOUS APPROACHES TO CONTOUR DETECTION;350
12.4.5;5. CONTOUR DETECTION WITH SELF-ORGANIZING MAPS;350
12.4.6;6. EXPERIMENTAL ACTIVITY;350
12.4.7;7. CONCLUSIONS;351
12.4.8;REFERENCES;351
12.5;CHAPTER 76. SELF-ORGANIZING FEATURE MAPS FOR APPRAISAL OF LAND VALUE OF SHORE PARCELS;352
12.5.1;1. Introduction;352
12.5.2;2. Self-organizing feature maps;352
12.5.3;3. Land parcels on the shores of lakes;353
12.5.4;4. Conceptual structure of the shore parcels;354
12.5.5;5. Conclusions;355
12.5.6;References;355
12.6;CHAPTER 77. FINITE ELEMENT MESHING USING KOHONEN'S SELF-ORGANIZING MAPS;356
12.6.1;1. INTRODUCTION;356
12.6.2;2. PRESENTATION OF FINITE ELEMENT MESHING;356
12.6.3;3. NEURAL MESHING;357
12.6.4;4. CONCLUSION;360
12.6.5;REFERENCES;360
12.7;CHAPTER 78. HENAMnet: HOMOGENEOUS ENCODING FOR APPROXIMATION OF MAPPINGS;362
12.7.1;1 Introduction;362
12.7.2;2 Encoding Procedure;362
12.7.3;3 Decoding Procedure;363
12.7.4;4 Connecting different subnets;364
12.7.5;5 Learning;364
12.7.6;6 Simulations;365
12.7.7;7 Conclusions;365
12.7.8;References;365
12.8;CHAPTER 79. SELF-ORGANISING FEATURE MAPS FOR CURSIVE SCRIPT RECOGNITION;366
12.8.1;1. Introduction;366
12.8.2;2. Self-Organising Allographic Networks;366
12.8.3;3. Recognition;367
12.8.4;References;367
12.9;CHAPTER 80. ENHANCED MAPPING: AN EXTENSION OF TOPOLOGICAL MAPPING TO FORM INTERNAL REPRESENTATIONS AND SPATIAL MAPPINGS;370
12.9.1;1/ Introduction;370
12.9.2;2/ Dynamics (Quantitative). Projection Artifacts and Distribution Artifacts;371
12.9.3;3/ Topological Mapping;373
12.9.4;4/ A cascade of two Enhanced Maps;373
12.9.5;5/ Conclusions;375
12.9.6;Acknowledgement;375
12.9.7;References;375
12.10;CHAPTER 81. DVQ: DYNAMIC VECTOR QUANTIZATION - AN INCREMENTAL LVQ;376
12.10.1;1- INTRODUCTION;376
12.10.2;2- SYNTHETIC DATA;376
12.10.3;3- SPEECH DATABASE AND FEATURE EXTRACTION;377
12.10.4;4- CLASSIFIERS;377
12.10.5;5- EXPERIMENTS AND RESULTS;378
12.10.6;6- CONCLUSION;379
12.10.7;References;379
12.11;CHAPTER 82. MONITORING OF INPUT SIGNALS SUBSPACE LOCATION IN SENSORY SPACE BY NEURONET INNER LAYER NEURONS THRESHOLD VALUE ADAPTATION;380
12.11.1;1. SINGLE CORRESPONDENCE REQUIREMENT;380
12.11.2;2. THRESHOLD NEURON MODEL;380
12.11.3;3. ANSWER AREAS IN SENSORY SPACE;381
12.11.4;4. PARALLEL SHIFT OF BOUNDARIES;381
12.11.5;5. INPUT SIGNAL SUBSPACE;382
12.11.6;6. THRESHOLDS ADAPTATION PROCESS;382
12.11.7;7. CONCLUSIONS;383
12.11.8;References:;383
12.12;CHAPTER 83. UNSUPERVISED CLUSTERING OF PROTEINS;384
12.12.1;REFERENCES;387
12.13;CHAPTER 84. An Approach to the Application of Dedicated Neural Network Hardware for Real Time Image Compression;388
12.13.1;1 Introduction;388
12.13.2;2 The binary associative memory;389
12.13.3;3 The Kohonen Feature Map;389
12.13.4;4 The Simulation System;390
12.13.5;5 Simulation Results;390
12.13.6;6 Conclusion and Future Work;391
12.13.7;Acknowledgement;391
12.13.8;References;391
12.14;CHAPTER 85. A SELF-ORGANIZING UPDATING NETWORK;392
12.14.1;1. Introduction;392
12.14.2;2. Scheduling by Edge Reversal (SER);392
12.14.3;3. The Theory of Neuronal Group Selection (TNGS);393
12.14.4;4. An Updating Network;394
12.14.5;5. Conclusions;395
12.14.6;References;395
12.15;CHAPTER 86. A LEARNING ALGORITHM WITH MULTIPLE CRITERIA FOR SELF-ORGANIZING FEATURE MAPS;396
12.15.1;1. INTRODUCTION;396
12.15.2;2. ALGORITHM DEVELOPMENT;396
12.15.3;3. EXPERIMENTS AND RESULTS;398
12.15.4;4. CONCLUSION;398
12.15.5;ACKNOWLEDGEMENT;398
12.15.6;REFERENCES;398
12.16;CHAPTER 87. THE HYPERMAP ARCHITECTURE;400
12.16.1;1. Introduction;400
12.16.2;2. The two-phase recognition algorithm;401
12.16.3;3. Application example: recognition of phonemes;403
12.16.4;4. Conclusions;403
12.16.5;References;403
13;Part 9: Neural Knowledge Data Bases and Non-Rule-Based Decision Making;404
13.1;CHAPTER 88. PERFORMANCE EVALUATION OF EXTENDED BACKPROPAGATION RULE FOR GENERATING NETWORKS OF CONNECTIONIST EXPERT SYSTEMS;406
13.1.1;1. INTRODUCTION;406
13.1.2;2. EXTENDED BACKPROPAGATION RULE;406
13.1.3;3. PERFORMANCE EVALUATION OF EXTENDED BACKPROPAGATION WITH DIFFERENT PARAMETERS;407
13.1.4;4. DISCUSSION AND ANALYSIS OF RESULTS;408
13.1.5;REFERENCES;409
13.2;CHAPTER 89. Concept Randomness and Neural Networks;410
13.2.1;1 Introduction;410
13.2.2;2 Sparse, continuous random problems;411
13.2.3;3 Probabilistic approximations for recognition of SRC concepts;412
13.2.4;4 Random neural networks (RNN);412
13.2.5;References;413
13.3;CHAPTER 90. THE EFFECT OF LOW-LEVEL AIR POLLUTION AND WEATHER ON ASTHMA AND CHRONIC BRONCHITIS PATIENTS, STUDIED BY NEURAL NETWORK METHODS;414
13.3.1;1. PROBLEM AND APPROACH;414
13.3.2;TRAINING OF THE NETWORK;415
13.3.3;RESULTS AND DISCUSSION;416
13.3.4;REFERENCES;417
13.4;CHAPTER 91. A NEURAL NETWORK THAT LEARNS TO DO HYPHENATION;418
13.4.1;Introduction;418
13.4.2;1. The Network Architecture;418
13.4.3;2. Choosing Word Bases;419
13.4.4;3. Learning to Hyphenate the Training Words;419
13.4.5;4. Testing the Network with Unknown Words;420
13.4.6;5. Effects of the Hidden Layer Size;421
13.4.7;6. Conclusions;421
13.4.8;References;421
13.5;CHAPTER 92. PRIME NUMBERS: A WAY TO DISTRIBUTE SYMBOLIC KNOWLEDGE OVER NEURAL NETS;422
13.5.1;1. INTRODUCTION;422
13.5.2;2. PRIME NUMBERS, PROPOSITIONAL LOGIC AND CONNECTIONISM;422
13.5.3;3. MULTIPLE SITE NEURONS FOR DISTRIBUTING SYMBOLIC KNOWLEDGE;424
13.5.4;4. A DISTRIBUTED STRUCTURE FOR A CONNECTIONIST PRODUCTION SYSTEM;424
13.5.5;5. CONCLUSION;425
13.5.6;REFERENCES;425
14;Part 10: Biological and Physiological Connection;426
14.1;CHAPTER 93. PATTERN RECOGNITION OF HOARSE AND HEALTHY VOICES BY THE SELF-ORGANIZING MAP;428
14.1.1;1. INTRODUCTION;428
14.1.2;2. METHODS;428
14.1.3;3. RESULTS;429
14.1.4;4. DISCUSSION;431
14.1.5;ACKNOWLEDGEMENT;431
14.1.6;REFERENCES;431
14.2;CHAPTER 94. LAYERED SELF-ADAPTIVE NEURAL NETWORK APPROACH TO EARLY VISUAL INFORMATION PROCESSING;432
14.2.1;1. INTRODUCTION;432
14.2.2;2. FRAMEWORK;432
14.2.3;3. RESULTS;433
14.2.4;4. DISCUSSION;435
14.2.5;ACKNOWLEDGEMENTS;435
14.2.6;REFERENCES;435
14.3;CHAPTER 95. A NEURAL NETWORK FOR VISUAL MOTION DETECTION THAT CAN EXPLAIN PSYCHOPHYSICAL AND PHYSIOLOGICAL PHENOMENA;436
14.3.1;INTRODUCTION;436
14.3.2;STRUCTURE OF THE MODEL;436
14.3.3;BEHAVIOR OF THE MODEL;437
14.3.4;SIMULATION;438
14.3.5;REFERENCES;439
14.4;CHAPTER 96. A HIGH DEGREE OF NOISE TOLERANCE IN HUMAN VISUAL FLOW DISCRIMINATION;440
14.4.1;1. INTRODUCTION;440
14.4.2;2. METHODS;441
14.4.3;3. RESULTS AND DISCUSSIONS;442
14.5;CHAPTER 97. RESPONSE OF DIRECTIONALLY SELECTIVE CELLS OF THE MACAQUE DORSAL MST AREA TO VISUAL FLOW WITH DIRECTIONAL NOISE AND ITS RELATION TO THE NOISE TOLERANCE IN HUMAN VISUAL FLOW DISCRIMINATION;444
14.5.1;1. INTRODUCTION;444
14.5.2;2. METHODS;445
14.5.3;3. RESULTS AND DISCUSSIONS;445
14.5.4;REFERENCES;446
14.6;CHAPTER 98. A DYNAMIC INDUCTION PROCESS FOR LONG-TERM POTENTIATION IN HIPPOCAMPUS STUDIED BY TEMPORAL PATTERN STIMULATION;448
14.6.1;Methods;448
14.6.2;Results and Discussion;449
14.7;CHAPTER 99. INFORMATION PROCESSING AND LEARNING IN THE OLFACTORY BULB;452
14.7.1;1. Introduction;452
14.7.2;2. Neurobiological background;453
14.7.3;3. The model;453
14.7.4;4. Simulation results;454
14.7.5;5. Discussions;455
14.7.6;Acknowledgement;455
14.7.7;References;455
14.8;CHAPTER 100. PATCH-CLAMP STUDIES OF CULTURED OLFACTORY BULB NEURONS;458
14.8.1;1. INTRODUCTION;458
14.8.2;2. MATERIALS AND METHODS;459
14.8.3;3. RESULTS;459
14.8.4;4. CONCLUSIONS;461
14.8.5;ACKNOWLEDGEMENTS;461
14.8.6;REFERENCES;461
14.9;CHAPTER 101. STIMULUS-INDUCED NEURAL SYNCHRONIZATIONS;462
14.9.1;References;465
14.10;A FORMALIZATION OF NEISSER'S MODEL;466
14.10.1;1. INTRODUCTION;466
14.10.2;2. THE NEISSER'S MODEL;466
14.10.3;3. THE MODEL;467
14.10.4;ACKNOWLEDGEMENTS;469
14.10.5;REFERENCES;469
14.11;CHAPTER 102. MODEL EQUATIONS FOR PASSIVE DENDRITIC INTEGRATION IN SPATIALLY COMPLEX NEURONS;470
14.11.1;1. INTRODUCTION;470
14.11.2;2. THE NONUNIFORM EQUIVALENT CABLE MODEL;470
14.11.3;3. ANALYSIS OF THE NONUNIFORM CABLE EQUATION;472
14.11.4;4. BRANCHING CONDITION FOR DENDRITIC TREES;473
14.11.5;REFERENCES;473
14.12;CHAPTER 103. EXAMPLES OF REALISTIC NEURAL NETWORK SIMULATIONS;474
14.12.1;Abstract;474
14.12.2;Introduction;474
14.12.3;Simulations;474
14.12.4;Conclusions and Future Research;476
14.12.5;References;477
14.13;CHAPTER 104. CRITICAL DEPENDENCE OF NEURAL NETWORKS PROCESSING ON BETWEEN NEURON DELAYS;478
14.13.1;Abstract:;478
14.13.2;1. INTRODUCTION:;478
14.13.3;2. THEORETICAL CONSIDERATIONS:;478
14.13.4;3. METHODS:;479
14.13.5;4. RESULTS:;479
14.13.6;5. DISCUSSION:;480
14.13.7;6. REFERENCES:;481
14.14;CHAPTER 105. TOOLS FOR BIOLOGICALLY REALISTIC SIMULATIONS OF NEURAL NETWORKS;482
14.14.1;Abstract;482
14.14.2;Introduction;482
14.14.3;The Model Neuron;483
14.14.4;The Specification Language;484
14.14.5;Conclusions and Future Work;484
14.14.6;Acknowledgements;485
14.14.7;References;485
14.15;Chapter 106. Learning Boolean Functions with an Artificial Neural Network based on an Extended Drive-Reinforcement Model;486
14.15.1;1 Introduction;486
14.15.2;2 Network models of classical conditioning;486
14.15.3;3 The extended drive-reinforcement model;487
14.15.4;4 The network description;488
14.15.5;5 Learning the OR, AND, and XOR functions;489
14.15.6;6 Conclusion;489
14.15.7;References;489
14.16;CHAPTER 107. FUNCTIONAL POINTS OF CONTROL IN BIOLOGICAL NEURAL NETWORKS;490
14.16.1;Abstract;490
14.16.2;0 Introduction;490
14.16.3;1 Node Factors;490
14.16.4;2 Node Factor Functions;491
14.16.5;3 Node State;492
14.16.6;4 Network Composition;493
14.16.7;Conclusion and Future Work;493
14.16.8;References;493
14.17;CHAPTER 108. USING PARAMETRIC CONTROLLED STRUCTURED NETWORK TO APPROACH NEURAL NETWORKS TO NEUROSCIENCES;494
14.17.1;Abstract;494
14.17.2;Neural Networks and Neurosciences;494
14.17.3;PARAMETRIC CONTROLLED STRUCTURED NETWORKS;495
14.17.4;References;497
15;Part 11: Software Development;498
15.1;CHAPTER 109. THE HEBB RULE IMPLEMENTATION IN A UNIVERSAL NEURAL NET;500
15.1.1;1. INTRODUCTION;500
15.1.2;2. THE NEURAL NETWORK;500
15.1.3;3. THE HEBBIAN RULE;501
15.1.4;4. THE CONDITIONS TO IMPLEMENT THE HEBBIAN RULE;502
15.1.5;REFERENCES;503
15.2;CHAPTER 110. NEMESYS - NEURAL MODELLING SYSTEM FOR WEIGHTLESS NETS;504
15.2.1;Abstract;504
15.2.2;1. Introduction;504
15.2.3;2. Design methodology;504
15.2.4;3. Conclusion;506
15.2.5;References;506
15.3;CHAPTER 111. EXPERIMENTS WITH PARALLEL BACKPROPAGATION ON A HYPERCUBE PARALLEL PROCESSOR SYSTEM;508
15.3.1;1 INTRODUCTION;508
15.3.2;2 PARALLEL IMPLEMENTATION OF BACKPROPAGATION;508
15.3.3;3 EXPERIMENTAL RESULTS;509
15.3.4;4 CONCLUSIONS;511
15.3.5;5 ACKNOWLEDGEMENTS;511
15.4;Chapter 112. Amalgamating the Neural and Logic Computing Paradigms;512
15.4.1;1. INTRODUCTION;512
15.4.2;2. A LOGICAL MODEL OR INTERPRETATION FOR NEURAL NETS;513
15.4.3;3. EMULATING NEURAL NETS IN CONCURRENT PROLOG;514
15.4.4;5. SUMMARY AND DISCUSSION;515
15.4.5;REFERENCES;515
15.5;CHAPTER 113. Implementation of the SANS algorithm on the CM2;516
15.5.1;Abstract;516
15.5.2;Introduction;516
15.5.3;Description of the SANS algorithm;516
15.5.4;The test problem;517
15.5.5;The actual implementations;517
15.5.6;Conclusion;519
15.5.7;Acknowledgements;519
15.5.8;References;519
15.6;CHAPTER 114. BIOSIM—A PROGRAM FOR BIOLOGICALLY REALISTIC NEURAL NETWORK SIMULATIONS ON THE CONNECTION MACHINE;520
15.6.1;Abstract;520
15.6.2;Introduction;520
15.6.3;Implementation;521
15.6.4;Size and speed;522
15.6.5;Discussion;523
15.6.6;Acknowledgments;523
15.6.7;References;523
15.7;CHAPTER 115. PLANNET: A NEW NEURAL NET SIMULATOR;524
15.7.1;Abstract;524
15.7.2;1. Neural nets;524
15.7.3;2. PLATO;525
15.7.4;3. A piecewise linear model of a Hopfield neuron;525
15.7.5;4. A new neural net simulator;526
15.7.6;5. Results;526
15.7.7;6. Conclusions;527
15.7.8;7. References;527
15.8;CHAPTER 116. FUNCTIONAL SPECIFICATION OF A NEURAL NETWORK;528
15.8.1;Abstract;528
15.8.2;Introduction;528
15.8.3;Functional Specification of a Back-Propagation Network;528
15.8.4;Network Structure;529
15.8.5;Network Behaviour;529
15.8.6;Discussion;530
15.8.7;Future Work;531
15.8.8;References;531
15.9;CHAPTER 117. FULLY CONNECTED NEURAL NETWORKS: SIMULATION ON MASSIVELY PARALLEL COMPUTERS;532
15.9.1;1. INTRODUCTION;532
15.9.2;2. PARALLEL SOLUTION AND PARALLEL SIMULATORS;533
15.9.3;4. CONCLUSIONS AND FUTURE WORK;535
15.9.4;REFERENCES;535
15.10;CHAPTER 118. PARALLEL LEARNING STRATEGIES ON THE EMMA-2 MULTIPROCESSOR;536
15.10.1;ABSTRACT;536
15.10.2;1. INTRODUCTION;536
15.10.3;2. MAPPING STRATEGIES ON EMMA-2;537
15.10.4;3. EFFICIENCY OF THE MLP IMPLEMENTATION;538
15.10.5;4. EXPERIMENTAL RESULTS AND DISCUSSION;539
15.10.6;References;539
15.11;CHAPTER 119. EFFECTIVE NEURAL NETWORK MODELING IN C;540
15.11.1;1. INTRODUCTION;540
15.11.2;2. SOFTWARE DESIGN;540
15.11.3;3. OBJECTS AND METHODS;541
15.11.4;4. TABLE DISPATCH;543
15.11.5;5. CONCLUSION;543
16;Part 12: Neural Network Architectures and Algorithms II;544
16.1;CHAPTER 120. SPEEDING-UP BACKPROPAGATION BY DATA ORTHONORMALIZATION;546
16.1.1;1. Introduction;546
16.1.2;2. A Distributed Decorrelation Algorithm;547
16.1.3;3. Performance Evaluation;547
16.1.4;4. Conclusions;549
16.1.5;References;549
16.2;CHAPTER 121. DETERMINING WEIGHTS OF THE HOPFIELD NEURAL NETWORKS;550
16.2.1;1. INTRODUCTION;550
16.2.2;2. THE HOPFIELD MODEL;550
16.2.3;3. DETERMINATION OF WEIGHTS;551
16.2.4;4. TRAVELING SALESMAN PROBLEM;551
16.2.5;5. NUMERICAL CALCULATIONS;552
16.2.6;6. DISCUSSIONS OF RESULTS;552
16.2.7;REFERENCES;553
16.3;CHAPTER 122. RECURRENT AND FEEDFORWARD BACKPROPAGATION: PERFORMANCE STUDIES;554
16.3.1;1 Introduction;554
16.3.2;2 The Underlying Theory;554
16.3.3;3 Architecture and Topology of the unified description;555
16.3.4;4 Performance Comparisons;555
16.3.5;5 Acknowledgements;557
16.3.6;References;557
16.4;CHAPTER 123. THE NORMALIZED BACKPROPAGATION AND SOME EXPERIMENTS ON SPEECH RECOGNITION;558
16.4.1;ABSTRACT;558
16.4.2;1. INTRODUCTION;558
16.4.3;2. DESCRIPTION OF THE NORMALIZED BACKPROPAGATION ALGORITHM;559
16.4.4;3. DESCRIPTION OF THE EXPERIMENT;560
16.4.5;4. BRIEF DESCRIPTION OF THE ALGORITHMS;560
16.4.6;5. RESULTS OF THE EXPERIMENTS;560
16.4.7;6. CONCLUSIONS;561
16.4.8;REFERENCES;561
16.5;Chapter 124. An optimum weights initialization for improving scaling relationships in BP learning;562
16.5.1;1 Brief description of the AMBP algorithm;562
16.5.2;2 Computer simulation;563
16.5.3;References;565
17;Part 13: Hardware Implementations;566
17.1;CHAPTER 125. LIQUID CRYSTAL OPTICAL RECTIFICATION FOR SIMULATION OF VISUAL EXCITATION AND POTENTIAL APPLICATION TO PATTERN RECOGNITION;568
17.1.1;1. INTRODUCTION;568
17.1.2;2. PHYSICAL MECHANISM OF LC-SVE;568
17.1.3;3. APPLICATION TO PATTERN RECOGNITION;570
17.1.4;4. CONCLUSION;571
17.1.5;ACKNOWLEDGMENT;571
17.1.6;REFERENCES;571
17.2;CHAPTER 126. OPTICAL TAG NEURAL NETWORKS FOR LARGE-SCALE IMPLEMENTATION;572
17.2.1;1. INTRODUCTION;572
17.2.2;2. TAG NEURAL NETWORK MODEL;572
17.2.3;3. OPTICAL IMPLEMENTATION;573
17.2.4;4. CONCLUSION;574
17.2.5;Acknowledgement:;574
17.2.6;References;575
17.3;CHAPTER 127. PHOTOREFRACTIVE CRYSTAL WAVEGUIDES FOR OPTICAL NEURAL NETWORKS;576
17.3.1;I. INTRODUCTION;576
17.3.2;II. APPLICATION OF PCW ARRAY TO OPTICAL NEURAL NETWORKS;576
17.3.3;III. HOLOGRAPHIC STORAGE AND RECONSTRUCTION IN PCW;577
17.3.4;IV. WAVEGUIDE PHASE CONJUGATE MIRROR;579
17.3.5;V. CONCLUSION;579
17.3.6;REFERENCES;579
17.4;CHAPTER 128. TRANSPUTER IMPLEMENTATIONS OF NEURAL NETWORKS: AN ANALYSIS;580
17.4.1;1. INTRODUCTION;580
17.4.2;2. PERFORMANCE ANALYSIS;580
17.4.3;3. CONCLUSION;583
17.4.4;References;583
17.5;CHAPTER 128. SELF-ORGANIZING LOGIC USING SIMULATED ANNEALING;584
17.5.1;1. Introduction;584
17.5.2;2. Architecture;585
17.5.3;3. Teaching;586
17.5.4;4. Simulated Annealing;586
17.5.5;5. Realization;587
17.5.6;6. Conclusions;587
17.5.7;References:;587
17.6;CHAPTER 129. A CLASSIFIER CIRCUIT BASED ON AN EXPONENTIAL-CAPACITY ASSOCIATIVE MEMORY;588
17.6.1;1. Introduction to the ECAM;588
17.6.2;2. The New Classifier Chip;589
17.6.3;3. Conclusions;591
17.6.4;Acknowledgement:;591
17.6.5;References;591
17.7;CHAPTER 130. A METHOD FOR DESIGNING SYSTOLIC ARCHITECTURES FOR MODELLING SPATIOTEMPORAL PROPERTIES OF NEURONS USING DOMAIN DECOMPOSITION;592
17.7.1;1 INTRODUCTION;592
17.7.2;2 MODELING SPATIO-TEMPORAL PROPERTIES OF NEURONS;592
17.7.3;3 A SYSTOLIC METHOD FOR SOLVING A FINITE DIFFERENCE APPROXIMATION OF THE CABLE EQUATION;593
17.7.4;4 DESIGNING ARRAY ARCHITECTURES;593
17.7.5;5 SIMULATION EXPERIMENTS;594
17.7.6;6 DISCUSSION;595
17.7.7;REFERENCES;595
17.8;Chapter 131. A Coded Block Adaptive Neural Network Structure for pattern recognition VLSI;596
17.8.1;Abstract;596
17.8.2;Summary;596
17.8.3;Acknowledgments;597
17.8.4;References;597
17.9;CHAPTER 132. FAULT TOLERANCE IN ANALOG NEURAL NETWORKS;600
17.9.1;1. INTRODUCTION;600
17.9.2;2. LOGIC AND ANALOGIC MODELS OF FAILURES;600
17.9.3;3. REDUNDANCY AND FAULT TOLERANCE;601
17.9.4;4. CONCLUSION;602
17.9.5;5. REFERENCES;602
17.10;CHAPTER 133. DIGITAL VLSI ARCHITECTURE OF BACKPROPAGATION ALGORITHM WITH ON-CHIP LEARNING;604
17.10.1;1. INTRODUCTION;604
17.10.2;2. BASIC STRUCTURES;605
17.10.3;3. ARCHITECTURE AND IMPLEMENTATION;606
17.10.4;4. CONCLUSIONS;607
17.10.5;REFERENCES;607
17.11;CHAPTER 134. A DIGITAL SIGNAL PROCESSOR FOR SIMULATING BACK-PROPAGATION NETWORKS;608
17.11.1;INTRODUCTION;608
17.11.2;DATA PRECISION;609
17.11.3;THE ARCHITECTURE OF THE BP-DSP;610
17.11.4;THE PIPELINED EXECUTION OF ALGORITHMS;610
17.11.5;RESULTS;611
17.11.6;REFERENCES;611
17.12;CHAPTER 135. VLSI-IMPLEMENTATION OF A PULSE-DENSITY-MODULATED NEURAL NETWORK FOR PC-CONTROLLED COMPUTING ENVIRONMENT;612
17.12.1;1. INTRODUCTION;612
17.12.2;2. CONCEPT OF PULSE-DENSITY-MODULATION;612
17.12.3;3. VLSI IMPLEMENTATION;613
17.12.4;4. THE COMPLETE NEURAL PROCESSING SYSTEM;614
17.12.5;5. CONCLUSIONS;615
17.12.6;REFERENCES;615
17.13;CHAPTER 136. A SERIAL-UPDATE VLSI ARCHITECTURE FOR THE LEARNING PROBABILISTIC RAM NEURON;616
17.13.1;1. Introduction;616
17.13.2;2. The pRAM model;616
17.13.3;3. Supervised Learning;617
17.13.4;4. Reinforcement Training;617
17.13.5;5. Hardware Learning;618
17.13.6;6. Serial Architecture;618
17.13.7;7. Expansion beyond 128 neurons;619
17.13.8;8. Conclusion;619
17.13.9;ACKNOWLEDGEMENTS;619
17.13.10;REFERENCES;619
17.14;CHAPTER 137. ANALOGUE CIRCUIT FOR ELECTRONIC NEURAL NETWORK;620
17.14.1;I. INTRODUCTION;620
17.14.2;II. FULLY PROGRAMMABLE ANALOGUE SYNAPSES;620
17.14.3;III. CONCLUSION/DISCUSSION;621
17.14.4;REFERENCES;622
17.15;CHAPTER 138. VLSI-IMPLEMENTATION OF A PROGRAMMABLE DUAL COMPUTING CELLULAR NEURAL NETWORK PROCESSOR;624
17.15.1;1. INTRODUCTION;624
17.15.2;2. A PROGRAMMABLE ANALOG CNN CORE PROCESSOR;624
17.15.3;3. A CONTROL-CHIP FOR ANALOG ARRAY PROCESSORS AND PROGRAMMABLE CELLULAR NEURAL NETWORKS;625
17.15.4;4. CONCLUSION;626
17.15.5;REFERENCES;627
18;Part 14: Pattern Recognition and Signal Processing II;628
18.1;CHAPTER 139. FREQUENCY DIFFERENCE SPECTRA AND THEIR USE IN ACOUSTIC PROCESSING;630
18.1.1;Abstract;630
18.1.2;1. INTRODUCTION;630
18.1.3;2. THEORETICAL FRAME;630
18.1.4;CONCLUSIONS;633
18.1.5;BIBLIOGRAPHY;633
18.2;CHAPTER 140. TIME-DEPENDENT SELF-ORGANIZING MAPS FOR SPEECH RECOGNITION;634
18.2.1;1. INTRODUCTION;634
18.2.2;2. TIME-DEPENDENT SELF-ORGANIZING MAPS;634
18.2.3;3. SPEECH RECOGNITION SYSTEM SETUP;635
18.2.4;4. SPEECH RECOGNITION ACCURACY;636
18.2.5;5. CONCLUSIONS;636
18.2.6;References;637
18.3;CHAPTER 141. RECURRENT NEURAL NETWORKS AS PHONEME SPOTTERS;638
18.3.1;1. INTRODUCTION;638
18.3.2;2. GOALS OF THE EXPERIMENT;638
18.3.3;3. PREPROCESSING AND NETWORK STRUCTURE;639
18.3.4;4. EXPERIMENTS AND RESULTS;640
18.3.5;5. CONCLUSIONS;641
18.3.6;References;641
18.4;CHAPTER 142. PHONEME CLASSIFICATION BASED ON THE 2-DIMENSIONAL FORMANT DETECTION;642
18.4.1;1. INTRODUCTION;642
18.4.2;2. GAUSSIAN KERNEL;642
18.4.3;3. PHONEME CLASSIFICATION PROCESSOR;643
18.4.4;4. EXPERIMENTS;645
18.4.5;5. REMARKS;645
18.4.6;REFERENCES;646
18.5;CHAPTER 143. A SEQUENTIAL NEURAL NETWORK MODEL FOR LEXICAL DECODING IN CONTINUOUS SPEECH RECOGNITION;648
18.5.1;1. INTRODUCTION;648
18.5.2;2. ELMAN'S MODEL;649
18.5.3;3. ADAPTATION OF ELMAN'S MODEL TO THE LEXICAL DECODING PROBLEM;649
18.5.4;4. EXPERIMENTAL RESULTS ON THREE SIMILAR PATTERNS;649
18.5.5;5. EXPERIMENTAL RESULTS ON A SIMPLE SENTENCE;650
18.5.6;6. CONCLUSION;651
18.5.7;REFERENCES;651
18.6;CHAPTER 144. RECOGNITION OF CORRUPTED LEXICAL PATTERNS;652
18.6.1;1. INTRODUCTION;652
18.6.2;2. VOCABULARY AND KINDS OF ERROR;652
18.6.3;3. ARCHITECTURE AND TRAINING OF THE NETWORK;653
18.6.4;4. INPUT AND OUTPUT REPRESENTATIONS;653
18.6.5;5. LEARNING PARAMETERS;654
18.6.6;6. EXPERIMENTS AND RESULTS;654
18.6.7;7. CONCLUSIONS;655
18.6.8;REFERENCES;655
18.7;CHAPTER 145. PHONEME RECOGNITION USING ARTIFICIAL NEURAL NETWORKS;656
18.7.1;1 INTRODUCTION;656
18.7.2;2 SPEECH MATERIAL AND DESCRIPTION OF THE USED NETWORKS;656
18.7.3;3 RESULTS;657
18.7.4;4 CONCLUSIONS;659
18.7.5;ACKNOWLEDGEMENTS;659
18.7.6;REFERENCES;659
18.8;CHAPTER 146. CLASSIFICATION OF FLOW PATTERNS IN TWO PHASE FLOW BY NEURAL NETWORK;660
18.8.1;1. INTRODUCTION;660
18.8.2;2. DEFINITION OF THE PROBLEM;660
18.8.3;3. NEURAL NETWORK;661
18.8.4;4. HYBRID ALGORITHM;662
18.8.5;5. IDENTIFICATION OF FLOWS;662
18.8.6;REFERENCES;663
18.9;CHAPTER 147. EMG DIAGNOSIS USING THE CONJUGATE GRADIENT BACKPROPAGATION NEURAL NETWORK LEARNING ALGORITHM;664
18.9.1;ABSTRACT;664
18.9.2;INTRODUCTION;664
18.9.3;THE CONJUGATE GRADIENT LEARNING ALGORITHM;664
18.9.4;RESULTS AND DISCUSSION;666
18.9.5;REFERENCES;667
18.10;CHAPTER 148. GENETICS-BASED-MACHINE-LEARNING IN CLINICAL ELECTROMYOGRAPHY;668
18.10.1;ABSTRACT;668
18.10.2;INTRODUCTION;668
18.10.3;METHODS;668
18.10.4;RESULTS AND DISCUSSION;670
18.10.5;REFERENCES;671
18.11;CHAPTER 149. SELECTION OF CHARACTERISTIC PATTERNS IN MAGNETIC AND ELECTRIC ENCEPHALOGRAMS USING A NEURAL NETWORK;672
18.11.1;1. INTRODUCTION;672
18.11.2;2. EXPERIMENTAL PROCEDURES;672
18.11.3;3. CLASSIFICATION SCHEME;673
18.11.4;4. RESULTS;675
18.12;CHAPTER 150. A METHOD OF SPATIAL SPECTRUM ESTIMATION USING NEURAL NETWORKS;676
18.12.1;1 INTRODUCTION;676
18.12.2;2 MODEL OF ARRAY SIGNALS;676
18.12.3;3 COMPUTE THE EIGENVALUE / EIGENVECTOR BY APEX;677
18.12.4;4 DIRECTION FINDING;677
18.12.5;5 DETERMINE THE NUMBER OF SOURCES;678
18.12.6;6 CONCLUSION;679
18.12.7;Reference;679
18.13;CHAPTER 151. FREQUENCY ESTIMATION BY A HEBBIAN SUBSPACE LEARNING ALGORITHM;680
18.13.1;Abstract;680
18.13.2;1 Introduction;680
18.13.3;2 The learning algorithm;680
18.13.4;3 Stability considerations;681
18.13.5;4 Sinusoidal frequency estimation;682
18.13.6;5 Concluding remarks;683
18.13.7;References;683
18.14;CHAPTER 152. NEURAL NETWORKS FOR PROSODY CONTROL IN SPEECH SYNTHESIS;684
18.14.1;1. INTRODUCTION;684
18.14.2;2. TRADITIONAL SPEECH SYNTHESIS BY RULE;684
18.14.3;3. NEURAL NETS FOR THE GENERATION OF CONTROL PARAMETERS;685
18.14.4;4. PHONEME DURATIONS BY NEURAL NETS;685
18.14.5;5. INTONATION AND STRESS CONTROL BY NEURAL NETS;687
18.14.6;6. SUMMARY AND CONCLUSIONS;687
18.14.7;REFERENCES;687
18.15;CHAPTER 153. A SIMPLE LOOK-UP PROCEDURE SUPERIOR TO NETTALK?;688
18.15.1;1 INTRODUCTION;688
18.15.2;2 A SIMPLE LOOK-UP PROCEDURE;689
18.15.3;3 THE PERFORMANCE OF THE LOOK-UP PROCEDURE;690
18.15.4;4 THE PERFORMANCE OF NETTALK;690
18.15.5;5 EVALUATION AND CONCLUSION;691
18.15.6;ACKNOWLEDGEMENTS;691
18.15.7;REFERENCES;691
18.16;CHAPTER 154. COMPARISON AND COOPERATION OF SEVERAL CLASSIFIERS;692
18.16.1;1 INTRODUCTION;692
18.16.2;2 A MULTI-SPEAKER ISOLATED WORD RECOGNITION PROBLEM;692
18.16.3;3 COMPARISON OF SEVERAL METHODS;693
18.16.4;4 A SIMPLE SUB-OPTIMAL COOPERATION METHOD: FEATURE EXTRACTION;694
18.16.5;5 OPTIMAL COOPERATION WITH DYNAMIC PROGRAMMING;695
18.16.6;6 CONCLUSION;696
18.16.7;ACKNOWLEDGEMENT;696
18.16.8;REFERENCES;696
18.17;CHAPTER 155. NEURAL NETS AND TASK DECOMPOSITION;698
18.17.1;1. INTRODUCTION;698
18.17.2;2. DECOMPOSITION OF THE TASK AND INTRODUCTION OF MODULARITY IN NEURAL NETS;698
18.17.3;3. RESULTS;700
18.17.4;4. CONCLUSION;701
18.17.5;REFERENCES;701
19;Part 15: Commercial and Industrial Applications;702
19.1;CHAPTER 156. CONSTRUCTION OF CONTIGUOUS DNA SEQUENCES FROM N-TUPLE CONTENT USING GENERALIZING RAM NEURON MODEL;704
19.1.1;Abstract;704
19.1.2;1. Introduction;704
19.1.3;2. The G-RAM;704
19.1.4;4. Simulation and Results;707
19.1.5;5. Conclusion;707
19.1.6;REFERENCES;707
19.2;CHAPTER 157. A HYBRID NEURAL NET/KNOWLEDGE BASED APPROACH TO EEG ANALYSIS;708
19.2.1;1. INTRODUCTION:;708
19.2.2;2. DESCRIPTION OF THE HYBRID SYSTEM;708
19.2.3;3. RESULTS:;710
19.2.4;4. CONCLUSION;711
19.2.5;REFERENCES:;711
19.3;CHAPTER 158. TRAFFIC MONITORING WITH WISARD AND PROBABILISTIC LOGIC NODES;712
19.3.1;1. INTRODUCTION AND PREVIOUS WORK;712
19.3.2;2. SIMULATION DETAILS;713
19.3.3;3. RESULTS;714
19.3.4;4. DISCUSSION;715
19.3.5;5. REFERENCES;715
19.4;CHAPTER 159. REALTIME ECG DATA COMPRESSION USING DUAL THREE LAYERED NEURAL NETWORKS FOR DIGITAL HOLTER MONITOR;716
19.4.1;ABSTRACT;716
19.4.2;1. Introduction;716
19.4.3;2. Data Compression by Neural Networks;716
19.4.4;3. Evaluation of Performance;718
19.4.5;4. Conclusion;719
19.4.6;REFERENCES;719
19.5;CHAPTER 160. PERFORMANCE EVALUATION OF SELF-ORGANIZING MAP BASED NEURAL EQUALIZERS IN DYNAMIC DISCRETE-SIGNAL DETECTION;720
19.5.1;1. INTRODUCTION;720
19.5.2;2. ADAPTIVE EQUALIZATION BASED ON SELF-ORGANIZING MAPS;720
19.5.3;3. COMBINING LINEAR EQUALIZATION AND SELF-ORGANIZING ADAPTATION;721
19.5.4;4. PERFORMANCE EVALUATION OF THE NEURAL EQUALIZERS;721
19.5.5;5. DISCUSSION;722
19.5.6;ACKNOWLEDGEMENT;723
19.5.7;REFERENCES;723
19.6;Chapter 161. A Hopfield-like structure to extract road boundaries from a road image;724
19.6.1;Abstract;724
19.6.2;1 Introduction;724
19.6.3;2 Extracting Road Boundaries;724
19.6.4;3 Conclusions;727
19.6.5;References;727
19.7;Chapter 162. The Impact of the Learning-Set Size in Handwritten-Digit Recognition;728
19.7.1;1. Introduction;728
19.7.2;2. Statistical Classifiers;729
19.7.3;3. Results;730
19.7.4;4. Conclusion;732
19.7.5;References;732
19.8;CHAPTER 163. A KIND OF GENERALIZED HOPFIELD CONTINUOUS MODEL AND ITS APPLICATION TO THE OPTIMAL DISTRIBUTION OF REACTIVE POWER SOURCES IN POWER SYSTEMS;734
19.8.1;1. INTRODUCTION;734
19.8.2;2. A BRIEF INTRODUCTION TO HCM;734
19.8.3;3. GENERALIZED HOPFIELD CONTINUOUS MODEL (GHCM);735
19.8.4;3. THE OPTIMAL DISTRIBUTION OF REACTIVE POWER SOURCES (ODRPS) IN POWER SYSTEMS;736
19.8.5;4. RESULTS;737
19.8.6;5. CONCLUSIONS;737
19.8.7;REFERENCES;737
19.9;CHAPTER 164. SIGNIFICANT VARIABLE PAIRS WHICH EFFECT NEURAL NETWORK DERIVED PERSONNEL SCREENING SCORES;738
19.9.1;1. INTRODUCTION;738
19.9.2;2. Sensitivity Studies of Paired Variables;738
19.9.3;3. Basic Ideas About Layered Neural Network Models;739
19.9.4;4. Neural Network Architecture;739
19.9.5;5. Data Preparation and Training Procedure;739
19.9.6;6. Computer Experiments with the Trained Network;739
19.9.7;7. Significance Test Used for Identifying Strongly Interacting Pairs of Variables;739
19.9.8;8. Pairs Identified by Linear Classifier with Values in Proxy Variables;740
19.9.9;9. Variables Interacting in Neural Nets Trained on Individual Sales Performance;741
19.9.10;10. Conclusions;741
19.9.11;11. References;741
19.10;CHAPTER 165. APPLICATION OF THE SELF-ORGANISING FEATURE MAP AND LEARNING VECTOR QUANTISATION TO RADAR CLUTTER CLASSIFICATION;742
19.10.1;1. INTRODUCTION;742
19.10.2;2. CLASSIFICATION EXPERIMENTS;742
19.10.3;3. TOPOLOGICAL STRUCTURE OF DATA;743
19.10.4;4. SUMMARY;745
19.10.5;REFERENCES;745
19.11;CHAPTER 166. APPLICATIONS OF NEURAL NETWORKS IN CONTROL TECHNOLOGY;746
19.11.1;1. INTRODUCTION;746
19.11.2;2. WELD CONTROL;746
19.11.3;3. INTELLIGENT NDT;748
19.11.4;4. CONCLUSIONS;749
19.11.5;REFERENCES;749
19.12;CHAPTER 167. FINANCIAL DATA RECOGNITION AND PREDICTION USING NEURAL NETWORKS;752
19.12.1;1 FINANCIAL MARKETS;752
19.12.2;2 THE M.L.P. SOLUTION;753
19.12.3;3 KOHONEN SELF ORGANISING FEATURE MAPS;754
19.12.4;4 KOHONEN AND M.L.P.;755
19.12.5;5 CONCLUSIONS;755
20;Part 16: Neural Models for Cognitive Science and High-Level Brain Functions;756
20.1;Chapter 168. Learning to Classify Natural Language Titles in a Recurrent Connectionist Model;758
20.1.1;Abstract;758
20.1.2;1 Introduction;758
20.1.3;2 Learning Library Titles in a Recurrent Connectionist Network;758
20.1.4;3 Analysis of the Learned Internal Representation;760
20.1.5;4 Conclusion;761
20.1.6;Acknowledgements;761
20.1.7;References;761
20.2;CHAPTER 170. THE REPRESENTATION OF ABSTRACT ENTITIES IN ARTIFICIAL NEURAL NETWORKS;762
20.2.1;1. INTRODUCTION;762
20.2.2;2. A FRAME FOR CONCEPTUAL MODELLING;763
20.2.3;3. THE CAUSAL THEORY OF REFERENCE;763
20.2.4;4. FUNCTANTS AND THEIR ROLE IN CONCEPTUAL PROCESSES;764
20.2.5;References;765
20.3;CHAPTER 171. A NETWORK-MODEL FOR BINOCULAR INTERACTION IN THE THALAMO-CORTICAL FEEDBACK-SYSTEM;766
20.3.1;1. INTRODUCTION;766
20.3.2;2. MODEL;766
20.3.3;2. RESULTS;767
20.3.4;ACKNOWLEDGEMENT;769
20.3.5;References;769
20.4;CHAPTER 172. A NEURAL NET MODEL OF SPELLING DEVELOPMENT;770
20.4.1;Introduction;770
20.4.2;A connectionist model of spelling development;771
20.4.3;Testing the model's predictions: Normal spelling development;772
20.4.4;References;773
20.5;CHAPTER 173. NEURAL NETS FOR INTELLIGENT TUTORING SYSTEMS;774
20.5.1;Abstract;774
20.5.2;1. Introduction: intelligent tutoring systems;774
20.5.3;2. Neural networks for ITS;774
20.5.4;3. Modelling a student who is solving physics problems;775
20.5.5;4. Conclusions;777
20.5.6;Acknowledgments;777
20.5.7;References;777
20.6;CHAPTER 174. AUTONOMOUS CONTROL OF SELECTIVE ATTENTION: SCAN ARCHITECTURES;778
20.6.1;1. Introduction;778
20.6.2;2. Shifter-circuit architecture;778
20.6.3;3. Competitively coupled layers;779
20.6.4;4. Autonomous control of selective attention in SCAN;780
20.6.5;5. Discussion;781
20.6.6;Acknowledgments;781
20.6.7;References;781
20.7;CHAPTER 175. SPATIOTEMPORAL CORRELATION IN THE CEREBELLUM;782
20.7.1;Introduction;782
20.7.2;Temporally organised information;782
20.7.3;Simulations of the mechanism;784
20.7.4;Conclusions;785
20.7.5;Acknowledgments;785
20.7.6;References;785
20.8;CHAPTER 176. HARMONY THEORY NETWORKS FOR SCENE ANALYSIS;786
20.8.1;1. INTRODUCTION - TRADITIONAL A.I. SCENE ANALYSIS;786
20.8.2;2. THE P.D.P. APPROACH TO A.I. CONSTRAINT PROPAGATION;786
20.8.3;3. RESULTS;788
20.8.4;4. CONCLUSIONS;789
20.8.5;REFERENCES;789
20.9;CHAPTER 177. MODELING HEBBIAN CELL ASSEMBLIES COMPRISED OF CORTICAL NEURONS;790
20.9.1;Introduction;790
20.9.2;Cell model and network architecture;791
20.9.3;Simulation results;791
20.9.4;Discussion;792
20.9.5;Conclusions;793
20.9.6;Acknowledgement;793
20.9.7;References;793
20.10;Chapter 178. Recurrent Kohonen Self-Organization in Natural Language Processing;794
20.10.1;1. Introduction;794
20.10.2;2. Formal Description of the Model;795
20.10.3;3. Simulation Results;796
20.10.4;4. Discussion and Conclusions;797
20.10.5;Acknowledgements;797
20.10.6;References;797
20.11;Chapter 179. A Multimodal Model of Cognition - Neural Networks and Cognitive Modeling;798
20.11.1;1 Introduction & Some Methodological Remarks;798
20.11.2;2 The Structure of the Simulation;798
20.11.3;Conclusion;801
20.11.4;References;801
20.12;CHAPTER 180. LABOUR, CONSUMPTION AND FAMILY ASSETS: A NEURAL NETWORK LEARNING FROM ITS OWN CROSS-TARGETS;802
20.12.1;1. INTRODUCTION;802
20.12.2;2. THE ECONOMIC SUBJECT MODEL;802
20.12.3;3. THE LEARNING AND ACTING ALGORITHM;803
20.12.4;4. A CONSISTENT BEHAVIOUR;804
20.12.5;5. DISCOVERING ECONOMIC REGULARITIES;804
20.12.6;6. FUTURE IMPROVEMENTS;805
20.12.7;REFERENCES;805
20.13;CHAPTER 181. NEURAL NETWORKS, GENETIC ALGORITHMS AND STOCK TRADING;806
20.13.1;1. INTRODUCTION;806
20.13.2;2. THE STOCK TRADER NETWORK;806
20.13.3;3. THE ENVIRONMENT;806
20.13.4;4. THE GENETIC ALGORITHM;807
20.13.5;5. THE HIDDEN NEURONS PROBLEM;808
20.13.6;6. FUTURE IMPROVEMENTS AND CONCLUDING REMARKS;809
20.13.7;REFERENCES;809
20.14;CHAPTER 182. ACOUSTIC ILLUSIONS: EXPECTATION DIRECTED FILTERING IN THE HUMAN AUDITORY SYSTEM;810
20.14.1;1. INTRODUCTION;810
20.14.2;2. EXPECTATION DRIVEN PREPROCESSING;811
20.14.3;3. EXPERIMENTAL RESULTS;811
20.14.4;4. SPEECH PROCESSING;812
20.14.5;5. BIOLOGICAL SPEECH RECOGNITION;812
20.14.6;6. CONCLUSION;813
20.14.7;References;813
20.15;Chapter 183. Synchronization of Spikes in Populations of Laterally Coupled Model Neurons;814
20.15.1;1. INTRODUCTION;814
20.15.2;2. THE MODEL NEURON;814
20.15.3;3. SYNCHRONIZATION OF AXONAL IMPULSE GENERATION;815
20.15.4;4. RESULTS;816
20.15.5;5. DISCUSSION;817
20.15.6;REFERENCES;817
21;Part 17: Neural Network Architectures and Algorithms III;818
21.1;CHAPTER 184. CHARACTER RECOGNITION BY A NEURAL NETWORK WITH FUZZY PARTITIONING UNITS;820
21.1.1;ABSTRACT;820
21.1.2;I. INTRODUCTION;820
21.1.3;2. FUZZY PARTITIONING UNIT (FPU);821
21.1.4;3. LAYERED NEURAL NETWORKS AND ERROR FUNCTIONS;821
21.1.5;4. COMPARISONS OF FOUR LEARNING ALGORITHMS;822
21.1.6;5. CONCLUSION;823
21.1.7;REFERENCES;823
21.2;Chapter 185. A Global Minimum Convergence Acceleration Technique Using a k–out–of–n Network Design Rule;824
21.2.1;1 Introduction;824
21.2.2;2 Performance of the Preliminary Model;824
21.2.3;3 Structural Analysis of the Model;824
21.2.4;4 Dual Phase Simulated Annealing Method;825
21.2.5;5 Results;826
21.2.6;References;826
21.3;CHAPTER 186. APPROXIMATION CAPABILITIES OF NEURAL NETWORKS USING SAMPLING FUNCTIONS;828
21.3.1;1 Introduction;828
21.3.2;2 The architecture and learning algorithm;828
21.3.3;3 Summary;831
21.4;CHAPTER 187. ADDITION AND SUBTRACTION IN NEURAL NETS AS RESULTS OF A LEARNING PROCESS;832
21.4.1;1. INTRODUCTION;832
21.4.2;2. THE MODEL;832
21.4.3;3. THE BEHAVIOUR OF AN ADDER NET;833
21.4.4;4. SUBTRACTOR;835
21.4.5;5. CONCLUDING REMARKS;835
21.4.6;ACKNOWLEDGEMENTS;835
21.4.7;REFERENCES;835
21.5;CHAPTER 188. GEOMETRICAL LEARNING IN A NETWORK OF AUTOMATA;836
21.5.1;1. INTRODUCTION;836
21.5.2;2. VORONOI TESSELLATION IN AN N-DIMENSIONAL SPACE;836
21.5.3;3. BUILDING THE NETWORK FROM THE DELAUNAY STRUCTURE;837
21.5.4;4. LEARNING ALGORITHM : BUILDING THE DELAUNAY STRUCTURE;838
21.5.5;5. DISCUSSION;839
21.5.6;REFERENCES;839
21.6;Chapter 189. A Neural Network for Solving Hamiltonian Cycle Problems;840
21.6.1;1 Introduction;840
21.6.2;2 Problem Representation and Transformation Rules;840
21.6.3;3 The minimization of the Energy Function;841
21.6.4;4 Experimental Results;842
21.6.5;References;843
22;Part 18: Late Papers;844
22.1;CHAPTER 189. "SHIFT-TOLERANT" LVQ2-BASED DIGITS RECOGNITION;846
22.1.1;1. THE VOCAL DATABASES;846
22.1.2;2. SYSTEM ARCHITECTURE;847
22.1.3;3. RESULTS;849
22.1.4;ACKNOWLEDGMENTS;850
22.1.5;BIBLIOGRAPHY;850
22.2;CHAPTER 190. ON THE MATHEMATICAL TREATMENT OF SELF-ORGANIZATION: EXTENSION OF SOME CLASSICAL RESULTS;852
22.2.1;1. Introduction: self-organizing basics;852
22.2.2;2. The topologically extended justification of the ordering of the weights;852
23;BIBLIOGRAPHY;855
24;Author Index;856


