Tenne / Goh | Computational Intelligence in Expensive Optimization Problems | E-Book | www2.sack.de
E-Book

E-book, English, Volume 2, 800 pages

Series: Adaptation, Learning, and Optimization

Tenne / Goh Computational Intelligence in Expensive Optimization Problems


1st edition, 2010
ISBN: 978-3-642-10701-6
Publisher: Springer
Format: PDF
Copy protection: PDF watermark




In modern science and engineering, laboratory experiments are increasingly replaced by high-fidelity, computationally expensive simulations. Using such simulations reduces costs and shortens development times, but it also introduces new challenges to the design optimization process. Examples of such challenges include limited computational resources for simulation runs and complicated response surfaces relating the simulation's inputs to its outputs. Under such conditions, classical optimization and analysis methods may perform poorly. This motivates the application of computational intelligence methods such as evolutionary algorithms, neural networks, and fuzzy logic, which often perform well in these settings. This is the first book to introduce the emerging field of computational intelligence in expensive optimization problems. Topics covered include: dedicated implementations of evolutionary algorithms, neural networks, and fuzzy logic; reduction of expensive evaluations (modelling, variable-fidelity, fitness inheritance); frameworks for optimization (model management, complexity control, model selection); parallelization of algorithms (implementation issues on clusters, grids, and parallel machines); incorporation of expert systems and human-system interfaces; single- and multiobjective algorithms; data mining and statistical analysis; and analysis of real-world cases (such as multidisciplinary design optimization). This edited book provides both theoretical treatments and real-world insights gained from experience, all contributed by leading researchers in the respective fields. As such, it is a comprehensive reference for researchers, practitioners, and advanced-level students interested in both the theory and practice of using computational intelligence for expensive optimization problems.




Further Information & Material


1;Preface;7
2;Contents;13
3;Part I Techniques for Resource-Intensive Problems;28
3.1;A Survey of Fitness Approximation Methods Applied in Evolutionary Algorithms;29
3.1.1;Introduction;29
3.1.2;Fitness Approximation Methods;31
3.1.2.1;Instance-Based Learning Methods;31
3.1.2.2;Machine Learning Methods;33
3.1.2.3;Statistical Learning Methods;35
3.1.2.4;Existing Research in Multi-surrogate Assisted EAs;37
3.1.3;Comparative Studies for Different Approximate Models;40
3.1.4;The Working Styles of Fitness Approximation;43
3.1.4.1;Direct Fitness Replacement Methods;43
3.1.4.2;Indirect Fitness Approximation Methods;43
3.1.5;The Management of Fitness Approximation;44
3.1.5.1;Evolution Control;44
3.1.5.2;Offline Model Training;45
3.1.5.3;Online Model Updating;45
3.1.5.4;Hierarchical Approximate Models and Model Migration;46
3.1.6;Case Studies: Two Surrogate-Assisted EA Real-World Applications;46
3.1.6.1;The Welded Beam Design Domain;46
3.1.6.2;Supersonic Aircraft Design Domain;47
3.1.7;Final Remarks;49
3.1.8;References;50
3.2;A Review of Techniques for Handling Expensive Functions in Evolutionary Multi-Objective Optimization;55
3.2.1;Introduction;55
3.2.2;Basic Concepts;56
3.2.2.1;Pareto Dominance;57
3.2.2.2;Pareto Optimality;57
3.2.2.3;Pareto Front;58
3.2.3;Knowledge Incorporation;58
3.2.3.1;Surrogates;59
3.2.3.2;Polynomials: Response Surface Methods (RSM);60
3.2.3.3;Gaussian Process or Kriging;61
3.2.3.4;Radial Basis Functions;62
3.2.3.5;Artificial Neural Networks;63
3.2.3.6;Support Vector Machines;64
3.2.3.7;Clustering;66
3.2.3.8;Fitness Inheritance;67
3.2.4;Real-World Applications;68
3.2.4.1;Use of Problem Approximation;69
3.2.4.2;Use of RSM by Polynomial Approximation;71
3.2.4.3;Use of Artificial Neural Networks;72
3.2.4.4;Use of a Gaussian Process or Kriging;74
3.2.4.5;Use of Clustering;78
3.2.4.6;Use of Radial Basis Functions;78
3.2.5;Conclusions and Future Research Paths;79
3.2.6;References;80
3.3;Multilevel Optimization Algorithms Based on Metamodel- and Fitness Inheritance-Assisted Evolutionary Algorithms;86
3.3.1;Introduction;87
3.3.2;Metamodel–Assisted EAs and Distributed MAEAs;89
3.3.3;Surrogate Evaluation Models for MAEAs;90
3.3.3.1;Fitness Inheritance;90
3.3.3.2;Radial Basis Function (RBF) Networks;91
3.3.4;Assessment of MAEA and DMAEA;92
3.3.5;Multilevel Search Algorithms and the Underlying Hierarchy;93
3.3.5.1;The Three Multilevel Modes – Defining a HDMAEA;93
3.3.5.2;Distributed Hierarchical Search – DHMAEA vs. HDMAEA;95
3.3.6;Assessment of Multilevel–Hierarchical Optimization;96
3.3.7;Optimization of an Annular Cascade;100
3.3.8;Conclusions;104
3.3.9;References;105
3.4;Knowledge-Based Variable-Fidelity Optimization of Expensive Objective Functions through Space Mapping;110
3.4.1;Introduction;110
3.4.2;Space Mapping Optimization;113
3.4.2.1;Formulation of the Space Mapping Algorithm;113
3.4.2.2;Space Mapping Surrogate Models;115
3.4.2.3;Characterization of Space Mapping;116
3.4.2.4;Practical Issues and Open Problems;117
3.4.2.5;Space Mapping Illustration;117
3.4.3;Space Mapping Efficiency;120
3.4.3.1;Example 1: Microstrip Bandpass Filter;120
3.4.3.2;Example 2: Ring Antenna b32;124
3.4.3.3;Discussion;126
3.4.4;Exploiting Extra Knowledge: Tuning Space Mapping;126
3.4.4.1;Tuning Space Mapping Formulation;127
3.4.4.2;TSM Optimization of Chebyshev Bandpass Filter;128
3.4.4.3;Summary;131
3.4.5;Conclusions;131
3.4.6;References;132
3.5;Reducing Function Evaluations Using Adaptively Controlled Differential Evolution with Rough Approximation Model;135
3.5.1;Introduction;135
3.5.2;Optimization and Approximation Models;137
3.5.2.1;Optimization Problems;137
3.5.2.2;Evolutionary Algorithms Using Approximation Models;137
3.5.2.3;Estimated Comparison Method;138
3.5.3;Rough Approximation Model;139
3.5.3.1;Potential Model;139
3.5.3.2;Estimated Comparison;140
3.5.3.3;Adaptive Control;141
3.5.4;Differential Evolution with the Estimated Comparison Method;142
3.5.4.1;Differential Evolution;142
3.5.4.2;Adaptive DE with the Estimated Comparison Method;143
3.5.5;Numerical Experiments;144
3.5.5.1;Test Problems;144
3.5.5.2;Conditions of Experiments;146
3.5.5.3;Experimental Results;146
3.5.6;Discussion;150
3.5.7;Conclusions;152
3.5.8;References;152
3.6;Kriging Is Well-Suited to Parallelize Optimization;154
3.6.1;Introduction;155
3.6.1.1;Motivations: Efficient Optimization Algorithms for Expensive Computer Experiments;155
3.6.1.2;Where Computational Intelligence and Kriging Meet;155
3.6.1.3;Towards Kriging-Based Parallel Optimization: Summary of Obtained Results and Outline of the Chapter;157
3.6.2;Background in Kriging for Sequential Optimization;158
3.6.2.1;The Ordinary Kriging Metamodel and Its Gaussian Process Interpretation;158
3.6.2.2;Kriging-Based Optimization Criteria;160
3.6.3;The Multi-points Expected Improvement (q-EI) Criterion;165
3.6.3.1;Analytical Calculation of 2-EI;166
3.6.3.2;q-EI Computation by Monte Carlo Simulations;170
3.6.4;Approximated q-EI Maximization;172
3.6.4.1;A First Greedy Strategy to Build a q-Points Design with the 1-Point EI;173
3.6.4.2;The Kriging Believer (KB) and Constant Liar (CL) Strategies;173
3.6.4.3;Empirical Comparisons with the Branin-Hoo Function;174
3.6.5;Towards Kriging-Based Parallel Optimization: Conclusion and Perspectives;178
3.6.6;Appendix;180
3.6.6.1;Gaussian Processes for Machine Learning;180
3.6.6.2;Conditioning Gaussian Vectors;180
3.6.6.3;Simple Kriging Equations;181
3.6.6.4;Ordinary Kriging Equations;182
3.6.7;References;183
3.7;Analysis of Approximation-Based Memetic Algorithms for Engineering Optimization;186
3.7.1;Memetic Algorithms and Computer-Aided Design;187
3.7.2;Approximation-Based Memetic Algorithms;191
3.7.2.1;Varying Accuracy in Black-Box Functions;193
3.7.2.2;Approximation-Based Local Search;194
3.7.3;Analysis of Memetic Algorithms;198
3.7.3.1;Convergence Analysis;198
3.7.3.2;Computational Cost;203
3.7.4;Numerical Results;205
3.7.4.1;Analytical Problems;205
3.7.4.2;Electromagnetic Benchmark Problem;208
3.7.5;Final Remarks;211
3.7.6;References;212
3.8;Opportunities for Expensive Optimization with Estimation of Distribution Algorithms;215
3.8.1;Introduction;215
3.8.2;Estimation of Distribution Algorithms;217
3.8.2.1;Boltzmann Estimation of Distribution Algorithms;217
3.8.3;Three Opportunities for Enhancing the Efficiency of EDAs;219
3.8.3.1;Right-Sized Selection;219
3.8.3.2;Probabilistic Elitism;220
3.8.3.3;Maximum Entropy;221
3.8.4;Evolutionary Backtracking from Log-Probability Landscapes;222
3.8.4.1;Selecting a Bayesian EDA Algorithm;223
3.8.4.2;Partial Evaluation in Boltzmann EDAs;224
3.8.4.3;A Simple Algorithm for Partial Evaluation with Backtracking;227
3.8.5;Regularization in Estimation of Distribution Algorithms;230
3.8.5.1;Entropic Mutation;230
3.8.5.2;Shrinkage Estimation of Distribution Algorithms;232
3.8.6;Summary and Conclusions;235
3.8.7;Appendix;236
3.8.8;References;238
3.9;On Similarity-Based Surrogate Models for Expensive Single- and Multi-objective Evolutionary Optimization;241
3.9.1;Introduction;241
3.9.2;The Optimization Problem;243
3.9.3;Surrogate-Assisted Evolutionary Optimization;243
3.9.3.1;Similarity-Based Surrogate Models (SBSMs);244
3.9.3.2;Surrogate-Assisted Framework;246
3.9.3.3;The Surrogate-Assisted Evolutionary Algorithm;247
3.9.4;Computational Experiments;249
3.9.4.1;Single-Objective Optimization;251
3.9.4.2;Multi-objective Optimization;257
3.9.5;Concluding Remarks;266
3.9.6;References;267
3.10;Multi-objective Model Predictive Control Using Computational Intelligence;271
3.10.1;Introduction;272
3.10.2;Meta-modeling Using Computational Intelligence;272
3.10.3;Aspiration Level Approach to Interactive Multi-objective Optimization;277
3.10.4;Multi-objective Model Predictive Control;279
3.10.5;Concluding Remarks;285
3.10.6;References;285
3.11;Improving Local Convergence in Particle Swarms by Fitness Approximation Using Regression;287
3.11.1;Introduction;287
3.11.2;Background;289
3.11.2.1;Fitness Approximation;289
3.11.2.2;Particle Swarms;290
3.11.2.3;Speciated Particle Swarms;291
3.11.2.4;Guaranteed Convergence PSO;292
3.11.2.5;mQSO;292
3.11.3;Using Regression to Locate Optima;293
3.11.4;Experimental Setup;296
3.11.4.1;Static Functions;297
3.11.4.2;Moving Peaks;299
3.11.5;Results;303
3.11.5.1;Static Functions;303
3.11.5.2;Moving Peaks;305
3.11.6;Conclusion;313
3.11.7;References;313
4;Part II Techniques for High-Dimensional Problems;316
4.1;Differential Evolution with Scale Factor Local Search for Large Scale Problems;317
4.1.1;Introduction;317
4.1.2;Differential Evolution for Large Scale Problems;319
4.1.2.1;Differential Evolution;319
4.1.2.2;Self-Adaptive Control Parameter Update;321
4.1.2.3;Population Size Reduction;321
4.1.2.4;Scale Factor Local Search;322
4.1.3;Numerical Results;325
4.1.3.1;Results in 100 Dimensions;327
4.1.3.2;Results in 500 Dimensions;330
4.1.3.3;Results in 1000 Dimensions;333
4.1.3.4;Discussion about the Algorithmic Components;336
4.1.4;Conclusion;340
4.1.5;References;341
4.2;Large-Scale Network Optimization with Evolutionary Hybrid Algorithms: Ten Years' Experience with the Electric Power Distribution Industry;344
4.2.1;Introduction;344
4.2.2;Optimization of Electric Power Distribution Networks;346
4.2.3;Evolutionary Approach;349
4.2.3.1;Working within the Feasibility Domain;350
4.2.3.2;Lamarckian Hybridization;354
4.2.4;Application Examples and Illustration;356
4.2.5;Summary;361
4.2.6;References;361
4.3;A Parallel Hybrid Implementation Using Genetic Algorithms, GRASP and Reinforcement Learning for the Traveling Salesman Problem;363
4.3.1;Introduction;363
4.3.2;Theoretical Foundation;365
4.3.2.1;GRASP Metaheuristic;365
4.3.2.2;Genetic Algorithm;365
4.3.2.3;Reinforcement Learning: Q-Learning Algorithm;366
4.3.3;Hybrid Methods Using Metaheuristic and Reinforcement Learning;367
4.3.3.1;GRASP-Learning;368
4.3.3.2;Genetic-Learning;369
4.3.4;Parallel Hybrid Implementation Proposed;370
4.3.4.1;Methodology;370
4.3.5;Experimental Results;372
4.3.5.1;The Traveling Salesman Problem;373
4.3.5.2;Computational Test;374
4.3.6;Conclusions;386
4.3.7;References;386
4.4;An Evolutionary Approach for the TSP and the TSP with Backhauls;388
4.4.1;The Traveling Salesman Problem;389
4.4.1.1;Conventional TSP Heuristics;390
4.4.1.2;Metaheuristics for the TSP;391
4.4.1.3;Evolutionary TSP Algorithms;391
4.4.1.4;The TSP with Backhauls;392
4.4.1.5;Outline;393
4.4.2;The First Evolutionary Algorithm for the TSP;393
4.4.2.1;Generating Offspring from the Union Graph;394
4.4.2.2;Nearest Neighbor Crossover (NNX);394
4.4.2.3;Greedy Crossover (GX);396
4.4.2.4;Combined Use of the Crossover Operators;397
4.4.2.5;Proposed Mutation Operators;397
4.4.2.6;Other Settings of the First Evolutionary Algorithm;398
4.4.2.7;Computational Results for the TSP;400
4.4.3;The Second Evolutionary Algorithm for the TSP and the TSPB;404
4.4.3.1;More Than Two Parents and Multiple Offspring;404
4.4.3.2;Improved Mutation Operators;406
4.4.3.3;Computational Results for the TSP;407
4.4.3.4;Computational Results for the TSPB;408
4.4.4;Conclusion;411
4.4.5;References;412
4.5;Towards Efficient Multi-objective Genetic Takagi-Sugeno Fuzzy Systems for High Dimensional Problems;414
4.5.1;Introduction;414
4.5.2;The Takagi-Sugeno Fuzzy Systems;416
4.5.3;Time Complexity;417
4.5.4;Fast Identification of Consequent Parameters;419
4.5.5;Reuse of Computed Parameters;420
4.5.5.1;Reuse in the Application of Mating Operators;420
4.5.5.2;Speeding Up the Calculus of the Output through Reuse;422
4.5.5.3;Speeding Up the Calculus of the Activation Degrees through Reuse;423
4.5.6;The Used MOEA to Learn TS Rules;423
4.5.7;Experimental Results;424
4.5.7.1;Regression Problem;426
4.5.7.2;Time Series Forecasting Problem;430
4.5.8;Conclusions;437
4.5.9;References;438
4.6;Evolutionary Algorithms for the Multi Criterion Minimum Spanning Tree Problem;440
4.6.1;Introduction;440
4.6.1.1;Problem Definition;441
4.6.1.2;Solution Approaches;442
4.6.2;Knowledge-Based Evolutionary Algorithm (KEA);446
4.6.2.1;Phases of KEA;447
4.6.3;Experimental Results;449
4.6.3.1;Benchmark Data Sets;449
4.6.3.2;Benchmark Algorithms;449
4.6.3.3;Performance Indicators;450
4.6.3.4;Numerical Results;451
4.6.3.5;Experimental Complexity of KEA and NSGA2;454
4.6.3.6;Scalability of KEA;457
4.6.3.7;Tri-Criterion Instances;458
4.6.4;Alternative Algorithms Based on KEA;459
4.6.5;Discussions and Analysis;464
4.6.6;Conclusions;465
4.6.7;References;466
4.7;Loss-Based Estimation with Evolutionary Algorithms and Cross-Validation;470
4.7.1;Introduction;470
4.7.2;Loss-Based Estimation with Cross-Validation;472
4.7.2.1;The Estimation Road Map;472
4.7.2.2;Estimator Selection Procedure;473
4.7.3;The Parameter Space for Polynomial Regression;474
4.7.3.1;Parametrization;474
4.7.3.2;Constraints;475
4.7.3.3;Size of the Parameter Space;476
4.7.4;Evolutionary Algorithms as Risk Optimization Procedures;479
4.7.4.1;Proposed EA;480
4.7.5;Simulation Studies;485
4.7.5.1;Simulation Study Design;485
4.7.5.2;Simulation Study Results;487
4.7.6;Data Analysis for a Diabetes Study;493
4.7.7;Conclusion;498
4.7.8;Appendix;499
4.7.9;References;500
5;Part III Real-World Applications;502
5.1;Particle Swarm Optimisation Aided MIMO Transceiver Designs;503
5.1.1;Introduction;503
5.1.2;Particle Swarm Optimisation;505
5.1.2.1;PSO Algorithm;506
5.1.2.2;Complexity of PSO Algorithm;508
5.1.2.3;Choice of PSO Algorithmic Parameters;508
5.1.3;PSO Aided Semi-blind Joint ML Estimation;509
5.1.3.1;MIMO System Model;509
5.1.3.2;Semi-blind Joint ML Channel Estimation and Data Detection;510
5.1.3.3;PSO Aided Semi-blind Joint ML Scheme;512
5.1.3.4;Simulation Study;513
5.1.4;PSO Based MBER Multiuser Transmitter Design;515
5.1.4.1;Downlink of SDMA Induced MIMO System;516
5.1.4.2;MBER MUT Design;517
5.1.4.3;PSO Aided MBER MUT Design;518
5.1.4.4;Simulation Study;519
5.1.5;Conclusions;524
5.1.6;References;524
5.2;Optimal Design of a Common Rail Diesel Engine Piston;528
5.2.1;Presentation;528
5.2.2;Introduction;529
5.2.3;Automatic Definition of the Computational Mesh;531
5.2.4;Single-objective and Multi-objective Approaches;537
5.2.5;Optimization Algorithms;539
5.2.5.1;Implementations on Computation Platform and Grids;540
5.2.6;Test Case;541
5.2.6.1;Problem Specifications;542
5.2.6.2;Engine Simulation Code;543
5.2.6.3;HIPEGEO;544
5.2.6.4;Required Computational Time;549
5.2.6.5;Analysis of the Results;549
5.2.6.6;Conclusions;552
5.2.7;References;554
5.3;Robust Preliminary Space Mission Design under Uncertainty;557
5.3.1;Introduction;557
5.3.2;Uncertainties in Space Mission Design;559
5.3.3;Modelling Uncertainties through Evidence Theory;560
5.3.3.1;Frame of Discernment U, Power Set 2^U and Basic Probability Assignment;560
5.3.3.2;Belief and Plausibility Functions;562
5.3.3.3;Cumulative Functions: CBF, CCBF, CPF, CCPF;563
5.3.4;Solution of the OUU Problem;565
5.3.4.1;Problem Formulation;565
5.3.4.2;Direct Solution through a Population-Based Genetic Algorithm;567
5.3.4.3;Indirect Solution Approach;569
5.3.5;A Case Study: The BepiColombo Mission;573
5.3.5.1;Spacecraft Mass Model;573
5.3.5.2;The BPA Structure;576
5.3.6;Results and Comparisons;577
5.3.6.1;Direct Solution Simulations;578
5.3.6.2;Indirect Solution Simulations;580
5.3.7;Conclusions;582
5.3.8;References;582
5.4;Progressive Design Methodology for Design of Engineering Systems;585
5.4.1;Introduction;585
5.4.2;Progressive Design Methodology;587
5.4.3;Synthesis Phase of PDM;589
5.4.3.1;System Requirements Analysis;590
5.4.3.2;Definition of System Boundaries;591
5.4.3.3;Determination of Performance Criterion/Criteria;591
5.4.3.4;Selection of Variables and Sensitivity Analysis;592
5.4.3.5;Development of System Model;593
5.4.3.6;Deciding on the Optimization Strategy;594
5.4.4;Intermediate Analysis Phase of PDM;596
5.4.4.1;Identification of New Set of Criteria;598
5.4.4.2;Linguistic Term Set;598
5.4.4.3;Semantic of Linguistic Term Set;599
5.4.4.4;Aggregation Operator for Linguistic Weighted Information;599
5.4.5;Final Analysis Phase of PDM;603
5.4.6;Model Suitable for PDM;603
5.4.7;Synthesis Phase of PDM for Design of a BLDC Motor Drive;604
5.4.7.1;Requirement Analysis;604
5.4.7.2;Definition of System Boundaries;605
5.4.7.3;Determining of Performance Criteria;605
5.4.7.4;Development of System Model;606
5.4.7.5;Optimisation Strategy;608
5.4.7.6;Results of Multiobjective Optimisation;608
5.4.8;Intermediate Analysis Phase of PDM for Design of a BLDC Motor Drive;609
5.4.8.1;Identification of New Set of Objectives;611
5.4.8.2;Linguistic Term Set;612
5.4.8.3;The Semantic of Linguistic Term Set;612
5.4.8.4;Aggregation Operator for Linguistic Weighted Information;613
5.4.8.5;The Screening Process;613
5.4.9;Final Analysis Phase of PDM for Design of a BLDC Motor Drive;614
5.4.9.1;Detailed Simulation Model;614
5.4.9.2;Independent Design Variables and Objectives;614
5.4.9.3;Set of Solutions;616
5.4.10;Conclusions;618
5.4.11;References;619
5.5;Reliable Network Design Using Hybrid Genetic Algorithm Based on Multi-Ring Encoding;622
5.5.1;Reliable Network Design;622
5.5.2;Problem Formulation;624
5.5.3;Network Reliability;625
5.5.3.1;Reliability Metrics;625
5.5.3.2;Reliability Estimation;626
5.5.4;Previous Works;628
5.5.4.1;Genetic Algorithm;628
5.5.4.2;Ant Colony Optimization;628
5.5.4.3;Hybrid Heuristics;629
5.5.5;Solution Representation;630
5.5.5.1;Multi-Ring Encoding;630
5.5.5.2;Contraction Model;631
5.5.6;Hybrid Genetic Algorithm;633
5.5.6.1;Representation and Initialization;634
5.5.6.2;Repair;635
5.5.6.3;Fitness Evaluation;635
5.5.6.4;Parent Selection and Offspring Generation;636
5.5.6.5;Mutation;637
5.5.6.6;Local Search Ant Colony System;638
5.5.6.7;Selection for New Generation;640
5.5.7;Numerical Results;640
5.5.8;References;645
5.6;Isolated Word Analysis Using Biologically-Based Neural Networks;649
5.6.1;Neural Engineering Speech Recognition Using the Genetic Algorithm;649
5.6.2;Input Description;654
5.6.3;Synapse Connectivity: The Dynamic Synapse Neural Network;656
5.6.3.1;DSNN Architecture;658
5.6.3.2;Synapse Functionality;659
5.6.4;Synapse Optimization;662
5.6.4.1;Biased Selection: Word Micro-environment;663
5.6.4.2;Objective Function: Fitness Evaluation;664
5.6.4.3;Score Function Rationale;666
5.6.4.4;Genetic Algorithm: Mutation, Elitism, and Micro-environment;669
5.6.5;Output Description and Analysis;670
5.6.5.1;Integrated Responses;670
5.6.5.2;Dynamic Model Emergence Via Cost-Weighted Classification;673
5.6.5.3;System Output Visualization;676
5.6.6;Discussion;677
5.6.6.1;Sound Processing Via the Dorsal Cochlear Nucleus;679
5.6.7;References;680
5.7;A Distributed Evolutionary Approach to Subtraction Radiography;683
5.7.1;Introduction;683
5.7.2;Background;685
5.7.2.1;The Image Registration Problem;685
5.7.2.2;High Performance Computing;688
5.7.3;A Grid Computing Framework for Medical Imaging;690
5.7.4;Case Study: Automatic Subtraction Radiography Using Distributed Evolutionary Algorithms;692
5.7.4.1;Problem Statement;693
5.7.4.2;Parametric Transformations;693
5.7.4.3;Similarity Measure;694
5.7.4.4;Optimization Problem;696
5.7.4.5;Interpolation Approach;696
5.7.4.6;Search Strategy;696
5.7.4.7;Algorithms Distribution;698
5.7.4.8;Algorithms Validation;699
5.7.4.9;The Subtraction Service;704
5.7.5;Discussion and Conclusions;706
5.7.6;Appendix;707
5.7.7;References;710
5.8;Speeding-Up Expensive Evaluations in High-Level Synthesis Using Solution Modeling and Fitness Inheritance;713
5.8.1;Introduction;713
5.8.2;Related Work;715
5.8.3;Design Space Exploration with Multi-Objective Evolutionary Computation;716
5.8.3.1;Design Space Exploration Core;717
5.8.3.2;Genetic Algorithm Design;718
5.8.3.3;Performance Measure;719
5.8.4;Solution Evaluation;719
5.8.5;Building Cost Models;721
5.8.5.1;Cost Models;722
5.8.5.2;Experimental Evaluation;723
5.8.5.2.1;Accuracy of Models;724
5.8.5.2.2;Performance of the Methodology;725
5.8.6;Fitness Inheritance;726
5.8.6.1;Inheritance Model;727
5.8.6.2;Experimental Evaluation;728
5.8.6.2.1;Weighting Functions;729
5.8.6.2.2;Parameter Analysis;732
5.8.7;Conclusions;733
5.8.8;References;733
6;Index;736


