E-book, English, 770 pages
Series: Wireless Networks
Wu: Cyberspace Mimic Defense
1st edition, 2019
ISBN: 978-3-030-29844-9
Publisher: Springer International Publishing
Format: PDF
Copy protection: PDF Watermark
Generalized Robust Control and Endogenous Security
This book discusses uncertain threats: unknown attacks based on unknown vulnerabilities or backdoors in the software/hardware of information systems and control devices. It presents a generalized robust control architecture and the mimic defense mechanisms built on it, which could change the 'easy-to-attack, hard-to-defend' game in cyberspace. The endogenous uncertainty effects this architecture induces in the target software/hardware produce a 'mimic defense fog' that suppresses, in a normalized mode, both random disturbances caused by physical or logical elements and the non-probabilistic disturbances brought by uncertain security threats.
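The mechanism underlying this architecture, as the book's later chapters elaborate, is dynamic heterogeneous redundancy with multimode ruling: several functionally equivalent but independently built executors process the same input, and a majority ruling masks the output of any single compromised executor. A heavily simplified, hypothetical sketch (all names are illustrative, not taken from the book):

```python
import random

# Hypothetical sketch: three functionally equivalent but heterogeneous
# "executors" serve the same request; a majority-vote ruling masks an
# executor compromised through a vulnerability unique to its own stack.

def executor_a(x):           # e.g. built on one OS/compiler stack
    return x * x

def executor_b(x):           # an independently implemented equivalent
    return x ** 2

def executor_c(x):           # a third variant, here exploited by an attacker
    return x * x + 1         # tampered output from a backdoor unique to it

def mimic_ruling(outputs):
    """Return the strict-majority output; no majority means ruling fails."""
    for o in outputs:
        if outputs.count(o) > len(outputs) // 2:
            return o
    raise RuntimeError("no majority: ruling failed, reschedule executors")

executors = [executor_a, executor_b, executor_c]
random.shuffle(executors)    # dynamic scheduling adds unpredictability
outputs = [e(7) for e in executors]
print(mimic_ruling(outputs))  # 49: the tampered result is outvoted
```

An attack that succeeds on one executor but not on its heterogeneous siblings changes only a minority of outputs, so the ruling suppresses it the same way fault-tolerant voting suppresses a random hardware fault.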
Although progress has been made in current cyberspace defense theories, and various types of security technologies have come into being, their effectiveness often depends on how much prior knowledge the defender has about the attacker, and on the timeliness and accuracy of acquired information about the attacker's behavioral features. Hence, there is no efficient active defense against uncertain security threats from the unknown. Even when bottom-line defense technologies such as encrypted verification are adopted, the security of hardware/software products cannot be quantitatively designed, verified, or measured. Because of the 'loose coupling' between the defender and the protected target, and the perimeter defense mode it implies, there remain insurmountable theoretical and technological challenges in protecting the target against the exploitation of internal vulnerabilities or backdoors, and in dealing with attack scenarios based on backdoor-activated collaboration from both inside and outside, no matter how many protective measures are added or accumulated. It is therefore urgent to break out of the stereotyped thinking of conventional defense theories and technologies, to find new theories and methods that effectively reduce the exploitability of a target's vulnerabilities and backdoors without relying on prior knowledge or feature information, and to develop, from an innovative perspective, new technological means to offset uncertain threats based on unknown vulnerabilities and backdoors.
This book provides a solution, in both theory and engineering implementation, to a difficult problem: how to avoid the uncontrollability of product security caused by globalized markets, COTS components, and non-trustworthy software/hardware sources. It has been proven that this revolutionary enabling technology endows software/hardware products in IT/ICT/CPS with endogenous security functions and overturns attack theories and methods based on hardware/software design defects or resident malicious code.
This book is designed for educators, for theoretical and technological researchers in cyber security and autonomous control, for industry engineers developing a new generation of software/hardware products with endogenous security enabling technologies, and for product users. Postgraduates in IT/ICT/CPS/ICS will discover that, as long as the law that 'structure determines the nature, and architecture determines the security' is properly applied, software/hardware design defects and embedded malicious code will cease to be the Achilles' heel of informatization, and will no longer haunt cyberspace like an open Pandora's box. Security and openness, advancement and controllability may seem contradictory, but theoretically and technologically unified solutions to the problem are possible.
Jiangxing Wu serves as the Director of the China National Digital Switching System Engineering & Technological R&D Center. He was elected a Fellow of the Chinese Academy of Engineering in 2003. A renowned expert in information and communication network switching in China, he served as Vice Director of the communication section and Vice Director of the Expert Board of the information section in the 8th, 9th, 10th, and 11th Five-Year Plans of the China National High-Tech R&D Program (863 Program). He has been General Director of the High-Speed Information Demonstration Network (CAINONet), 3TNet, the Next Generation Broadcasting Network (NGB), and the New Concept High-Efficiency Computer System and Architecture R&D projects. He took charge of the New Generation High-Credibility Network and Flexible Reconfiguration Network projects, served as Director of the Technical Board of Mobile Communication for the National Key Technologies R&D Program, and was First Vice Director of the Expert Board of the National Tri-Network Convergence Committee. Since the 1990s, following the great success of China's first high-capacity digital SPC switching system, he has successively presided over the development of China's first high-speed core router, the world's first massive Access Convergence Router (ACR), and the information communication core infrastructure of the Flexible Reconfiguration Network. In 2013 he launched the first high-efficiency computer prototype based on mimic computing, together with the theory of Cyberspace Mimic Defense, which successfully passed testing and assessment in 2016. He has been awarded the First Prize for National Science and Technological Progress three times and the Second Prize four times, in addition to the First Prize of the National Teaching Achievement Award.
He received the Prize for Scientific and Technological Progress from the Ho Leung Ho Lee Foundation in 1995 and the Prize for Scientific and Technological Achievements from the same foundation in 2015. Wu's team has won the First Prize of the National Science and Technology Progress Award four times and the Second Prize nine times, and was recognized as an Innovation Team of the State Science and Technology Progress Award in 2015.
Table of Contents
1;Preface;6
2;Author’s Profile;13
3;Brief Introduction (Abstract);15
4;Preface;16
5;Acknowledgments;27
6;Contents;29
7;Abbreviations;40
8;Part I;48
8.1;Chapter 1: Security Risks from Vulnerabilities and Backdoors;49
8.1.1;1.1 Harmfulness of Vulnerabilities and Backdoors;49
8.1.1.1;1.1.1 Related Concepts;52
8.1.1.2;1.1.2 Basic Topics of Research;53
8.1.1.2.1;1.1.2.1 Accurate Definition of Vulnerability;53
8.1.1.2.2;1.1.2.2 Reasonable Classification of Vulnerabilities;54
8.1.1.2.3;1.1.2.3 Unpredictability of Vulnerabilities;55
8.1.1.2.4;1.1.2.4 Elimination of Vulnerabilities;56
8.1.1.3;1.1.3 Threats and Impacts;56
8.1.1.3.1;1.1.3.1 Broad Security Threats;56
8.1.2;1.2 Inevitability of Vulnerabilities and Backdoors;62
8.1.2.1;1.2.1 Unavoidable Vulnerabilities and Backdoors;63
8.1.2.1.1;1.2.1.1 The Contradiction Between Complexity and Verifiability;64
8.1.2.1.2;1.2.1.2 Challenges in Supply Chain Management;65
8.1.2.1.3;1.2.1.3 Inadequacy of Current Theories and Engineering Techniques;67
8.1.2.2;1.2.2 Contingency of Vulnerability Emergence;69
8.1.2.2.1;1.2.2.1 Contingent Time of Discovery;69
8.1.2.2.2;1.2.2.2 Contingent Form of Emergence;71
8.1.2.3;1.2.3 The Temporal and Spatial Characteristic of Cognition;72
8.1.2.3.1;1.2.3.1 From Quantitative Changes to Qualitative Changes;72
8.1.2.3.2;1.2.3.2 Absolute and Relative Interdependence and Conversion;73
8.1.2.3.3;1.2.3.3 The Unity of Specificity and Generality;74
8.1.3;1.3 The Challenge of Defense Against Vulnerabilities and Backdoors;75
8.1.3.1;1.3.1 Major Channels for Advanced Persistent Threat (APT) Attacks;75
8.1.3.2;1.3.2 Uncertain Unknown Threats;75
8.1.3.3;1.3.3 Limited Effect of Traditional “Containment and Repair”;77
8.1.3.3.1;1.3.3.1 Reduce the Introduction of Vulnerabilities into Software Development, but Oversights Are Inevitable;77
8.1.3.3.2;1.3.3.2 Discovering Vulnerabilities in the Testing Phase, but New Ones Are Emerging;78
8.1.3.3.3;1.3.3.3 Exploit Mitigation Measures Keep Improving, but the Confrontation Never Stops;79
8.1.3.3.4;1.3.3.4 The Careful Designing of White List Detection Mechanisms for System Protection Fails to Prevent Bypassing from Taking Place from Time to Time;80
8.1.4;1.4 Inspirations and Reflection;80
8.1.4.1;1.4.1 Building a System Based on “Contamination”;81
8.1.4.2;1.4.2 From Component Credibility to Structure Security;81
8.1.4.3;1.4.3 From Reducing Exploitability to Destroying Accessibility;81
8.1.4.4;1.4.4 Transforming the Problematic Scenarios;82
8.1.5;References;83
8.2;Chapter 2: Formal Description of Cyber Attacks;85
8.2.1;2.1 Formal Description Methods of Conventional Cyber Attacks;86
8.2.1.1;2.1.1 Attack Tree;86
8.2.1.2;2.1.2 Attack Graph;88
8.2.1.3;2.1.3 Analysis of Several Attack Models;90
8.2.2;2.2 The AS Theory;91
8.2.2.1;2.2.1 The AS Model;92
8.2.2.2;2.2.2 Defects in the AS Theory;94
8.2.3;2.3 The MAS;95
8.2.3.1;2.3.1 Definition and Nature of the MAS;95
8.2.3.2;2.3.2 MAS Implementation Methods;96
8.2.3.3;2.3.3 Limitations of the MAS;97
8.2.4;2.4 New Methods of Formal Description of Cyber Attacks;98
8.2.4.1;2.4.1 Cyber Attack Process;98
8.2.4.2;2.4.2 Formal Description of the Attack Graph;100
8.2.4.3;2.4.3 Formal Description of an Attack Chain;101
8.2.4.4;2.4.4 Vulnerability Analysis of Cyber Attack Chains;102
8.2.4.4.1;2.4.4.1 Conditions for the Successful Implementation of Atomic Attacks;103
8.2.4.4.2;2.4.4.2 The Conditions on Which the Successfully Completed Attack Chain Depends;106
8.2.5;References;111
8.3;Chapter 3: Conventional Defense Technologies;112
8.3.1;3.1 Static Defense Technology;112
8.3.1.1;3.1.1 Overview of Static Defense Technology;112
8.3.1.2;3.1.2 Analysis of Static Defense Technology;113
8.3.1.2.1;3.1.2.1 Firewall Technology;113
8.3.1.2.2;3.1.2.2 Intrusion Detection Technology;115
8.3.1.2.3;3.1.2.3 Intrusion Prevention Technology;117
8.3.1.2.4;3.1.2.4 Vulnerability Scanning Technology;119
8.3.2;3.2 Honeypot;121
8.3.2.1;3.2.1 Network Intrusion and Malicious Code Detection;122
8.3.2.2;3.2.2 Capturing Samples of Malicious Codes;123
8.3.2.3;3.2.3 Tracking and Analysis of Security Threats;124
8.3.2.4;3.2.4 Extraction of Attack Features;124
8.3.2.5;3.2.5 Limitations of Honeypot;125
8.3.3;3.3 Collaborative Defense;126
8.3.3.1;3.3.1 Collaborative Defense Between Intrusion Detection and Firewall;127
8.3.3.2;3.3.2 Collaborative Defense Between Intrusion Prevention and Firewall Systems;128
8.3.3.3;3.3.3 Collaborative Defense Between the Intrusion Prevention System and Intrusion Detection System;129
8.3.3.4;3.3.4 Collaborative Defense Between Intrusion Prevention and Vulnerability Scanning Systems;130
8.3.3.5;3.3.5 Collaborative Defense Between the Intrusion Prevention System and Honeypot;130
8.3.4;3.4 Intrusion Tolerance Technology;132
8.3.4.1;3.4.1 Technical Principles of Intrusion Tolerance;132
8.3.4.1.1;3.4.1.1 Theoretical Model;133
8.3.4.1.2;3.4.1.2 Mechanisms and Strategies;133
8.3.4.2;3.4.2 Two Typical Intrusion Tolerance Systems;136
8.3.4.2.1;3.4.2.1 Scalable Intrusion-Tolerant Architecture;136
8.3.4.2.2;3.4.2.2 Malicious and Accidental Fault Tolerance for Internet Applications [75];137
8.3.4.3;3.4.3 Comparison of Web Intrusion Tolerance Architectures (Table 3.1);139
8.3.4.4;3.4.4 Differences Between Intrusion Tolerance and Fault Tolerance;140
8.3.5;3.5 Sandbox Acting as an Isolation Defense;142
8.3.5.1;3.5.1 Overview of Sandbox;142
8.3.5.2;3.5.2 Theoretical Principles of Sandbox;144
8.3.5.2.1;3.5.2.1 Application Layer Sandbox;144
8.3.5.2.2;3.5.2.2 Kernel Layer Sandbox;145
8.3.5.2.3;3.5.2.3 Hybrid Sandbox;145
8.3.5.3;3.5.3 Status Quo of Sandbox Defense Technology;145
8.3.6;3.6 Computer Immune Technology;147
8.3.6.1;3.6.1 Overview of Immune Technology;147
8.3.6.2;3.6.2 Artificial Immune System Status;148
8.3.7;3.7 Review of Conventional Defense Methods;151
8.3.8;References;154
8.4;Chapter 4: New Approaches to Cyber Defense;157
8.4.1;4.1 New Developments in Cyber Defense Technologies;157
8.4.2;4.2 Trusted Computing;160
8.4.2.1;4.2.1 Basic Thinking Behind Trusted Computing;160
8.4.2.2;4.2.2 Technological Approaches of Trusted Computing;161
8.4.2.2.1;4.2.2.1 Root of Trust;161
8.4.2.2.2;4.2.2.2 Trust Measurement Model and Chain of Trust;162
8.4.2.2.3;4.2.2.3 Trusted Computing Platform (TCP);164
8.4.2.3;4.2.3 New Developments in Trusted Computing;167
8.4.2.3.1;4.2.3.1 Trusted Computing 3.0;167
8.4.2.3.2;4.2.3.2 Trusted Cloud;169
8.4.2.3.3;4.2.3.3 SGX Architecture;171
8.4.3;4.3 Tailored Trustworthy Spaces;173
8.4.3.1;4.3.1 Preconditions;174
8.4.3.1.1;4.3.1.1 Communication;174
8.4.3.1.2;4.3.1.2 Computing;174
8.4.3.1.3;4.3.1.3 Security;176
8.4.3.1.4;4.3.1.4 Summary;176
8.4.3.2;4.3.2 Tailored Trustworthy Spaces (TTS);177
8.4.3.2.1;4.3.2.1 Features Research;177
8.4.3.2.2;4.3.2.2 Trust Negotiation;178
8.4.3.2.3;4.3.2.3 Set of Operations;178
8.4.3.2.4;4.3.2.4 Privacy;178
8.4.4;4.4 Mobile Target Defense;179
8.4.4.1;4.4.1 MTD Mechanism;180
8.4.4.1.1;4.4.1.1 Randomization;180
8.4.4.1.2;4.4.1.2 Diversification Mechanism;181
8.4.4.1.3;4.4.1.3 Dynamic Mechanism;181
8.4.4.1.4;4.4.1.4 Symbiotic Mechanism;182
8.4.4.2;4.4.2 Roadmap and Challenges of MTD;182
8.4.5;4.5 Blockchain;183
8.4.5.1;4.5.1 Basic Concept;184
8.4.5.2;4.5.2 Core Technologies;185
8.4.5.3;4.5.3 Analysis of Blockchain Security;187
8.4.6;4.6 Zero Trust Security Model;188
8.4.6.1;4.6.1 Basic Concept;189
8.4.6.2;4.6.2 Forrester’s Zero Trust Security Framework;190
8.4.6.3;4.6.3 Google’s Solution;191
8.4.6.3.1;4.6.3.1 Principles of Identifying Security Devices;192
8.4.6.3.2;4.6.3.2 Principles for Identifying Users’ Security;193
8.4.6.3.3;4.6.3.3 Removing Trust from the Network;193
8.4.6.3.4;4.6.3.4 Externalizing Applications and Workflows;193
8.4.6.3.5;4.6.3.5 Implementing Inventory-Based Access Control;194
8.4.7;4.7 Reflections on New Cyber Defense Technologies;194
8.4.8;References;199
8.5;Chapter 5: Analysis on Diversity, Randomness, and Dynamicity;202
8.5.1;5.1 Diversity;203
8.5.1.1;5.1.1 Overview;203
8.5.1.2;5.1.2 Diversity of the Executors;204
8.5.1.2.1;5.1.2.1 Executor Diversity in Network Operating Systems;205
8.5.1.2.2;5.1.2.2 Executor Diversity in the Path;205
8.5.1.3;5.1.3 Diversity of the Execution Space;208
8.5.1.3.1;5.1.3.1 Execution Space Diversity in Network Operating Systems;208
8.5.1.3.2;5.1.3.2 Execution Space Diversity in the Path;211
8.5.1.4;5.1.4 Differences Between Diversity and Pluralism;212
8.5.2;5.2 Randomness;213
8.5.2.1;5.2.1 Overview;213
8.5.2.2;5.2.2 Address Space Randomization;214
8.5.2.3;5.2.3 Instruction System Randomization;216
8.5.2.4;5.2.4 Kernel Data Randomization;218
8.5.2.5;5.2.5 Cost of Introduction;220
8.5.2.5.1;5.2.5.1 Different Software and Hardware Versions Require Different Expert Teams to Design and Maintain;220
8.5.2.5.2;5.2.5.2 The Cost Will Inevitably Increase if a Multi-version Service System Is Constructed;222
8.5.2.5.3;5.2.5.3 Introduction of Diversity Makes Multi-version Synchronized Updating a New Challenge;223
8.5.3;5.3 Dynamicity;224
8.5.3.1;5.3.1 Overview;224
8.5.3.1.1;5.3.1.1 Resource Redundancy Configuration;226
8.5.3.1.2;5.3.1.2 Cost of Randomness;227
8.5.3.1.3;5.3.1.3 Cost of Effectiveness;228
8.5.3.2;5.3.2 Dynamic Defense Technology;228
8.5.3.2.1;5.3.2.1 Dynamic Network;229
8.5.3.2.2;5.3.2.2 Dynamic Platform;231
8.5.3.2.3;5.3.2.3 Dynamic Software;233
8.5.3.2.4;5.3.2.4 Dynamic Data;235
8.5.3.3;5.3.3 Dynamicity Challenges;236
8.5.4;5.4 Case of OS Diversity Analysis;237
8.5.4.1;5.4.1 Statistical Analysis Data Based on the NVD;238
8.5.4.2;5.4.2 Common OS Vulnerabilities;239
8.5.4.3;5.4.3 Conclusions;243
8.5.5;5.5 Chapter Summary;245
8.5.6;References;247
8.6;Chapter 6: Revelation of the Heterogeneous Redundancy Architecture;249
8.6.1;6.1 Introduction;249
8.6.2;6.2 Addressing the Challenge of Uncertain Failures;251
8.6.2.1;6.2.1 Proposal of the Problem;251
8.6.2.2;6.2.2 Enlightenment from TRA;252
8.6.2.3;6.2.3 Formal Description of TRA;254
8.6.3;6.3 The Role of Redundancy and Heterogeneous Redundancy;256
8.6.3.1;6.3.1 Redundancy and Fault Tolerance;256
8.6.3.2;6.3.2 Endogenous Functions and Structural Effects;258
8.6.3.3;6.3.3 Redundancy and Situational Awareness;258
8.6.3.4;6.3.4 From Isomorphism to Heterogeneity;259
8.6.3.4.1;6.3.4.1 Isomorphic Redundancy;259
8.6.3.4.2;6.3.4.2 Heterogeneous Redundancy;260
8.6.3.4.3;6.3.4.3 Appropriate Functional Intersections;261
8.6.3.5;6.3.5 Relationship Between Fault Tolerance and Intrusion Tolerance;262
8.6.4;6.4 Voting and Ruling;263
8.6.4.1;6.4.1 Majority Voting and Consensus Mechanism;263
8.6.4.2;6.4.2 Multimode Ruling;264
8.6.5;6.5 Dissimilar Redundancy Structure;265
8.6.5.1;6.5.1 Analysis of the Intrusion Tolerance Properties of the DRS;269
8.6.5.2;6.5.2 Summary of the Endogenous Security Effects of the DRS;273
8.6.5.3;6.5.3 Hierarchical Effect of Heterogeneous Redundancy;274
8.6.5.4;6.5.4 Systematic Fingerprint and Tunnel-Through;276
8.6.5.5;6.5.5 Robust Control and General Uncertain Disturbances;277
8.6.6;6.6 Anti-attack Modeling;281
8.6.6.1;6.6.1 The GSPN Model;282
8.6.6.2;6.6.2 Anti-attack Considerations;283
8.6.6.3;6.6.3 Anti-attack Modeling;286
8.6.7;6.7 Anti-aggression Analysis;288
8.6.7.1;6.7.1 Anti-general Attack Analysis;288
8.6.7.1.1;6.7.1.1 Non-redundant System;288
8.6.7.1.2;6.7.1.2 Dissimilar Redundant System;291
8.6.7.2;6.7.2 Anti-special Attack Analysis;300
8.6.7.2.1;6.7.2.1 Non-redundant System;300
8.6.7.2.2;6.7.2.2 Dissimilar Redundant System;301
8.6.7.3;6.7.3 Summary of the Anti-attack Analysis;306
8.6.8;6.8 Conclusion;308
8.6.8.1;6.8.1 Conditional Awareness of Uncertain Threats;308
8.6.8.2;6.8.2 New Connotations of General Robust Control;308
8.6.8.3;6.8.3 DRS Intrusion Tolerance Defect;309
8.6.8.4;6.8.4 DRS Transformation Proposals;311
8.6.9;References;313
8.7;Chapter 7: DHR Architecture;314
8.7.1;7.1 Dynamic Heterogeneous Redundant Architecture;315
8.7.1.1;7.1.1 Basic Principles of DHRA;316
8.7.1.1.1;7.1.1.1 Assumed Conditions;316
8.7.1.1.2;7.1.1.2 Composition and Functions;317
8.7.1.1.3;7.1.1.3 Core Mechanism;319
8.7.1.1.4;7.1.1.4 Robust Control and Problem Avoidance;320
8.7.1.1.5;7.1.1.5 Iterative Convergence;321
8.7.1.2;7.1.2 Goals and Effects of DHR;321
8.7.1.2.1;7.1.2.1 Killing Four Birds with One Stone;322
8.7.1.2.2;7.1.2.2 Dynamic Variability of the Apparent Structure;322
8.7.1.2.3;7.1.2.3 Equivalent to TRA with the Superposed-State Authentication Function;323
8.7.1.2.4;7.1.2.4 Metastable Scenarios and DRS Isomorphism;324
8.7.1.2.5;7.1.2.5 The Uncertainty Attribute;325
8.7.1.2.6;7.1.2.6 Coding Theory and Security Measurement;325
8.7.1.2.7;7.1.2.7 Endogenous Security Mechanism and Integrated Defense;326
8.7.1.2.8;7.1.2.8 Problem Avoidance and Problem Zeroing;327
8.7.1.3;7.1.3 Typical DHR Architecture;328
8.7.1.4;7.1.4 Atypical DHR Architecture;332
8.7.2;7.2 The Attack Surface of DHR;334
8.7.3;7.3 Functionality and Effectiveness;336
8.7.3.1;7.3.1 Creating a Cognition Dilemma for the Target Object;336
8.7.3.2;7.3.2 DFI to Present Uncertainty;337
8.7.3.3;7.3.3 Making It Difficult to Exploit the Loopholes of the Target Object;337
8.7.3.4;7.3.4 Increasing the Uncertainty for an Attack Chain;338
8.7.3.5;7.3.5 Increasing the Difficulty for MR Escape;339
8.7.3.6;7.3.6 Independent Security Gain;340
8.7.3.7;7.3.7 Strong Correlation Between the Vulnerability Value and the Environment;340
8.7.3.8;7.3.8 Making It Difficult to Create a Multi-target Attack Sequence;341
8.7.3.9;7.3.9 Measurable Generalized Dynamization;342
8.7.3.10;7.3.10 Weakening the Impact of Homologous Backdoors;342
8.7.4;7.4 Reflections on the Issues Concerned;343
8.7.4.1;7.4.1 Addressing Uncertain Threats with Endogenous Mechanisms;343
8.7.4.2;7.4.2 Reliability and Credibility Guaranteed by the Structural Gain;345
8.7.4.3;7.4.3 New Security-Trustable Methods and Approaches;345
8.7.4.4;7.4.4 Creating a New Demand in a Diversified Market;346
8.7.4.5;7.4.5 The Problem of Super Escape and Information Leaking;347
8.7.5;7.5 Uncertainty: An Influencing Factor;348
8.7.5.1;7.5.1 DHR Endogenous Factors;348
8.7.5.2;7.5.2 DHR-Introduced Factors;351
8.7.5.3;7.5.3 DHR-Combined Factors;351
8.7.5.4;7.5.4 Challenges to a Forced Breakthrough;352
8.7.6;7.6 Analogical Analysis Based on the Coding Theory;353
8.7.6.1;7.6.1 Coding Theory and Turbo Codes;353
8.7.6.2;7.6.2 Analogic Analysis Based on Turbo Encoding;356
8.7.6.2.1;7.6.2.1 Coding Heterogeneity;357
8.7.6.2.2;7.6.2.2 Coding Redundancy;359
8.7.6.2.3;7.6.2.3 Coding OV;361
8.7.6.2.4;7.6.2.4 Decoding and Ruling;361
8.7.6.2.5;7.6.2.5 Codec Dynamics;364
8.7.6.3;7.6.3 Some Insights;367
8.7.6.3.1;7.6.3.1 Randomness and Redundancy Serving as the Core Elements for Solving Cyberspace Security Problems;367
8.7.6.3.2;7.6.3.2 Uncertainty Effect Brought by DHRA;367
8.7.6.3.3;7.6.3.3 Flexibility and Self-restoring Capability of DHRA;368
8.7.6.3.4;7.6.3.4 Insufficiency in Analogical Analysis Using the Turbo Code Model;368
8.7.7;7.7 DHR-Related Effects;369
8.7.7.1;7.7.1 Ability to Perceive Unidentified Threats;369
8.7.7.2;7.7.2 Distributed Environmental Effect;369
8.7.7.3;7.7.3 Integrated Effect;370
8.7.7.4;7.7.4 Architecture-Determined Safety;370
8.7.7.5;7.7.5 Changing the Attack and Defense Game Rules in Cyberspace;371
8.7.7.6;7.7.6 Creating a Loose Ecological Environment;372
8.7.7.6.1;7.7.6.1 “Isomeric and Diversified” Ecology;373
8.7.7.6.2;7.7.6.2 New Ways to Accelerate Product Maturity;373
8.7.7.6.3;7.7.6.3 Self-controllable Complementary Form;373
8.7.7.6.4;7.7.6.4 Creating an Integrated Operating Environment;374
8.7.7.7;7.7.7 Restricted Application;374
8.7.7.7.1;7.7.7.1 Micro-synchronous Low-Time-Delay Operating Environment;375
8.7.7.7.2;7.7.7.2 Time-Delay-Constrained Scenarios That Cannot Be Corrected;375
8.7.7.7.3;7.7.7.3 Lack of a Normalizable Input/Output Interface;375
8.7.7.7.4;7.7.7.4 Lack of Heterogeneous Hardware/Software Resources;376
8.7.7.7.5;7.7.7.5 “Blackout” in Software Update;376
8.7.7.7.6;7.7.7.6 Cost-Sensitive Area;376
8.7.7.7.7;7.7.7.7 Concerns Regarding the Highly Robust Software Architecture;377
8.7.7.7.8;7.7.7.8 Issue of Ruling;377
8.7.8;References;378
9;Part II;379
9.1;Chapter 8: Original Meaning and Vision of Mimic Defense;380
9.1.1;8.1 Mimic Disguise and Mimic Defense;380
9.1.1.1;8.1.1 Biological Mimicry;380
9.1.1.2;8.1.2 Mimic Disguise;382
9.1.1.3;8.1.3 Two Basic Security Problems and Two Severe Challenges;384
9.1.1.4;8.1.4 An Entry Point: The Vulnerability of an Attack Chain;386
9.1.1.5;8.1.5 Build the Mimic Defense;387
9.1.1.6;8.1.6 Original Meaning of Mimic Defense;391
9.1.2;8.2 Mimic Computing and Endogenous Security;393
9.1.2.1;8.2.1 The Plight of HPC Power Consumption;393
9.1.2.2;8.2.2 Original Purpose of Mimic Calculation;394
9.1.2.3;8.2.3 Vision of Mimic Calculation;395
9.1.2.4;8.2.4 Variable Structure Calculation and Endogenous Security;399
9.1.3;8.3 Vision of Mimic Defense;400
9.1.3.1;8.3.1 Reversing the Easy-to-Attack and Hard-to-Defend Status;401
9.1.3.2;8.3.2 A Universal Structure and Mechanism;403
9.1.3.3;8.3.3 Separation of Robust Control and Service Functions;403
9.1.3.4;8.3.4 Unknown Threat Perception;404
9.1.3.5;8.3.5 A Diversified Eco-environment;405
9.1.3.6;8.3.6 Achievement of Multi-dimensional Goals;406
9.1.3.7;8.3.7 Reduce the Complexity of Security Maintenance;407
9.1.4;References;408
9.2;Chapter 9: The Principle of Cyberspace Mimic Defense;409
9.2.1;9.1 Overview;409
9.2.1.1;9.1.1 Core Ideology;410
9.2.1.2;9.1.2 Eradicating the Root Cause for Cyber Security Problems;411
9.2.1.3;9.1.3 Biological Immunity and Endogenous Security;412
9.2.1.3.1;9.1.3.1 Non-specific Immunity;413
9.2.1.3.2;9.1.3.2 Specific Immunity;414
9.2.1.3.3;9.1.3.3 Non-prior-Knowledge-Reliant Defense;415
9.2.1.3.4;9.1.3.4 Endogenous Security;415
9.2.1.4;9.1.4 Non-specific Surface Defense;417
9.2.1.5;9.1.5 Integrated Defense;417
9.2.1.6;9.1.6 GRC and the Mimic Structure;418
9.2.1.7;9.1.7 Goals and Expectations;419
9.2.1.7.1;9.1.7.1 Development Goals;419
9.2.1.7.2;9.1.7.2 Technical Expectations;423
9.2.1.8;9.1.8 Potential Application Targets;424
9.2.2;9.2 Cyberspace Mimic Defense;426
9.2.2.1;9.2.1 Underlying Theories and Basic Principles;428
9.2.2.1.1;9.2.1.1 FE Common Sense and the TRA;431
9.2.2.1.2;9.2.1.2 DHR Architecture;431
9.2.2.1.3;9.2.1.3 Security Effects Brought About by Endogenous Mechanisms;433
9.2.2.2;9.2.2 Mimic Defense System;434
9.2.2.2.1;9.2.2.1 The Main Concepts and Core Mechanisms of CMD;436
9.2.2.2.2;9.2.2.2 CMD Model;448
9.2.2.3;9.2.3 Basic Features and Core Processes;449
9.2.2.4;9.2.4 Connotation and Extension Technologies;455
9.2.2.4.1;9.2.4.1 Connotation Technologies;455
9.2.2.4.2;9.2.4.2 Extension Technologies;456
9.2.2.5;9.2.5 Summary and Induction;457
9.2.2.6;9.2.6 Discussions of the Related Issues;459
9.2.2.6.1;9.2.6.1 CMD Level Based on the Attack Effect;459
9.2.2.6.2;9.2.6.2 Measurement Based on Reliability Theories and Test Methods;460
9.2.2.6.3;9.2.6.3 Security Situation Monitoring of the Target System;461
9.2.2.6.4;9.2.6.4 Contrast Verification;462
9.2.2.6.5;9.2.6.5 Information Security Effect;462
9.2.2.6.6;9.2.6.6 Mimic Defense and Mimic Computation;463
9.2.2.6.7;9.2.6.7 Unknown Threat Detection Devices;464
9.2.2.6.8;9.2.6.8 “Halt-Restart” Bumps;465
9.2.2.6.9;9.2.6.9 Standby Cooperative Attacks and External Command Disturbances;465
9.2.2.6.10;9.2.6.10 Superimposable and Iterative;466
9.2.2.6.11;9.2.6.11 Granularity of the Target Object;466
9.2.2.6.12;9.2.6.12 Natural Scenarios of DHR;467
9.2.2.6.13;9.2.6.13 About the Side Channel Attack;467
9.2.3;9.3 Structural Representation and Mimic Scenarios;468
9.2.3.1;9.3.1 Uncertain Characterization of the Structure;468
9.2.3.2;9.3.2 Mimic Scenario Creation;470
9.2.3.3;9.3.3 Typical Mimic Scenarios;471
9.2.4;9.4 Mimic Display;473
9.2.4.1;9.4.1 Typical Modes of Mimic Display;473
9.2.4.2;9.4.2 Considerations of the MB Credibility;476
9.2.5;9.5 Anti-attack and Reliability Analysis;478
9.2.5.1;9.5.1 Overview;478
9.2.5.2;9.5.2 Anti-attack and Reliability Models;479
9.2.5.3;9.5.3 Anti-attack Analysis;483
9.2.5.3.1;9.5.3.1 Analysis of CMD’s Resistance Against the General DM/CM Attacks;501
9.2.5.3.2;9.5.3.2 Anti-special Attack Analysis of the CMD System;505
9.2.5.3.3;9.5.3.3 Summary of the Anti-attack Analysis;514
9.2.5.4;9.5.4 Reliability Analysis;518
9.2.5.5;9.5.5 Conclusion;525
9.2.6;9.6 Differences Between CMD and HIT (Heterogeneous Intrusion Tolerance);526
9.2.6.1;9.6.1 Major Differences;526
9.2.6.2;9.6.2 Prerequisites and Functional Differences;528
9.2.6.3;9.6.3 Summary;529
9.2.7;References;530
9.3;Chapter 10: Engineering and Implementation of Mimic Defense;532
9.3.1;10.1 Basic Conditions and Constraints;532
9.3.1.1;10.1.1 Basic Conditions;532
9.3.1.2;10.1.2 Constraints;533
9.3.2;10.2 Main Realization Mechanisms;534
9.3.2.1;10.2.1 Structural Effect and Functional Convergence Mechanism;535
9.3.2.2;10.2.2 One-Way or Unidirectional Connection Mechanism;535
9.3.2.3;10.2.3 Policy and Schedule Mechanism;536
9.3.2.4;10.2.4 Mimic Ruling Mechanism;537
9.3.2.5;10.2.5 Negative Feedback Control Mechanism;537
9.3.2.6;10.2.6 Input Allocation and Adaptation Mechanism;538
9.3.2.7;10.2.7 Output Agency and Normalization Mechanism;538
9.3.2.8;10.2.8 Sharding/Fragmentation Mechanism;539
9.3.2.9;10.2.9 Randomization/Dynamization/Diversity Mechanism;539
9.3.2.10;10.2.10 Virtualization Mechanism;540
9.3.2.11;10.2.11 Iteration and Superposition Mechanism;541
9.3.2.12;10.2.12 Software Fault Tolerance Mechanism;542
9.3.2.13;10.2.13 Dissimilarity Mechanism;543
9.3.2.14;10.2.14 Reconfiguration Mechanism;544
9.3.2.15;10.2.15 Executor’s Cleaning and Recovery Mechanism;544
9.3.2.16;10.2.16 Diversified Compilation Mechanism;546
9.3.2.17;10.2.17 Mimic Structure Programming;547
9.3.3;10.3 Major Challenges to Engineering Implementation;548
9.3.3.1;10.3.1 Best Match of Function Intersection;548
9.3.3.2;10.3.2 Complexity of Multimode Ruling;549
9.3.3.3;10.3.3 Service Turbulence;550
9.3.3.4;10.3.4 The Use of Open Elements;551
9.3.3.5;10.3.5 Execution Efficiency of Mimic Software;552
9.3.3.6;10.3.6 Diversification of Application Programs;553
9.3.3.7;10.3.7 Mimic Defense Interface Configuration;555
9.3.3.7.1;10.3.7.1 Route Forwarding Based on Mimic Defense;555
9.3.3.7.2;10.3.7.2 Mimic Defense-Based Web Access Server;555
9.3.3.7.3;10.3.7.3 File Storage System Based on Mimic Defense;556
9.3.3.7.4;10.3.7.4 Mimic Defense-Based Domain Name Resolution;556
9.3.3.7.5;10.3.7.5 Mimic Defense-Based Gun Control System;557
9.3.3.8;10.3.8 Version Update;557
9.3.3.9;10.3.9 Loading of Non-cross-Platform Application;558
9.3.3.10;10.3.10 Re-synchronization and Environment Reconstruction;559
9.3.3.11;10.3.11 Simplifying Complexity of Heterogeneous Redundancy Realization;560
9.3.3.11.1;10.3.11.1 Commercial Obstacles to Heterogeneous Redundancy;560
9.3.3.11.2;10.3.11.2 Locking the Robustness of the Service;561
9.3.3.11.3;10.3.11.3 Achieving Layered Heterogeneous Redundancy;562
9.3.3.11.4;10.3.11.4 SGX and the Protection of Heterogeneous Redundant Code and Data;562
9.3.3.11.5;10.3.11.5 Avoiding the “Absolutely Trustworthy” Trap of SGX;563
9.3.4;10.4 Testing and Evaluation of Mimic Defense;564
9.3.4.1;10.4.1 Analysis of Mimic Defense Effects;564
9.3.4.1.1;10.4.1.1 Definitive Defense Effect Within the Interface;564
9.3.4.1.2;10.4.1.2 Uncertain Defense Effect on or Outside the Interface;565
9.3.4.1.3;10.4.1.3 Uncertain Defense Effect Against Front Door Problems;565
9.3.4.1.4;10.4.1.4 Uncertain Social Engineering Effects;566
9.3.4.2;10.4.2 Reference Perimeter of Mimic Defense Effects;567
9.3.4.2.1;10.4.2.1 Ideal Effects of Mimic Defense;568
10.4.2.2 Reference Range of Defense Effect;568
10.4.3 Factors to Be Considered in Mimic Defense Verification and Test;570
10.4.3.1 Background of Testing;571
10.4.3.2 Principles of Testing;572
10.4.3.3 Major Testing Indicators;574
10.4.3.4 Considerations of Test Methods;579
10.4.3.5 Qualitative Analysis of Defense Effectiveness;582
10.4.4 Reflections on Quasi-stealth Evaluation;582
10.4.5 Mimic Ruling-Based Measurable Review;583
10.4.6 Mimic Defense Benchmark Function Experiment;585
10.4.7 Attackers’ Perspective;593
10.4.7.1 Mining or Setting up Vulnerabilities/Backdoors in the Mimic Interface;593
10.4.7.2 Creating a Homologous Ecosystem with the Development Tools and the Open-Source Community Model;594
10.4.7.3 Black-Box Operations Using “Irreplaceable” Advantage;594
10.4.7.4 Developing Attack Codes that Are Not Dependent on the Environment;594
10.4.7.5 Coordinated Operation Under Non-cooperative Conditions Using Input Sequence;595
10.4.7.6 Trying to Bypass the Mimic Interface;595
10.4.7.7 Attacking the Mimic Control Aspect;595
10.4.7.8 DDoS Brute Force Attacks;596
10.4.7.9 Social Engineering-Based Attacks;596
10.4.7.10 Directly Cracking Access Command or Password;596
References;597
Chapter 11: Foundation and Cost of Mimic Defense;598
11.1 Foundation for Mimic Defense Realization;598
11.1.1 Era of Weak Correlation of Complexity to Cost;598
11.1.2 High Efficiency Computing and Heterogeneous Computing;599
11.1.3 Diversified Ecological Environment;601
11.1.4 Standardization and Open Architecture;602
11.1.5 Virtualization Technology;603
11.1.6 Reconfiguration and Reorganization;604
11.1.7 Distributed and Cloud Computing Service;605
11.1.8 Dynamic Scheduling;607
11.1.9 Feedback Control;608
11.1.10 Quasi-Trusted Computing;608
11.1.11 Robust Control;609
11.1.12 New Developments of System Structure Technologies;609
11.2 Analysis of Traditional Technology Compatibility;610
11.2.1 Naturally Accepting Traditional Security Technologies;610
11.2.2 Naturally Carrying Forward the Hardware Technological Advances;612
11.2.3 Strong Correlation to Software Technological Development;613
11.2.4 Depending on the Open and Plural Ecological Environment;613
11.3 Cost of Mimic Defense Implementation;613
11.3.1 Cost of Dynamicity;614
11.3.2 Cost of Heterogeneity;614
11.3.3 Cost of Redundancy;616
11.3.4 Cost of Cleanup and Reconfiguration;616
11.3.5 Cost of Virtualization;617
11.3.6 Cost of Synchronization;617
11.3.7 Cost of Ruling;618
11.3.7.1 Synchronous Judgment;619
11.3.7.2 Agreed Output;619
11.3.7.3 First Come, First Output;619
11.3.7.4 Regular Judgment;619
11.3.7.5 Mask Decision;620
11.3.7.6 Normalized Pretreatment;620
11.3.8 Cost of Input/Output Agency;620
11.3.9 Cost of One-Way Connection;621
11.4 Scientific and Technological Issues to Be Studied and Solved;622
11.4.1 Scientific Issues Needing Urgent Study in the CMD Field;622
11.4.2 Engineering and Technical Issues Needing Urgent Solution in the CMD Field;623
11.4.2.1 Dissimilarity Design and Screening Theory;623
11.4.2.2 Pluralistic and Diversified Engineering Issues;624
11.4.2.3 Assessing the Security Impact of the “Homologous” Component Vulnerability on the DHR Architecture;625
11.4.2.4 How to Establish a System Design Reference Model;625
11.4.2.5 How to Prevent Standby Attacks;625
11.4.2.6 Mimic Ruling;626
11.4.2.7 Protection of the Mimic Control;627
11.4.2.8 Mimic Structural Design Technology;628
11.4.2.9 Mimic Construction Implementation Technology;629
11.4.3 Defense Effect Test and Evaluation;630
11.4.4 Comprehensive Use of Defense Capability;631
11.4.5 Issues Needing Continuous Attention;632
11.4.6 Emphasizing the Natural and Inspired Solutions;632
References;633
Chapter 12: Examples of Mimic Defense Application;634
12.1 Mimic Router Verification System;634
12.1.1 Threat Design;634
12.1.2 Designing Idea;635
12.1.3 DHR-Based Router Mimic Defense Model;637
12.1.4 System Architecture Design;639
12.1.4.1 Overall Framework;639
12.1.4.2 Function Unit Design;640
12.1.5 Mimic Transformation of the Existing Network;645
12.1.6 Feasibility and Security Analysis;646
12.2 Network Storage Verification System;647
12.2.1 Overall Plan;647
12.2.2 Arbiter;649
12.2.3 Metadata Server Cluster;650
12.2.4 Distributed Data Server;650
12.2.5 The Client;651
12.2.6 System Security Test and Result Analysis;652
12.3 Mimic-Structured Web Server Verification System;654
12.3.1 Threat Analysis;654
12.3.2 Designing Idea;655
12.3.3 System Architecture Design;656
12.3.4 Functional Unit Design;658
12.3.4.1 Request Dispatching and Balancing (RDB) Module;658
12.3.4.2 Dissimilar Redundant Response Voter;660
12.3.4.3 Dynamically Executing Scheduler;660
12.3.4.4 Dissimilar Virtual Web Server Pool;662
12.3.4.5 Primary Controller;662
12.3.4.6 Database Instruction Labelling (DIL) Module;663
12.3.5 Prototype Design and Realization;665
12.3.6 Attack Difficulty Evaluation;666
12.3.7 Cost Analysis;671
12.4 Cloud Computing and Virtualization Mimic Construction;671
12.4.1 Basic Layers of Cloud Computing;672
12.4.2 Cloud Computing Architecture Layers;672
12.4.3 Virtualized DHR Construction;674
12.5 Application Consideration for Software Design;675
12.5.1 Effect of Randomly Invoking Mobile Attack Surface;676
12.5.2 Guard Against Hidden Security Threats from Third Parties;676
12.5.3 Typical Mimic Defense Effects;676
12.6 Commonality Induction of System-Level Applications;677
References;677
Chapter 13: Testing and Evaluation of the Mimic Defense Principle Verification System;679
13.1 Mimic Defense Principle Verification in the Router Environment;680
13.1.1 Design of Test Methods for Mimic-Structured Routers;680
13.1.2 Basic Router Function and Performance Test;682
13.1.2.1 Routing Protocol Functional Test;682
13.1.2.2 Forwarding Performance Comparison Test;683
13.1.3 Test of the Mimic Defense Mechanism and Result Analysis;684
13.1.3.1 Data Transformation Function Test;684
13.1.3.2 Data Stream Fingerprint Function Test;686
13.1.3.3 Protocol Executor Random Display Test;687
13.1.3.4 Protocol Executor Routing Abnormity Monitoring and Handling Test;688
13.1.3.5 Endogenous Flow Interception Test;689
13.1.4 Defense Effect Test and Result Analysis;690
13.1.4.1 Attack Models and Testing Scenarios;691
13.1.4.2 System Information Scanning Test;691
13.1.4.3 Mimic Interface Vulnerability Detection Test;693
13.1.4.4 Test of Difficulty in Vulnerability Exploitation Within Mimic Interface;693
13.1.4.5 Test of Difficulty in Utilizing Backdoors in the Mimic Interface;695
13.1.5 Test Summary of Mimic-Structured Router;698
13.2 Mimic Defense Principle Verification in the Web Server Environment;698
13.2.1 Design of Test Methods for Mimic-Structured Web Servers;698
13.2.1.1 Test Process Design;699
13.2.1.2 Test Environment Setting;700
13.2.2 Basic Functional Test and Compatibility Test for Web Servers;700
13.2.2.1 HTTP Protocol Function Test;701
13.2.2.2 Page Compatibility Comparison Test;702
13.2.3 Mimic Defense Mechanism Test and Result Analysis;703
13.2.4 Defense Effect Test and Result Analysis;704
13.2.4.1 Scanning Detection Test;704
13.2.4.2 Operating System Security Test;704
13.2.4.3 Data Security Test;707
13.2.4.4 Anti-Trojan Test;707
13.2.4.5 Web Application Attack Test;710
13.2.5 Web Server Performance Test;710
13.2.5.1 Benchmark Web Server Performance Testing;712
13.2.5.2 DIL Module Performance Test;713
13.2.5.3 System Overall Performance Test;713
13.2.6 Summary of the Web Principle Verification System Test;714
13.3 Test Conclusions and Prospects;714
References;717
Chapter 14: Application Demonstration and Current Network Testing of Mimic Defense;718
14.1 Overview;718
14.2 Application Demonstration of the Mimic-Structured Router;719
14.2.1 Status Quo of the Pilot Network;720
14.2.1.1 Threat Analysis;720
14.2.1.2 Application Scenario;720
14.2.1.3 Product Plan;721
14.2.1.4 Application Deployment;724
14.2.1.5 Cost Analysis;726
14.2.1.6 Application Outcome;728
14.2.2 Current Network Testing;728
14.2.2.1 Testing Purpose;728
14.2.2.2 Testing Plan;728
14.2.2.3 Testing and Evaluation Items;729
14.2.2.4 Current Network Testing;729
14.2.2.5 Testing and Evaluation;731
14.3 Mimic-Structured Web Server;731
14.3.1 Application Demonstration;731
14.3.1.1 Application of the Mimic-Structured Web Server (MSWS) in a Financial Enterprise;731
14.3.1.2 Application of the MSWS on a Government Website;734
14.3.1.3 Application of the Mimic-Structured Web Virtual Host (MSWVH) in Gianet Fast Cloud (GFC);740
14.3.2 Current Network Testing;745
14.3.2.1 Testing of the MSWS;745
14.3.2.2 Testing of the MSWVH;752
14.4 Mimic-Structured Domain Name Server (MSDN Server);756
14.4.1 Application Demonstration;756
14.4.1.1 Threat Analysis;756
14.4.1.2 Application Scenario;758
14.4.1.3 Product Plan;758
14.4.1.4 Application Deployment;761
14.4.1.5 Cost Analysis;762
14.4.1.6 Application Effect;763
14.4.2 Testing and Evaluation;764
14.4.2.1 CUHN;764
14.4.2.2 Gianet;767
14.5 Conclusions and Prospects;769