E-book, English, 407 pages
Sadeghi / Naccache, Towards Hardware-Intrinsic Security: Foundations and Practice
1st edition 2010
ISBN: 978-3-642-14452-3
Publisher: Springer
Format: PDF
Copy protection: PDF watermark
Series: Information Security and Cryptography
Hardware-intrinsic security is a young field concerned with secure secret-key storage. By generating secret keys from the intrinsic properties of the silicon, e.g., from intrinsic Physical Unclonable Functions (PUFs), no permanent secret-key storage is required, and the key is present in the device only for a minimal amount of time. The field is extending to hardware-based security primitives and protocols such as block ciphers and stream ciphers entangled with the hardware, thus improving IC security. At the application level, there is growing interest in hardware security for RFID systems and the accompanying system architectures. This book brings together contributions from researchers and practitioners in academia and industry, an interdisciplinary group with backgrounds in physics, mathematics, cryptography, coding theory, and processor theory. It will serve as important background material for students and practitioners, and it will stimulate much further research and development.
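The key-storage idea sketched above, developed in the PUF and fuzzy-extractor chapters listed below, can be illustrated with a toy code-offset ("fuzzy commitment") construction: at enrollment a randomly chosen key is bound to the device's noisy PUF response via public helper data, and in the field the key is re-derived from a fresh readout, so no key ever sits in non-volatile memory. The following Python sketch is illustrative only and not taken from the book; the 128-bit key length, the repetition factor of 9, and the 5% bit-error rate are assumptions chosen for the demo.

```python
# Toy code-offset ("fuzzy commitment") sketch; illustrative assumptions only,
# not the constructions described in the book.
import hashlib
import random

N_KEY_BITS = 128   # assumed length of the key material to protect
REP = 9            # repetition factor (odd); majority vote corrects up to 4 flipped bits per block
NOISE_RATE = 0.05  # assumed PUF bit-error probability between readouts

def puf_readout(reference, noise_rate=NOISE_RATE):
    """Simulate a noisy re-evaluation of an intrinsic PUF (e.g., SRAM power-up values)."""
    return [bit ^ (1 if random.random() < noise_rate else 0) for bit in reference]

def enroll(response):
    """One-time enrollment: bind a fresh random key to the PUF response via public helper data."""
    key_bits = [random.randint(0, 1) for _ in range(N_KEY_BITS)]
    codeword = [b for b in key_bits for _ in range(REP)]     # repetition encoding
    helper = [c ^ r for c, r in zip(codeword, response)]     # helper data may be stored publicly
    key = hashlib.sha256(bytes(key_bits)).digest()           # hash as a simple key derivation step
    return helper, key

def reconstruct(helper, noisy_response):
    """In-field reconstruction: recover the key from helper data and a fresh noisy readout."""
    noisy_codeword = [h ^ r for h, r in zip(helper, noisy_response)]
    key_bits = [int(sum(noisy_codeword[i * REP:(i + 1) * REP]) > REP // 2)  # majority vote per block
                for i in range(N_KEY_BITS)]
    return hashlib.sha256(bytes(key_bits)).digest()

if __name__ == "__main__":
    device_response = [random.randint(0, 1) for _ in range(N_KEY_BITS * REP)]  # the "silicon fingerprint"
    helper_data, enrolled_key = enroll(device_response)
    rederived_key = reconstruct(helper_data, puf_readout(device_response))
    print("keys match:", enrolled_key == rederived_key)
```

With the assumed parameters the majority vote recovers the key with very high probability; real designs replace the repetition code with stronger error-correcting codes and add privacy amplification, as discussed in the fuzzy-extractor material in the table of contents below.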
Authors/Editors
Further Information & Material
1;Foreword;5
2;Contents;8
3;List of Contributors;11
4;Part I Physically Unclonable Functions (PUFs);15
4.1;Physically Unclonable Functions: A Study on the State of the Art and Future Research Directions;16
4.1.1;Roel Maes and Ingrid Verbauwhede;16
4.1.1.1;1 Introduction;16
4.1.1.2;2 PUF Terminology and Measures;17
4.1.1.2.1;2.1 Challenges and Responses;17
4.1.1.2.2;2.2 Inter- and Intra-distance Measures;18
4.1.1.2.3;2.3 Environmental Effects;19
4.1.1.3;3 PUF Instantiations;19
4.1.1.3.1;3.1 Non-electronic PUFs;20
4.1.1.3.2;3.2 Analog Electronic PUFs;23
4.1.1.3.3;3.3 Delay-Based Intrinsic PUFs;25
4.1.1.3.4;3.4 Memory-Based Intrinsic PUFs;29
4.1.1.3.5;3.5 PUF Concepts;32
4.1.1.4;4 PUF Properties;34
4.1.1.4.1;4.1 Property Description;34
4.1.1.4.2;4.2 Property Check;36
4.1.1.4.3;4.3 Least Common Subset of PUF Properties;37
4.1.1.5;5 PUF Application Scenarios;41
4.1.1.5.1;5.1 System Identification;41
4.1.1.5.2;5.2 Secret Key Generation;42
4.1.1.5.3;5.3 Hardware-Entangled Cryptography;42
4.1.1.6;6 PUF Discussions and Some Open Questions;43
4.1.1.6.1;6.1 Predictability Versus Implementation Size;43
4.1.1.6.2;6.2 Formalization of PUF Properties;44
4.1.1.6.3;6.3 Reporting on PUF Implementation Results;45
4.1.1.7;7 Conclusion;46
4.1.1.8;References;47
4.2;Hardware Intrinsic Security from Physically Unclonable Functions;51
4.2.1;Helena Handschuh, Geert-Jan Schrijen, and Pim Tuyls;51
4.2.1.1;1 Introduction;51
4.2.1.2;2 Rethinking Secure Key Storage Mechanisms;53
4.2.1.2.1;2.1 Limitations of Current Key Storage Mechanisms;53
4.2.1.2.2;2.2 A Radical New Approach to Secure Key Storage;54
4.2.1.3;3 Hardware Intrinsic Security;55
4.2.1.3.1;3.1 Physically Unclonable Functions;55
4.2.1.3.2;3.2 Examples of PUFs;56
4.2.1.3.3;3.3 Secure Key Storage Based on PUFs;59
4.2.1.4;4 Quality of a PUF;60
4.2.1.4.1;4.1 Reliability;61
4.2.1.4.2;4.2 Security;62
4.2.1.5;5 Conclusions;63
4.2.1.6;References;64
4.3;From Statistics to Circuits: Foundations for Future Physical Unclonable Functions;66
4.3.1;Inyoung Kim, Abhranil Maiti, Leyla Nazhandali, Patrick Schaumont, Vignesh Vivekraja, and Huaiye Zhang;66
4.3.1.1;1 Introduction;66
4.3.1.2;2 Components and Quality Factors of a PUF Design;68
4.3.1.2.1;2.1 Components of a PUF;68
4.3.1.2.2;2.2 PUF Quality Factors;69
4.3.1.2.3;2.3 Sources of CMOS Variability and Compensation of Unwanted Variability;70
4.3.1.3;3 Circuit-Level Optimization of PUF;72
4.3.1.3.1;3.1 Methodology;73
4.3.1.3.2;3.2 Background: Operating Voltage and Body Bias;73
4.3.1.3.3;3.3 Effect of Operating Voltage and Body Bias on PUF;75
4.3.1.4;4 Architecture-Level Optimization of PUF;76
4.3.1.4.1;4.1 Compensation of Environmental Effects;77
4.3.1.4.2;4.2 Compensation of Correlated Process Variations;78
4.3.1.5;5 Identity Mapping and Testing;79
4.3.1.5.1;5.1 Statistical Preliminaries;80
4.3.1.5.2;5.2 A New Test Statistic: Q;82
4.3.1.5.3;5.3 Experimental Results;85
4.3.1.5.4;5.4 Compensation of Environmental Effects;86
4.3.1.5.5;5.5 Open Challenges;87
4.3.1.6;6 Conclusions;87
4.3.1.7;References;87
4.4;Strong PUFs: Models, Constructions, and Security Proofs;90
4.4.1;Ulrich Rührmair, Heike Busch, and Stefan Katzenbeisser;90
4.4.1.1;1 Introduction;90
4.4.1.2;2 Implementations of Strong Physical Unclonable Functions;91
4.4.1.3;3 Physical Unclonable Functions: Toward a Formal Definition;93
4.4.1.3.1;3.1 Physical One-Way Functions;93
4.4.1.3.2;3.2 Physical Unclonable Functions;95
4.4.1.3.3;3.3 Physical Random Functions;97
4.4.1.4;4 Alternative Attack Models;97
4.4.1.4.1;4.1 Semi-formal Models for Strong PUFs;98
4.4.1.4.2;4.2 The Digital Attack Model;100
4.4.1.5;5 Identification Schemes Based on Strong PUFs;101
4.4.1.5.1;5.1 PUF-Based Identification Schemes;101
4.4.1.5.2;5.2 Security of PUF-Based Identification in the Digital Attack Model;102
4.4.1.6;6 Conclusions;105
4.4.1.7;References;105
5;Part II Hardware-Based Cryptography;108
5.1;Leakage Resilient Cryptography in Practice;109
5.1.1;François-Xavier Standaert, Olivier Pereira, Yu Yu, Jean-Jacques Quisquater, Moti Yung, and Elisabeth Oswald;109
5.1.1.1;1 Introduction;109
5.1.1.2;2 Background;112
5.1.1.2.1;2.1 Notations;112
5.1.1.2.2;2.2 Definition of a Leakage Function;112
5.1.1.3;3 Unpredictability vs. Indistinguishability;114
5.1.1.4;4 Physical Assumptions: Local vs. Global Approach;117
5.1.1.4.1;4.1 Analogy with Classical Cryptanalysis;120
5.1.1.5;5 Leakage Resilient PRGs;120
5.1.1.5.1;5.1 On the Difficulty of Modeling a Leakage Function;120
5.1.1.5.2;5.2 Theoretical Security Analysis and Limitations;122
5.1.1.5.3;5.3 Proving Leakage Resilience with Random Oracles;123
5.1.1.5.4;5.4 Practical Security Analysis;126
5.1.1.6;6 Initialization Issues;129
5.1.1.6.1;6.1 Breaking [34] with a Standard DPA;129
5.1.1.6.2;6.2 Secure Initialization Process;130
5.1.1.6.3;6.3 A More Elegant (and Standard) Construction;131
5.1.1.6.4;6.4 Remark on the Impossibility of a Secure Initialization Process with an Adaptive Selection of the Leakages;132
5.1.1.7;7 Generalization to PRFs;134
5.1.1.8;8 Remark on the Impossibility of Proving the Leakage Resilience for the Forward Secure PRG of Fig. 6a in the Standard Model;136
5.1.1.9;9 Open Problems;136
5.1.1.10;10 Further Details;138
5.1.1.10.1;10.1 Security Metric;138
5.1.1.10.2;10.2 Proof of Theorem 1;138
5.1.1.10.3;10.3 Proof of Theorem 2;140
5.1.1.11;References;142
5.2;Memory Leakage-Resilient Encryption Based on Physically Unclonable Functions;145
5.2.1;Frederik Armknecht, Roel Maes, Ahmad-Reza Sadeghi, Berk Sunar, and Pim Tuyls;145
5.2.1.1;1 Introduction;145
5.2.1.2;2 Related Work;147
5.2.1.3;3 Memory Attacks;148
5.2.1.4;4 Preliminaries;149
5.2.1.5;5 Physically Unclonable Functions;150
5.2.1.6;6 Pseudorandom Functions Based on PUFs;152
5.2.1.7;7 Encrypting with PUF-(w)PRFs;156
5.2.1.7.1;7.1 General Thoughts;156
5.2.1.7.2;7.2 A Stream Cipher Based on PUF-PRFs;157
5.2.1.7.3;7.3 A Block Cipher Based on PUF-PRFs;159
5.2.1.8;8 SRAM PRFs;163
5.2.1.8.1;8.1 Physical Implementation Details of Static Random Access Memory (SRAM);164
5.2.1.8.2;8.2 The SRAM PUF Construction;164
5.2.1.8.3;8.3 SRAM PUF Parameters and Experimental Validation;166
5.2.1.8.4;8.4 From SRAM PUF to SRAM PRF;167
5.2.1.8.5;8.5 A Concrete Block Cipher Realization Based on SRAM-PRFs;170
5.2.1.9;9 Conclusions;171
5.2.1.10;References;171
6;Part III Hardware Attacks;175
6.1;Hardware Trojan Horses;176
6.1.1;Mohammad Tehranipoor and Berk Sunar;176
6.1.1.1;1 What Is the Untrusted Manufacturer Problem?;176
6.1.1.2;2 Hardware Trojans;178
6.1.1.3;3 A Taxonomy of Hardware Trojans;179
6.1.1.4;4 A High-Level Attack: Shadow Circuits;181
6.1.1.5;5 Trojan Detection Methodologies;182
6.1.1.5.1;5.1 Trojan Detection Using Side-Channel Signal Analysis;182
6.1.1.5.2;5.2 Trojan Activation Methods;187
6.1.1.6;6 Design-for-Hardware-Trust Techniques;189
6.1.1.7;7 Circuit Obfuscation as a Countermeasure;193
6.1.1.8;References;194
6.2;Extracting Unknown Keys from Unknown Algorithms Encrypting Unknown Fixed Messages and Returning No Results;197
6.2.1;Yoo-Jin Baek, Vanessa Gratzer, Sung-Hyun Kim, and David Naccache;197
6.2.1.1;1 Introduction;197
6.2.1.2;2 The Intuition;198
6.2.1.3;3 Notations and Statistical Tools;199
6.2.1.4;4 The Attack;200
6.2.1.4.1;4.1 The Exhaust Routine;200
6.2.1.5;5 Practical Experiments;202
6.2.1.6;6 Implications and Further Research;205
6.2.1.7;References;205
7;Part IV Hardware-Based Policy Enforcement;206
7.1;License Distribution Protocols from Optical Media Fingerprints;207
7.1.1;Ghaith Hammouri, Aykutlu Dana, and Berk Sunar;207
7.1.1.1;1 Introduction;207
7.1.1.2;2 Pits and Lands;209
7.1.1.2.1;2.1 Source of Variation;209
7.1.1.2.2;2.2 Single Location Characterization;211
7.1.1.3;3 Experimental Validation;211
7.1.1.4;4 CD Fingerprinting;215
7.1.1.4.1;4.1 Fuzzy Extractors;216
7.1.1.4.2;4.2 Fingerprint Extraction;217
7.1.1.4.3;4.3 Entropy Estimation and 128-Bit Security;218
7.1.1.5;5 Robustness of the Fingerprint;222
7.1.1.6;6 License Distribution Protocol;222
7.1.1.6.1;6.1 Simple Distribution Protocol;223
7.1.1.6.2;6.2 Secure Reader Protocol;224
7.1.1.6.3;6.3 Online Distribution Protocol;225
7.1.1.7;7 Conclusion;226
7.1.1.8;References;227
7.2;Anti-counterfeiting: Mixing the Physical and the Digital World;229
7.2.1;Darko Kirovski;229
7.2.1.1;1 Introduction;229
7.2.1.1.1;1.1 Classification;230
7.2.1.2;2 Desiderata for Anti-counterfeiting Technologies;230
7.2.1.3;3 Digitizing the Physical World;232
7.2.1.4;4 Applications;234
7.2.1.5;5 Review of Existing Methodologies;234
7.2.1.5.1;5.1 RF-DNA;236
7.2.1.5.2;5.2 Challenge/Response COA Systems;237
7.2.1.6;6 Conclusion;238
7.2.1.7;References;238
8;Part V Hardware Security in Contactless Tokens;240
8.1;Anti-counterfeiting, Untraceability and Other Security Challenges for RFID Systems: Public-Key-Based Protocols and Hardware;241
8.1.1;Yong Ki Lee, Lejla Batina, Dave Singelee, Bart Preneel, and Ingrid Verbauwhede;241
8.1.1.1;1 Introduction;241
8.1.1.2;2 Security and Privacy Requirements;242
8.1.1.2.1;2.1 Security Objectives;242
8.1.1.2.2;2.2 Privacy Objectives;243
8.1.1.2.3;2.3 General Objectives;244
8.1.1.3;3 State of the Art;245
8.1.1.3.1;3.1 Authentication Protocols Based on Private-Key Cryptography;245
8.1.1.3.2;3.2 Authentication Protocols Based on PUFs;246
8.1.1.3.3;3.3 Authentication Protocols Based on Public-Key Cryptography;246
8.1.1.4;4 Untraceable Authentication Protocols Based on ECC;247
8.1.1.4.1;4.1 Notation;247
8.1.1.4.2;4.2 EC-RAC II;247
8.1.1.4.3;4.3 Randomized Schnorr Protocol;248
8.1.1.4.4;4.4 Man-in-the-Middle Attacks;248
8.1.1.5;5 EC-RAC IV;250
8.1.1.6;6 Search Protocol;251
8.1.1.6.1;6.1 Protocol Description;251
8.1.1.6.2;6.2 Search Protocol Analysis;253
8.1.1.6.3;6.3 Combining Authentication Protocols;255
8.1.1.7;7 Implementation;255
8.1.1.7.1;7.1 Overall Architecture;256
8.1.1.7.2;7.2 New MALU Design;256
8.1.1.7.3;7.3 Performance Evaluation;258
8.1.1.8;8 Conclusions;258
8.1.1.9;References;259
8.2;Contactless Security Token: Enhanced Security by Using New Hardware Features in Cryptographic-Based Security Mechanisms;262
8.2.1;Markus Ullmann and Matthias Vögeler;262
8.2.1.1;1 Introduction;262
8.2.1.1.1;1.1 Benefits of Contactless Smart Cards;262
8.2.1.1.2;1.2 Security Limitation of Supposed Security Mechanisms for an Authenticated Connection Establishment Between Terminals and Contactless Cards;262
8.2.1.1.3;1.3 Security Limitation of Device Authentication Protocols Based on Irrevocable Authentication Certificates;264
8.2.1.2;2 Contactless Security Token;266
8.2.1.2.1;2.1 Flexible Display Technology;266
8.2.1.2.2;2.2 Real-Time Clock;267
8.2.1.2.3;2.3 Buttons;268
8.2.1.3;3 Authenticated Connection Establishment;268
8.2.1.3.1;3.1 Password-Based Cryptographic Protocols;268
8.2.1.3.2;3.2 Password Authenticated Connection Establishment (PACE);268
8.2.1.3.3;3.3 Security Token Operation;270
8.2.1.3.4;3.4 Security Analysis of PACE Using Fresh Passwords;270
8.2.1.3.5;3.5 Brute-Force Online-Attacks on Passwords;271
8.2.1.4;4 Secure Time Synchronization;272
8.2.1.4.1;4.1 Time Values;272
8.2.1.4.2;4.2 Time Server-Based Synchronization Protocols;273
8.2.1.4.3;4.3 Security Requirements for Time Synchronization;274
8.2.1.4.4;4.4 Secure Time Synchronization Protocols;275
8.2.1.4.5;4.5 Security and Performance Analysis;277
8.2.1.5;5 Applications;279
8.2.1.5.1;5.1 Authentication of Internet Services;279
8.2.1.6;6 Conclusion;281
8.2.1.7;References;281
8.3;Enhancing RFID Security and Privacy by Physically Unclonable Functions;283
8.3.1;Ahmad-Reza Sadeghi, Ivan Visconti, and Christian Wachsmann;283
8.3.1.1;1 Introduction;283
8.3.1.1.1;1.1 Contribution;284
8.3.1.2;2 High-Level RFID System and Requirement Analysis;285
8.3.1.2.1;2.1 System Model;285
8.3.1.2.2;2.2 Trust and Adversary Model;285
8.3.1.2.3;2.3 Security and Privacy Threats;286
8.3.1.2.4;2.4 Security and Privacy Objectives;286
8.3.1.3;3 Related Work;286
8.3.1.3.1;3.1 Privacy-Preserving RFID Protocols;286
8.3.1.3.2;3.2 RFID Protocols Based on Physically Unclonable Functions;287
8.3.1.3.3;3.3 Privacy Models for RFID;289
8.3.1.4;4 RFID Security and Privacy Model of Vaudenay [67];290
8.3.1.4.1;4.1 General Notation;290
8.3.1.4.2;4.2 Pseudorandom Function (PRF);290
8.3.1.4.3;4.3 Physically Unclonable Function (PUF);291
8.3.1.4.4;4.4 System Model;292
8.3.1.4.5;4.5 Adversary Model;293
8.3.1.4.6;4.6 Definition of Correctness, Security, and Privacy;296
8.3.1.5;5 A PUF-Based Destructive-Private RFID Protocol;297
8.3.1.5.1;5.1 Correctness;298
8.3.1.6;6 Security Analysis;299
8.3.1.6.1;6.1 Tag Authentication;299
8.3.1.6.2;6.2 Destructive Privacy;300
8.3.1.7;7 Conclusion;303
8.3.1.8;References;304
9;Part VI Hardware-Based Security Architectures and Applications;308
9.1;Authentication of Processor Hardware Leveraging Performance Limits in Detailed Simulations and Emulations;309
9.1.1;Daniel Y. Deng, Andrew H. Chan, and G. Edward Suh;309
9.1.1.1;1 Introduction;309
9.1.1.2;2 Threat Model;311
9.1.1.3;3 Authentication Approach;312
9.1.1.4;4 Hardware Design;315
9.1.1.4.1;4.1 Microarchitectural Features;315
9.1.1.4.2;4.2 Checksum Computation;316
9.1.1.4.3;4.3 New Instructions;318
9.1.1.4.4;4.4 Non-determinism;318
9.1.1.5;5 Challenge Program;319
9.1.1.6;6 Evaluation;321
9.1.1.6.1;6.1 Overheads;321
9.1.1.6.2;6.2 Effectiveness;322
9.1.1.6.3;6.3 Deterministic Execution;323
9.1.1.6.4;6.4 Security Discussion;324
9.1.1.7;7 Related Work;326
9.1.1.8;8 Conclusion;327
9.1.1.9;References;327
9.2;Signal Authentication in Trusted Satellite Navigation Receivers;330
9.2.1;Markus G. Kuhn;330
9.2.1.1;1 Introduction;330
9.2.1.1.1;1.1 Environmental Assumptions;331
9.2.1.1.2;1.2 Related Technologies;332
9.2.1.1.3;1.3 Goals;333
9.2.1.2;2 Techniques;334
9.2.1.2.1;2.1 Secret Spreading Sequences;334
9.2.1.2.2;2.2 Individual Receiver Antenna Characteristics;337
9.2.1.2.3;2.3 Consistency with Reference Receivers;337
9.2.1.2.4;2.4 Receiver-Internal Plausibility Tests;339
9.2.1.2.5;2.5 Some Other Ideas;343
9.2.1.3;3 Comparison;344
9.2.1.4;4 Conclusions;346
9.2.1.5;References;346
9.3;On the Limits of Hypervisor- and Virtual Machine Monitor-Based Isolation;348
9.3.1;Loïc Duflot, Olivier Grumelard, Olivier Levillain, and Benjamin Morin;348
9.3.1.1;1 Introduction;348
9.3.1.2;2 Compartmented Systems;349
9.3.1.2.1;2.1 Traditional Architectures and Definition of a Trusted Computing Base;349
9.3.1.2.2;2.2 Attacker Model;350
9.3.1.3;3 Attack Paths;350
9.3.1.3.1;3.1 Taxonomy of Attack Vectors;350
9.3.1.4;4 Design of a DIMM Backdoor;353
9.3.1.4.1;4.1 Overview of DDR DIMM;353
9.3.1.4.2;4.2 Principle of the Backdoor;354
9.3.1.4.3;4.3 Proof of Concept Implementation;354
9.3.1.5;5 Exploitation;358
9.3.1.5.1;5.1 Difficulties;358
9.3.1.5.2;5.2 Use of the Hidden Functions to Access Sensitive Data;359
9.3.1.5.3;5.3 Use of the Backdoor as a Means for Privilege Escalation;361
9.3.1.6;6 Countermeasures;362
9.3.1.7;7 Conclusion and Future Work;363
9.3.1.8;References;363
9.4;Efficient Secure Two-Party Computation with Untrusted Hardware Tokens;366
9.4.1;Kimmo Järvinen, Vladimir Kolesnikov, Ahmad-Reza Sadeghi, and Thomas Schneider;366
9.4.1.1;1 Introduction;366
9.4.1.1.1;1.1 Our Setting, Goals, and Approach;367
9.4.1.1.2;1.2 Envisioned Applications;368
9.4.1.1.3;1.3 Our Contributions and Outline;369
9.4.1.1.4;1.4 Related Work;369
9.4.1.2;2 Preliminaries;370
9.4.1.2.1;2.1 Garbled Circuits (GC);371
9.4.1.3;3 Architecture, System, and Trust Model;372
9.4.1.4;4 Token-Assisted Garbled Circuit Protocols;373
9.4.1.4.1;4.1 Protocols Overview and Security;373
9.4.1.4.2;4.2 Circuit Representation;375
9.4.1.4.3;4.3 GC Creation with Stateful Token (Secure Counter);377
9.4.1.4.4;4.4 GC Creation with Stateless Token (No Counter);378
9.4.1.5;5 Further Optimizations;378
9.4.1.5.1;5.1 Optimizing Memory of Client;379
9.4.1.5.2;5.2 Optimizing Runtime of Token by Caching;379
9.4.1.6;6 Proof-of-Concept Implementation;380
9.4.1.6.1;6.1 Architecture;381
9.4.1.6.2;6.2 Prototype Implementation;382
9.4.1.7;References;383
9.5;Towards Reliable Remote Healthcare Applications Using Combined Fuzzy Extraction;386
9.5.1;Jorge Guajardo, Muhammad Asim, and Milan Petković;386
9.5.1.1;1 Introduction;386
9.5.1.2;2 Remote Patient Monitoring Services and Data Reliability Issues;389
9.5.1.2.1;2.1 Data Reliability Issues;390
9.5.1.3;3 Fuzzy Extractors, PUFs, and Biometrics;391
9.5.1.3.1;3.1 Preliminaries;391
9.5.1.3.2;3.2 Physical Unclonable Functions;393
9.5.1.3.3;3.3 Biometrics;395
9.5.1.3.4;3.4 The Need for Fuzzy Extractors;395
9.5.1.4;4 Combining PUFs and Biometrics;398
9.5.1.4.1;4.1 A Practical Simplification;402
9.5.1.4.2;4.2 Other Variations;403
9.5.1.4.3;4.3 Security and Safety;403
9.5.1.5;5 Conclusions;404
9.5.1.6;References;404