Albert / Tullis | Measuring the User Experience | E-Book | www2.sack.de

E-book, English, 336 pages

Albert / Tullis Measuring the User Experience

Collecting, Analyzing, and Presenting Usability Metrics
1st edition, 2010
ISBN: 978-0-08-055826-4
Publisher: Elsevier Science & Techn.
Format: EPUB
Copy protection: ePub watermark




Measuring the User Experience provides the first single source of practical information enabling usability professionals and product developers to measure the usability of any product: choosing the right metric, applying it, and using the information it reveals effectively. Authors Tullis and Albert organize dozens of metrics into six categories: performance, issues-based, self-reported, web navigation, derived, and behavioral/physiological. They explore each metric, considering the best methods for collecting, analyzing, and presenting the data, and provide step-by-step guidance for measuring the usability of any type of product using any type of technology. This book is recommended for usability professionals, developers, programmers, information architects, interaction designers, market researchers, and students in an HCI or HFE program.

• Presents criteria for selecting the most appropriate metric for every case
• Takes a product and technology neutral approach
• Presents in-depth case studies to show how organizations have successfully used the metrics and the information they revealed

Tom Tullis is Vice President of Usability and User Insight at Fidelity Investments and Adjunct Professor at Bentley University in the Human Factors in Information Design program. He joined Fidelity in 1993 and was instrumental in the development of the company's usability department, including a state-of-the-art Usability Lab. Prior to joining Fidelity, he held positions at Canon Information Systems, McDonnell Douglas, Unisys Corporation, and Bell Laboratories. He and Fidelity's usability team have been featured in a number of publications, including Newsweek, Business 2.0, Money, The Boston Globe, The Wall Street Journal, and The New York Times.


Further Information & Material


Front Cover ... 1
Measuring the User Experience ... 4
Copyright Page ... 5
Table of Contents ... 8
Preface ... 16
Acknowledgments ... 18
CHAPTER 1 Introduction ... 20
  1.1 Organization of This Book ... 21
  1.2 What Is Usability? ... 23
  1.3 Why Does Usability Matter? ... 24
  1.4 What Are Usability Metrics? ... 26
  1.5 The Value of Usability Metrics ... 27
  1.6 Ten Common Myths about Usability Metrics ... 29
CHAPTER 2 Background ... 34
  2.1 Designing a Usability Study ... 34
    2.1.1 Selecting Participants ... 35
    2.1.2 Sample Size ... 36
    2.1.3 Within-Subjects or Between-Subjects Study ... 37
    2.1.4 Counterbalancing ... 38
    2.1.5 Independent and Dependent Variables ... 39
  2.2 Types of Data ... 39
    2.2.1 Nominal Data ... 39
    2.2.2 Ordinal Data ... 40
    2.2.3 Interval Data ... 41
    2.2.4 Ratio Data ... 42
  2.3 Metrics and Data ... 42
  2.4 Descriptive Statistics ... 43
    2.4.1 Measures of Central Tendency ... 44
    2.4.2 Measures of Variability ... 45
    2.4.3 Confidence Intervals ... 46
  2.5 Comparing Means ... 47
    2.5.1 Independent Samples ... 47
    2.5.2 Paired Samples ... 48
    2.5.3 Comparing More Than Two Samples ... 49
  2.6 Relationships between Variables ... 50
    2.6.1 Correlations ... 51
  2.7 Nonparametric Tests ... 52
    2.7.1 The Chi-Square Test ... 52
  2.8 Presenting Your Data Graphically ... 54
    2.8.1 Column or Bar Graphs ... 55
    2.8.2 Line Graphs ... 57
    2.8.3 Scatterplots ... 59
    2.8.4 Pie Charts ... 61
    2.8.5 Stacked Bar Graphs ... 61
  2.9 Summary ... 63
CHAPTER 3 Planning a Usability Study ... 64
  3.1 Study Goals ... 64
    3.1.1 Formative Usability ... 64
    3.1.2 Summative Usability ... 65
  3.2 User Goals ... 66
    3.2.1 Performance ... 66
    3.2.2 Satisfaction ... 66
  3.3 Choosing the Right Metrics: Ten Types of Usability Studies ... 67
    3.3.1 Completing a Transaction ... 67
    3.3.2 Comparing Products ... 69
    3.3.3 Evaluating Frequent Use of the Same Product ... 69
    3.3.4 Evaluating Navigation and/or Information Architecture ... 70
    3.3.5 Increasing Awareness ... 71
    3.3.6 Problem Discovery ... 71
    3.3.7 Maximizing Usability for a Critical Product ... 72
    3.3.8 Creating an Overall Positive User Experience ... 73
    3.3.9 Evaluating the Impact of Subtle Changes ... 73
    3.3.10 Comparing Alternative Designs ... 74
  3.4 Other Study Details ... 74
    3.4.1 Budgets and Timelines ... 74
    3.4.2 Evaluation Methods ... 76
    3.4.3 Participants ... 77
    3.4.4 Data Collection ... 78
    3.4.5 Data Cleanup ... 79
  3.5 Summary ... 80
CHAPTER 4 Performance Metrics ... 82
  4.1 Task Success ... 83
    4.1.1 Collecting Any Type of Success Metric ... 84
    4.1.2 Binary Success ... 85
    4.1.3 Levels of Success ... 88
    4.1.4 Issues in Measuring Success ... 92
  4.2 Time-on-Task ... 93
    4.2.1 Importance of Measuring Time-on-Task ... 93
    4.2.2 How to Collect and Measure Time-on-Task ... 93
    4.2.3 Analyzing and Presenting Time-on-Task Data ... 96
    4.2.4 Issues to Consider When Using Time Data ... 98
  4.3 Errors ... 100
    4.3.1 When to Measure Errors ... 100
    4.3.2 What Constitutes an Error? ... 101
    4.3.3 Collecting and Measuring Errors ... 102
    4.3.4 Analyzing and Presenting Errors ... 103
    4.3.5 Issues to Consider When Using Error Metrics ... 105
  4.4 Efficiency ... 106
    4.4.1 Collecting and Measuring Efficiency ... 106
    4.4.2 Analyzing and Presenting Efficiency Data ... 107
    4.4.3 Efficiency as a Combination of Task Success and Time ... 109
  4.5 Learnability ... 111
    4.5.1 Collecting and Measuring Learnability Data ... 112
    4.5.2 Analyzing and Presenting Learnability Data ... 113
    4.5.3 Issues to Consider When Measuring Learnability ... 115
  4.6 Summary ... 116
CHAPTER 5 Issues-Based Metrics ... 118
  5.1 Identifying Usability Issues ... 118
  5.2 What Is a Usability Issue? ... 119
    5.2.1 Real Issues versus False Issues ... 120
  5.3 How to Identify an Issue ... 121
    5.3.1 In-Person Studies ... 122
    5.3.2 Automated Studies ... 122
    5.3.3 When Issues Begin and End ... 122
    5.3.4 Granularity ... 123
    5.3.5 Multiple Observers ... 123
  5.4 Severity Ratings ... 124
    5.4.1 Severity Ratings Based on the User Experience ... 124
    5.4.2 Severity Ratings Based on a Combination of Factors ... 125
    5.4.3 Using a Severity Rating System ... 126
    5.4.4 Some Caveats about Severity Ratings ... 127
  5.5 Analyzing and Reporting Metrics for Usability Issues ... 127
    5.5.1 Frequency of Unique Issues ... 128
    5.5.2 Frequency of Issues per Participant ... 130
    5.5.3 Frequency of Participants ... 130
    5.5.4 Issues by Category ... 131
    5.5.5 Issues by Task ... 132
    5.5.6 Reporting Positive Issues ... 133
  5.6 Consistency in Identifying Usability Issues ... 133
  5.7 Bias in Identifying Usability Issues ... 135
  5.8 Number of Participants ... 136
    5.8.1 Five Participants Is Enough ... 137
    5.8.2 Five Participants Is Not Enough ... 138
    5.8.3 Our Recommendation ... 138
  5.9 Summary ... 140
CHAPTER 6 Self-Reported Metrics ... 142
  6.1 Importance of Self-Reported Data ... 142
  6.2 Collecting Self-Reported Data ... 143
    6.2.1 Likert Scales ... 143
    6.2.2 Semantic Differential Scales ... 144
    6.2.3 When to Collect Self-Reported Data ... 144
    6.2.4 How to Collect Self-Reported Data ... 145
    6.2.5 Biases in Collecting Self-Reported Data ... 145
    6.2.6 General Guidelines for Rating Scales ... 146
    6.2.7 Analyzing Self-Reported Data ... 146
  6.3 Post-Task Ratings ... 147
    6.3.1 Ease of Use ... 147
    6.3.2 After-Scenario Questionnaire ... 148
    6.3.3 Expectation Measure ... 148
    6.3.4 Usability Magnitude Estimation ... 151
    6.3.5 Comparison of Post-Task Self-Reported Metrics ... 152
  6.4 Post-Session Ratings ... 154
    6.4.1 Aggregating Individual Task Ratings ... 156
    6.4.2 System Usability Scale ... 157
    6.4.3 Computer System Usability Questionnaire ... 158
    6.4.4 Questionnaire for User Interface Satisfaction ... 158
    6.4.5 Usefulness, Satisfaction, and Ease of Use Questionnaire ... 161
    6.4.6 Product Reaction Cards ... 161
    6.4.7 Comparison of Post-Session Self-Reported Metrics ... 163
  6.5 Using SUS to Compare Designs ... 166
    6.5.1 Comparison of "Senior-Friendly" Websites ... 166
    6.5.2 Comparison of Windows ME and Windows XP ... 166
    6.5.3 Comparison of Paper Ballots ... 167
  6.6 Online Services ... 169
    6.6.1 Website Analysis and Measurement Inventory ... 169
    6.6.2 American Customer Satisfaction Index ... 170
    6.6.3 OpinionLab ... 172
    6.6.4 Issues with Live-Site Surveys ... 176
  6.7 Other Types of Self-Reported Metrics ... 177
    6.7.1 Assessing Specific Attributes ... 177
    6.7.2 Assessing Specific Elements ... 180
    6.7.3 Open-Ended Questions ... 181
    6.7.4 Awareness and Comprehension ... 182
    6.7.5 Awareness and Usefulness Gaps ... 184
  6.8 Summary ... 185
CHAPTER 7 Behavioral and Physiological Metrics ... 186
  7.1 Observing and Coding Overt Behaviors ... 186
    7.1.1 Verbal Behaviors ... 187
    7.1.2 Nonverbal Behaviors ... 188
  7.2 Behaviors Requiring Equipment to Capture ... 190
    7.2.1 Facial Expressions ... 190
    7.2.2 Eye-Tracking ... 194
    7.2.3 Pupillary Response ... 199
    7.2.4 Skin Conductance and Heart Rate ... 202
    7.2.5 Other Measures ... 205
  7.3 Summary ... 207
CHAPTER 8 Combined and Comparative Metrics ... 210
  8.1 Single Usability Scores ... 210
    8.1.1 Combining Metrics Based on Target Goals ... 211
    8.1.2 Combining Metrics Based on Percentages ... 212
    8.1.3 Combining Metrics Based on z-Scores ... 217
    8.1.4 Using SUM: Single Usability Metric ... 221
  8.2 Usability Scorecards ... 222
  8.3 Comparison to Goals and Expert Performance ... 225
    8.3.1 Comparison to Goals ... 225
    8.3.2 Comparison to Expert Performance ... 227
  8.4 Summary ... 229
CHAPTER 9 Special Topics ... 230
  9.1 Live Website Data ... 230
    9.1.1 Server Logs ... 230
    9.1.2 Click-Through Rates ... 232
    9.1.3 Drop-Off Rates ... 234
    9.1.4 A/B Studies ... 235
  9.2 Card-Sorting Data ... 236
    9.2.1 Analyses of Open Card-Sort Data ... 237
    9.2.2 Analyses of Closed Card-Sort Data ... 244
  9.3 Accessibility Data ... 246
  9.4 Return-on-Investment Data ... 250
  9.5 Six Sigma ... 253
  9.6 Summary ... 255
CHAPTER 10 Case Studies ... 256
  10.1 Redesigning a Website Cheaply and Quickly ... 256
    10.1.1 Phase 1: Testing Competitor Websites ... 256
    10.1.2 Phase 2: Testing Three Different Design Concepts ... 258
    10.1.3 Phase 3: Testing a Single Design ... 262
    10.1.4 Conclusion ... 263
    10.1.5 Biography ... 263
  10.2 Usability Evaluation of a Speech Recognition IVR ... 263
    10.2.1 Method ... 263
    10.2.2 Results: Task-Level Measurements ... 264
    10.2.3 PSSUQ ... 265
    10.2.4 Participant Comments ... 265
    10.2.5 Usability Problems ... 266
    10.2.6 Adequacy of Sample Size ... 266
    10.2.7 Recommendations Based on Participant Behaviors and Comments ... 269
    10.2.8 Discussion ... 270
    10.2.9 Biography ... 270
    10.2.10 References ... 271
  10.3 Redesign of the CDC.gov Website ... 271
    10.3.1 Usability Testing Levels ... 272
    10.3.2 Baseline Test ... 272
    10.3.3 Task Scenarios ... 273
    10.3.4 Qualitative Findings ... 274
    10.3.5 Wireframing and FirstClick Testing ... 275
    10.3.6 Final Prototype Testing (Prelaunch Test) ... 277
    10.3.7 Conclusions ... 280
    10.3.8 Biographies ... 281
    10.3.9 References ... 281
  10.4 Usability Benchmarking: Mobile Music and Video ... 282
    10.4.1 Project Goals and Methods ... 282
    10.4.2 Qualitative and Quantitative Data ... 282
    10.4.3 Research Domain ... 282
    10.4.4 Comparative Analysis ... 283
    10.4.5 Study Operations: Number of Respondents ... 283
    10.4.6 Respondent Recruiting ... 284
    10.4.7 Data Collection ... 284
    10.4.8 Time to Complete ... 285
    10.4.9 Success or Failure ... 285
    10.4.10 Number of Attempts ... 285
    10.4.11 Perception Metrics ... 285
    10.4.12 Qualitative Findings ... 286
    10.4.13 Quantitative Findings ... 286
    10.4.14 Summary Findings and SUM Metrics ... 286
    10.4.15 Data Manipulation and Visualization ... 286
    10.4.16 Discussion ... 288
    10.4.17 Benchmark Changes and Future Work ... 289
    10.4.18 Biographies ... 289
    10.4.19 References ... 289
  10.5 Measuring the Effects of Drug Label Design and Similarity on Pharmacists' Performance ... 290
    10.5.1 Participants ... 291
    10.5.2 Apparatus ... 291
    10.5.3 Stimuli ... 291
    10.5.4 Procedure ... 294
    10.5.5 Analysis ... 295
    10.5.6 Results and Discussion ... 296
    10.5.7 Biography ... 298
    10.5.8 References ... 298
  10.6 Making Metrics Matter ... 299
    10.6.1 OneStart: Indiana University's Enterprise Portal Project ... 299
    10.6.2 Designing and Conducting the Study ... 300
    10.6.3 Analyzing and Interpreting the Results ... 301
    10.6.4 Sharing the Findings and Recommendations ... 302
    10.6.5 Reflecting on the Impact ... 305
    10.6.6 Conclusion ... 306
    10.6.7 Acknowledgment ... 306
    10.6.8 Biography ... 306
    10.6.9 References ... 306
CHAPTER 11 Moving Forward ... 308
  11.1 Sell Usability and the Power of Metrics ... 308
  11.2 Start Small and Work Your Way Up ... 309
  11.3 Make Sure You Have the Time and Money ... 310
  11.4 Plan Early and Often ... 311
  11.5 Benchmark Your Products ... 312
  11.6 Explore Your Data ... 313
  11.7 Speak the Language of Business ... 314
  11.8 Show Your Confidence ... 314
  11.9 Don't Misuse Metrics ... 315
  11.10 Simplify Your Presentation ... 316
References ... 318
Index ... 326


