Stephens / Rosenberg | Design Driven Testing | E-Book | www2.sack.de
E-book, English, 368 pages

Stephens / Rosenberg Design Driven Testing

Test Smarter, Not Harder
1st edition
ISBN: 978-1-4302-2944-5
Publisher: Apress
Format: PDF
Copy protection: PDF watermark


The groundbreaking book Design Driven Testing brings sanity back to the software development process by inverting the concept of Test Driven Development (TDD): it restores the practice of using tests to verify a design, instead of pretending that unit tests are a replacement for design. Anyone who feels that TDD is 'Too Damn Difficult' will appreciate this book. Design Driven Testing shows that, by combining a forward-thinking development process with cutting-edge automation, testing can be a finely targeted, business-driven, rewarding effort. In other words, you'll learn how to test smarter, not harder. The book: applies a feedback-driven approach to each stage of the project lifecycle; illustrates a lightweight and effective approach using a core subset of UML; follows a real-life example project using Java and Flex/ActionScript; and presents bonus chapters for advanced DDTers, covering unit-test antipatterns (and their opposite, 'test-conscious' design patterns) and showing how to create your own test transformation templates in Enterprise Architect.

Matt Stephens is a Java developer, project leader, and technical architect with a financial organization based in central London. He's been developing software commercially for over 15 years, and has led many agile projects through successive customer releases. He has spoken at a number of software conferences on object-oriented development topics, and his writing appears regularly in a variety of software journals and websites, including The Register and ObjectiveView. Matt is the co-author of Extreme Programming Refactored: The Case Against XP (Apress, 2003) with Doug Rosenberg, Agile Development with ICONIX Process (Apress, 2005) with Doug Rosenberg and Mark Collins-Cope, and Use Case Driven Object Modeling with UML: Theory and Practice with Doug Rosenberg (Apress, 2007). Catch Matt online at www.softwarereality.com.


Further Information & Material


Title Page
Copyright Page
Contents at a Glance
Table of Contents
Foreword
About the Authors
About the Technical Reviewers
Acknowledgments
Prologue

Part 1: DDT vs. TDD

  Chapter 1: Somebody Has It Backwards
    Problems DDT Sets Out to Solve
      Knowing When You're Done Is Hard
      Leaving Testing Until Later Costs More
      Testing Badly Designed Code Is Hard
      It's Easy to Forget Customer-Level Tests
      Developers Become Complacent
      Tests Sometimes Lack Purpose
    A Quick, Tools-Agnostic Overview of DDT
      Structure of DDT
      DDT in Action
    How TDD and DDT Differ
    Example Project: Introducing the Mapplet 2.0
    Summary

  Chapter 2: TDD Using Hello World
    Top Ten Characteristics of TDD
      10. Tests Drive the Design
      9. There Is a Total Dearth of Documentation
      8. Everything Is a Unit Test
      7. TDD Tests Are Not Quite Unit Tests (or Are They?)
      6. Acceptance Tests Provide Feedback Against the Requirements
      5. TDD Lends Confidence to Make Changes
      4. Design Emerges Incrementally
      3. Some Up-Front Design Is OK
      2. TDD Produces a Lot of Tests
      1. TDD Is Too Damn Difficult
    Login Implemented Using TDD
      Understand the Requirement
      Think About the Design
      Write the First Test-First Test First
      Write the Login Check Code to Make the Test Pass
      Create a Mock Object
      Refactor the Code to See the Design Emerge
    Acceptance Testing with TDD
    Conclusion: TDD = Too Damn Difficult
    Summary

  Chapter 3: "Hello World!" Using DDT
    Top Ten Features of ICONIX/DDT
      10. DDT Includes Business Requirement Tests
      9. DDT Includes Scenario Tests
      8. Tests Are Driven from Design
      7. DDT Includes Controller Tests
      6. DDT Tests Smarter, Not Harder
      5. DDT Unit Tests Are "Classical" Unit Tests
      4. DDT Test Cases Can Be Transformed into Test Code
      3. DDT Test Cases Lead to Test Plans
      2. DDT Tests Are Useful to Developers and QA Teams
      1. DDT Can Eliminate Redundant Effort
    Login Implemented Using DDT
      Step 1: Create a Robustness Diagram
      Step 2: Create Controller Test Cases
      Step 3: Add Scenarios
      Step 4: Transform Controller Test Cases into Classes
      Step 5: Generate Controller Test Code
      Step 6: Draw a Sequence Diagram
      Step 7: Create Unit Test Cases
      Step 8: Fill in the Test Code
    Summary

Part 2: DDT in the Real World: Mapplet 2.0 Travel Web Site

  Chapter 4: Introducing the Mapplet Project
    Top Ten ICONIX Process/DDT Best Practices
    10. Create an Architecture
    9. Agree on Requirements, and Test Against Them
    8. Drive Your Design from the Problem Domain
    7. Write Use Cases Against UI Storyboards
    6. Write Scenario Tests to Verify That the Use Cases Work
    5. Test Against Conceptual and Detailed Designs
    4. Update the Model Regularly
    3. Keep Test Scripts In Sync with Requirements
    2. Keep Automated Tests Up to Date
    1. Compare the Release Candidate with Original Use Cases
    Summary

  Chapter 5: Detailed Design and Unit Testing
    Top Ten Unit Testing "To-Do"s
      10. Start with a Sequence Diagram
      9. Identify Test Cases from Your Design
      8. Write Scenarios for Each Test Case
      7. Test Smarter: Avoid Overlapping Tests
      6. Transform Your Test Cases into UML Classes
      5. Write Unit Tests and Accompanying Code
        Writing the "No Hotels" Test
        Implementing SearchHotelService
      4. Write White Box Unit Tests
        Implement a Stunt Service
        Update the Test Code to Use the Stunt Service
      3. Use a Mock Object Framework
        The Stunt Service Approach
        The Mock Object Framework Approach
      2. Test Algorithmic Logic with Unit Tests
      1. Write a Separate Suite of Integration Tests
    Summary

  Chapter 6: Conceptual Design and Controller Testing
    Top Ten Controller Testing "To-Do" List
      10. Start with a Robustness Diagram
        The Use Case
        Conceptual Design from Which to Drive Controller Tests
      9. Identify Test Cases from Your Controllers
      8. Define One or More Scenarios per Test Case
        Understanding Test Scenarios
        Identifying the Input Values for a Test Scenario
        Using EA to Create Test Scenarios
      7. Fill in Description, Input, and Acceptance Criteria
      6. Generate Test Classes
        Before Generating Your Tests
        Generating the Tests
      5. Implement the Tests
      4. Write Code That's Easy to Test
      3. Write "Gray Box" Controller Tests
      2. String Controller Tests Together
      1. Write a Separate Suite of Integration Tests
    Summary

  Chapter 7: Acceptance Testing: Expanding Use Case Scenarios
    Top Ten Scenario Testing "To-Do" List
    Mapplet Use Cases
      10. Start with a Narrative Use Case
      9. Transform to a Structured Scenario
      8. Make Sure All Paths Have Steps
      7. Add Pre-conditions and Post-conditions
      6. Generate an Activity Diagram
      5. Expand "Threads" Using "Create External Tests"
      4. Put the Test Case on a Test Case Diagram
      3. Drill into the EA Testing View
      2. Add Detail to the Test Scenarios
      1. Generate a Test Plan Document
    And the Moral of the Story Is . . .
    Summary

  Chapter 8: Acceptance Testing: Business Requirements
    Top Ten Requirements Testing "To-Do" List
      10. Start with a Domain Model
      9. Write Business Requirement Tests
      8. Model and Organize Requirements
      7. Create Test Cases from Requirements
      6. Review Your Plan with the Customer
      5. Write Manual Test Scripts
      4. Write Automated Requirements Tests
      3. Export the Test Cases
      2. Make the Test Cases Visible
      1. Involve Your Team!
    Summary

Part 3: Advanced DDT

  Chapter 9: Unit Testing Antipatterns (The "Don'ts")
    The Temple of Doom (aka The Code)
      The Big Picture
      The HotelPriceCalculator Class
      Supporting Classes
      Service Classes
    The Antipatterns
      10. The Complex Constructor
      9. The Stratospheric Class Hierarchy
      8. The Static Hair-Trigger
      7. Static Methods and Variables
      6. The Singleton Design Pattern
      5. The Tightly Bound Dependency
      4. Business Logic in the UI Code
      3. Privates on Parade
      2. Service Objects That Are Declared Final
      1. Half-Baked Features from the Good Deed Coder
    Summary

  Chapter 10: Design for Easier Testing
    Top Ten "Design for Testing" To-Do List
    The Temple of Doom, Thoroughly Expurgated
      The Use Case: Figuring Out What We Want to Do
      Identify the Controller Tests
      Calculate Overall Price Test
      Retrieve Latest Price Test
    Design for Easier Testing
      10. Keep Initialization Code Out of the Constructor
      9. Use Inheritance Sparingly
      8. Avoid Using Static Initializer Blocks
      7. Use Object-Level Methods and Variables
      6. Avoid the Singleton Design Pattern
      5. Keep Your Classes Decoupled
      4. Keep Business Logic Out of the UI Code
      3. Use Black Box and Gray Box Testing
      2. Reserve the "Final" Modifier for Constants; Generally Avoid Marking Complex Types Such as Service Objects as Final
      1. Stick to the Use Cases and the Design
    Detailed Design for the Quote Hotel Price Use Case
      Controller Test: Calculate Overall Price
      Controller Test: Retrieve Latest Price
      The Rebooted Design and Code
    Summary

  Chapter 11: Automated Integration Testing
    Top Ten Integration Testing "To-Do" List
    10. Look for Test Patterns in Your Conceptual Design
    9. Don't Forget Security Tests
      Security Testing: SQL Injection Attacks
      Security Testing: Set Up Secure Sessions
    8. Decide the "Level" of Integration Test to Write
      How the Three Levels Differ
      Knowing Which Level of Integration Test to Write
    7. Drive Unit/Controller-Level Tests from Conceptual Design
    6. Drive Scenario Tests from Use Case Scenarios
    5. Write End-to-End Scenario Tests
      Emulating the Steps in a Scenario
      Sharing a Test Database
      Mapplet Example: The "Advanced Search" Use Case
      A Vanilla xUnit Scenario Test
    4. Use a "Business-Friendly" Testing Framework
    3. Test GUI Code as Part of Your Scenario Tests
    2. Don't Underestimate the Difficulty of Integration Testing
      Network Latency
      Database Metadata Changes
      Randomly Mutating (aka "Agile") Interfaces
      Bugs in the Remote System
      Cloudy Days
    1. Don't Underestimate the Value of Integration Tests
    Key Points When Writing Integration Tests
    Summary

  Chapter 12: Unit Testing Algorithms
    Top Ten Algorithm Testing "To-Do"s
      10. Start with a Controller from the Conceptual Design
      9. Expand the Controllers into an Algorithm Design
      8. Tie the Diagram Loosely to Your Domain Model
      7. Split Up Decision Nodes Involving More Than One Check
      6. Create a Test Case for Each Node
      5. Define Test Scenarios for Each Test Case
      4. Create Input Data from a Variety of Sources
      3. Assign the Logic Flow to Individual Methods and Classes
      2. Write "White Box" Unit Tests
        Testing the "At Least One Candidate Returned" Decision Node
        Testing the "Exactly One Candidate or One Is a 100% Match" Decision Node
        Send in the Spy Object
        Break the Code into Smaller Methods
      1. Apply DDT to Other Design Diagrams
    Summary

Appendix: Alice in Use-Case Land
  Introduction
  Part 1
    Alice Falls Asleep While Reading
    The Promise of Use Case Driven Development
    An Analysis Model Links Use-Case Text with Objects
    Simple and Straightforward
    <> or <>
    We're Late! We Have to Start Coding!
    Alice Wonders How to Get from Use Cases to Code
    Abstract... Essential
    A Little Too Abstract?
    Teleocentricity...
    Are We Really Supposed to Specify All This for Every Use Case?
  Part 2
    Alice Gets Thirsty
    Alice Feels Faint
    Imagine... (with Apologies to John Lennon)
    Pair Programming Means Never Writing Down Requirements
    There's No Time to Write Down Requirements
    You Might As Well Say, "The Code Is the Design"
    Who Cares for Use Cases?
    C3 Project Terminated
    OnceAndOnlyOnce?
    Alice Refuses to Start Coding Without Written Requirements
    You Are Guilty of BDUF...
    CMM's Dead! Off with Her Head!
    Some Serious Refactoring of the Design
  Part 3
    Alice Wakes Up
    Closing the Gap Between "What" and "How"
    Static and Dynamic Models Are Linked Together
    Behavior Allocation Happens on Sequence Diagrams
    And the Moral of That Is...

Epilogue: 'Twas Brillig and the Slithy Tests...
Index


