England | Whole Body Interaction | E-Book | www2.sack.de

E-book, English, 212 pages

Series: Human-Computer Interaction Series

England, Whole Body Interaction

1st edition, 2011
ISBN: 978-0-85729-433-3
Publisher: Springer
Format: PDF
Copy protection: PDF watermark




Whole Body Interaction is "the integrated capture and processing of human signals from physical, physiological, cognitive and emotional sources to generate feedback to those sources for interaction in a digital environment" (England 2009). This book examines the challenges of Whole Body Interaction from the perspectives of design, engineering and research methods. How do we draw on physical motion, cognition, physiology, emotion and social context to push the boundaries of Human-Computer Interaction towards the complete set of human capabilities? Through a range of applications, the authors attempt to answer this question and set a research agenda for future work. The book is aimed at students and researchers looking for new project ideas or seeking to extend their existing work with new dimensions of interaction.

Contents


Whole Body Interaction  4
Preface  6
Acknowledgements  8
Contents  10
Contributors  12
Chapter 1: Whole Body Interaction: An Introduction  18
  Development of Technology  19
  Challenges for Whole Body Interaction  20
    Design for Whole Body Interaction  20
    Engineering of Interaction  21
    Research Philosophy  21
  References  22
Chapter 2: Springboard: Designing Image Schema Based Embodied Interaction for an Abstract Domain  23
  Introduction  23
  Theoretical Underpinning  24
    Embodied Metaphor Theory  24
    The Meanings of Balance  25
  Design Rationale and System Implementation  26
    An Abstract Domain: Social Justice  26
    Design Goals  27
    The Springboard Whole Body Interactive Environment  28
  User Study Methodology  30
    Study Details  30
    Qualitative Data Collection and Analysis  31
  Results and Discussion  31
    General Attitude to Springboard  31
    Enactment Versus Verbalization of Image Schemas  32
    Image Schemas in Mental Models of Interaction  32
    Image Schemas in Interpretation  33
    Spatial Versus Body-Centric Balance Image Schemas  33
  Conclusions  34
  References  34
Chapter 3: Whole Body Interaction in Abstract Domains  35
  Introduction  35
  Conceptual Metaphor and Embodied Experience  36
    A Complex Abstract Domain: Tonal Harmony  38
      Sensory Motor Contingency Theory  39
      Lessons from Dalcroze Eurhythmics  40
  Harmony Space – A System for Exploring Tonal Harmony  41
    System Details  41
    Conceptual Metaphors and Blends in Harmony Space  41
  Whole Body Interaction vs. Desktop  45
    Asymmetrical Collaboration  45
    Empirical Observations  46
  Conclusions  48
  References  49
Chapter 4: Mirrored Motion: Augmenting Reality and Implementing Whole Body Gestural Control Using Pervasive Body Motion Capture Based on Wireless Sensors  51
  Introduction to Motion Capture  51
  Background to Motion Capture  52
  System Design  53
    Connectivity and Power  54
    Wearability and Performance  54
    The Sensor Network  55
  Whole Body HCI  57
    Introduction to an Augmented Reality Application  58
    Background to Augmented Reality  59
    System Design  59
    Gesture Recognition  60
    Extending Reality Using Whole Body Interaction  63
  Conclusions and Further Work  64
  References  65
Chapter 5: Sharing and Stretching Space with Full Body Tracking  67
  Introduction  67
  Previous Work  69
  Sharing Space with Virtual Humans  70
    Avatars and Self-Representation  71
    Puppeteering  71
    Virtual Human Presentation  72
  Stretching Space in Virtual Environments  74
  Conclusion  77
  References  77
Chapter 6: Waggling the Form Baton: Analyzing Body-Movement-Based Design Patterns in Nintendo Wii Games, Toward Innovation of New Possibilities for Social and Emotional Experience  79
  Introduction  79
  Research Strategy  80
  Analysis  82
    Kinesthetic Mimicry  82
    Tensions Between Precision and Loose Movement Style in Design  84
    Piecemeal Versus Full Body Motion  85
    Laban Effort Dimensions and Engagement  86
    Social Interaction  87
    Movement and Its Relation to Story and Game World  87
  Conclusions  88
  References  88
Chapter 7: Exploring Bodily Engaging Artifacts Among Golfers, Skaters and Dancers  90
  Introduction  90
  Studies of Golfers, Skateboarders and Body Buggers  91
    Golfers and Skateboarders on Body, Artifact and Movement  92
    The Role of the Artifact in Golf  92
    Golfer Talk on Experience and Body  93
    Skateboarders on Body and Movement  94
    Describing the Movements of Friends  95
    Experiencing Movement with the BodyBug  95
    The Role of the Artifact  97
  Discussion  98
  Reflections  99
  Current and Future Work  100
  References  100
Chapter 8: Whole Body Large Display Interfaces for Users and Designers  101
  Why Whole Body Interaction  101
  Related Work  103
    Large Surfaces and Large Displays  103
    Body-Centered Interaction  104
  How Can Body-Centered Interaction Help Users?  104
    Interaction Spaces  105
    Social Conventions  106
    Example Interaction Techniques  106
  How Can Body-Centered Interaction Help Designers?  107
    From Device-Centered to Body-Centered  107
    Body-Centered APIs  108
  A Body-Centric Application Case Study  109
    Application Universality  110
    Event Management  111
    User Observations  112
  Conclusions  112
  References  113
Chapter 9: Observations on Experience and Flow in Movement-Based Interaction  115
  Introduction  115
    Whole Body Movement Interfaces  116
    About This Chapter  117
  Exertion and Entertainment Interfaces  118
  Intelligent Exertion Interfaces  120
    More Advanced Sensing of User and Activities  121
    Flow and Immersion in Games  122
    Flow and Immersion in Exertion Interfaces  125
  Joint and Coordinated Activities in Exertion Interaction  128
  Conclusions  130
  References  130
Chapter 10: Capacitive Sensors for Whole Body Interaction  134
  Introduction  134
  Capacitive Sensing  135
  Related Work  136
  A Simple Capacitive Sensor  137
  Signal Processing  138
  Game Controllers Built from Capacitive Sensors  140
  Conclusion  144
  References  145
Chapter 11: Towards a Whole Body Sensing Platform for Healthcare Applications  147
  Introduction  148
  Networking People and Everyday Objects  148
    Body Sensors  149
    Smart Environments  149
  Measuring Behaviour  150
    Classification of Behaviour  150
  Whole Body Sensing Platform  151
    Physiological, Inertial and Environmental Sensing Service  152
  Case Studies  153
    Physiotherapy  154
    Arthritis and Progressive Deterioration  156
    Fine Measurement Sensing for Progressive Deterioration  157
  Conclusions and Future Work  159
  References  160
Chapter 12: Feasibility of Using a Head-Mounted Camera to Capture Dynamic Facial Expressions During Body Movement  162
  Introduction  162
  Part 1: Emotion Elicitation  164
    Participants  164
    Head-Mounted Camera  164
    Procedure  164
    Measures  165
    Data Analysis  165
    Results  166
      Proportions of Felt Trials  166
      Intensities of Felt Emotions  166
  Part 2: Emotion Recognition  166
    Participants  167
    Video Clips  167
    Procedure  167
    Measures  167
    Data Analysis  167
    Results  168
      Recognition  168
      Additional Measures  168
  Part 3: Effort-Shape Analysis  168
    Participants  168
    Measures  168
    Data Analysis  169
    Results  169
      Effect of HM  169
      Additional Measures  169
  Part 4: Kinematic Analysis  170
    Kinematic Data  170
    Measures  170
    Data Analysis  170
    Results  171
      Gait Cycle Descriptors  171
      Joint Angular Kinematics  171
  Discussion  171
  Conclusions  172
  References  172
Chapter 13: Body Gestures for Office Desk Scenarios  174
  Introduction  174
  Understanding Current Interaction Practices at the Office Desk: Ergonomics and Input Devices vs. Natural Interaction  175
    Desk Ergonomics  175
    Gestures vs. Mouse and Keyboard  176
    Gestures as Commands  177
    Computer Vision at the Desk  177
    Capturing Hands and Head Movements  178
  Applying Natural Interaction to Practical Scenarios: Interacting with Gestures from the Desk  179
  Conclusions  180
  References  181
Chapter 14: Gesture-Based Interfaces: Practical Applications of Gestures in Real World Mobile Settings  184
  Introduction  184
  Understanding Gestures  185
  Gestures and Social Acceptability  186
    Social Acceptability and Gesture-Based Interfaces  187
      Performance  188
      User Experience and Spectators  188
      Technology and Social Acceptability  189
      Social Acceptability  189
  Body-Based Gesturing  189
    Testing Body-Based Discrete Action Event Control  191
    Testing Body-Based Continuous Control  192
      Wrist Rotation  192
      Head Tilt  193
  Conclusions  195
  References  196
Chapter 15: Estimation of Interest from Physical Actions Captured by Familiar User Device  198
  Introduction  198
  Previous Work  199
    How to Capture User Actions  199
  Proposed System  200
    Detection of the Face Approaches  201
  Pilot Study and Preliminary Results  201
    Participants  202
    Procedure  202
    Preliminary Results  203
  Conclusion  203
  References  205
Chapter 16: Towards a Framework for Whole Body Interaction with Geospatial Data  207
  Introduction and Motivation  207
  Related Work  208
  Multi-Touch Interaction  209
    Physical Multi-Touch Interactions  210
    Interaction Primitives and Interaction Space  211
    Framework for Multi-Touch Interaction with Geospatial Data  211
  Framework Extension for Multi-Modal Interaction with Geospatial Data  213
    Multi-Touch and Foot Interaction  213
    Gaze Interaction  213
    Extended Framework for Foot and Eye Gaze Interaction  214
  Implementation  214
  Conclusion and Future Work  215
  References  216
Index  218


