Can computers change what you think and do? Can they motivate you to stop smoking, persuade you to buy insurance, or convince you to join the Army?
"Yes, they can," says Dr. B. J. Fogg, director of the Persuasive Technology Lab at Stanford University. Fogg has coined the phrase "Captology"(an acronym for computers as persuasive technologies) to capture the domain of research, design, and applications of persuasive computers. In this thought-provoking book, based on nine years of research in captology, Dr. Fogg reveals how Web sites, software applications, and mobile devices can be used to change people's attitudes and behavior. Technology designers, marketers, researchers, consumers-anyone who wants to leverage or simply understand the persuasive power of interactive technology-will appreciate the compelling insights and illuminating examples found inside.
Persuasive technology can be controversial, and it should be. Who will wield this power of digital influence? And to what end? Now is the time to survey the issues and explore the principles of persuasive technology, and B. J. Fogg has written this book to be your guide.
* Filled with key term definitions in persuasive computing
* Provides frameworks for understanding this domain
* Describes real examples of persuasive technologies
Table of Contents
Cover
Foreword
Contents
Preface
Acknowledgments
Introduction: Persuasion in the Digital Age
  Persuasion on the Web
  Beyond the Web
  The Emergence of Captology
    Potential and Pitfalls
  Advantage over Traditional Media: Interactivity
  Advantages over Human Persuaders
    1. Computers Are Persistent
    2. Computers Allow Anonymity
    3. Computers Can Store, Access, and Manipulate Huge Volumes of Data
    4. Computers Can Use Many Modalities
    5. Computer Software Can Scale
    6. Computers Can Be Ubiquitous
  How to Read This Book
  Notes and References
Chapter 1 Overview of Captology
  Defining Persuasion
  Focus on the Human-Computer Relationship
  Persuasion Is Based on Intentions, Not Outcomes
  Levels of Persuasion: Macro and Micro
    Microsuasion on the Web
    Microsuasion in Video Games
  Captology: Summary of Key Terms and Concepts
  Notes and References
Chapter 2 The Functional Triad: Computers in Persuasive Roles
  The Functional Triad: Roles Computers Play
    Computers as Tools
    Computers as Media
    Computers as Social Actors
  Applying the Functional Triad to Captology
    Research and Design Applications
  Notes and References
Chapter 3 Computers as Persuasive Tools
  Seven Types of Persuasive Technology Tools
  Reduction Technology: Persuading through Simplifying
    Simplifying Political Input
  Tunneling Technology: Guided Persuasion
    Ethical Concerns
  Tailoring Technology: Persuasion through Customization
    Ethical Concerns
    Tailoring Information for Context
  Suggestion Technology: Intervening at the Right Time
    Timing Is Critical
  Self-Monitoring Technology: Taking the Tedium Out of Tracking
    Eliminating a Language Quirk
  Surveillance Technology: Persuasion through Observation
    Surveillance Must Be Overt
    Rewarding through Surveillance
    Public Compliance without Private Acceptance
  Conditioning Technology: Reinforcing Target Behaviors
    Technology Applications of Operant Conditioning
    Operant Conditioning in Computer Games
    Applying Periodic Reinforcement
    Shaping Complex Behaviors
  The Right Persuasive Tool(s) for the Job
  Notes and References
Chapter 4 Computers as Persuasive Media: Simulation
  Persuading through Computer Simulation
  Cause-and-Effect Simulations: Offering Exploration and Insight
    HIV Roulette: A Cause-and-Effect Simulator
    Rockett's New School: Learning Social Skills
    Implications of Designer Bias
  Environment Simulations: Creating Spaces for Persuasive Experiences
    LifeFitness VR Rowing Machine: Competing in a Virtual Environment
    The Tectrix VR Bike: Pedaling to Explore a Virtual Environment
    Managing Asthma in a Simulated Environment
    Using Simulation to Overcome Phobias
    In My Steps: Helping Doctors to Empathize with Cancer Patients
  Object Simulations: Providing Experiences in Everyday Contexts
    Baby Think It Over: An Infant Simulator
    Drunk Driving Simulator
  Notes and References
Chapter 5 Computers as Persuasive Social Actors
  Five Types of Social Cues
  Persuasion through Physical Cues
    The Impact of Physical Attractiveness
  Using Psychological Cues to Persuade
    The Stanford Similarity Studies
      The Personality Study
      The Affiliation Study
    Ethical and Practical Considerations
      The Oscilloscope Study
  Influencing through Language
    Persuading through Praise
  Social Dynamics
    The Reciprocity Study
  Persuading by Adopting Social Roles
    Computers in Roles of Authority
  Social Cues: Handle with Care
  Notes and References
Chapter 6 Credibility and Computers
  What Is Credibility?
    A Simple Definition
      Trustworthiness
      Expertise
      Combinations of Trustworthiness and Expertise
  When Credibility Matters in Human-Computer Interaction
    Instructing or Advising
    Reporting Measurements
    Providing Information and Analysis
    Reporting on Work Performed
    Reporting on Their Own State
    Running Simulations
    Rendering Virtual Environments
  Four Types of Credibility
    Presumed Credibility
    Surface Credibility
    Reputed Credibility
    Earned Credibility
  Dynamics of Computer Credibility
  Errors in Credibility Evaluations
  Appropriate Credibility Perceptions
  The Future of Computer Credibility
  Notes and References
Chapter 7 Credibility and the World Wide Web
  The Importance of Web Credibility
  Variability of Web Credibility
  Two Sides of Web Credibility
  The Stanford Web Credibility Studies
    A Few Words about Our Findings
    Interpreting the Data
  Trustworthiness and Expertise on the Web
    Trustworthiness and Web Credibility
      Elements that Increase Credibility: Significant Changes in 2002 Results
      Elements that Decrease Credibility: Significant Changes in 2002 Results
    Expertise and Web Site Credibility
      Elements that Increase Credibility: Significant Changes in 2002 Results
      Elements that Decrease Credibility: No Significant Changes in 2002
  The Four Types of Web Credibility
    Presumed Credibility on the Web
    Reputed Credibility on the Web
      Awards
      Seals of Approval
      Links from Credible Sources
      Word-of-Mouth Referrals
    Surface Credibility on the Web
      Design Matters
      Enhancing Surface Credibility
    Earned Credibility on the Web
      The Interaction Is Easy
      The Information Is Personalized
      The Service Is Responsive to Customer Issues
  The Web Credibility Framework
  The Web Credibility Grid
  The Future of Web Credibility Research and Design
  Notes and References
Chapter 8 Increasing Persuasion through Mobility and Connectivity
  Intervening at the Right Time and Place
    The Study Buddy
    HydroTech
  An Emerging Frontier for Persuasive Technology
  Persuasion through Mobile Technology
    Examining Mobile Health Applications
    The Kairos Factor
    The Convenience Factor
    Simplifying Mobile Devices to Increase Persuasion Power
    Wedded to Mobile Technology
      Motivating Users to Achieve Their Own Goals
      The Importance of Experience Design
  Persuasion through Connected Technology
    Leveraging Current, Contingent, and Coordinated Information
    Connected Products: Leveraging Social Influence
      Persuading through Social Facilitation
      The Power of Social Comparison
      Leveraging Conformity - and Resistance
      Applying Social Learning Theory
        Modeling Behavior at QuitNet.com
        Modeling at epinions.com
    Persuading through Intrinsic Motivation
      AlternaTV: Leveraging Group-Level Intrinsic Motivators
  The Future of Mobile and Connected Persuasive Technology
  Notes and References
Chapter 9 The Ethics of Persuasive Technology
  Is Persuasion Unethical?
  Unique Ethical Concerns Related to Persuasive Technology
    1. The Novelty of the Technology Can Mask Its Persuasive Intent
    2. Persuasive Technology Can Exploit the Positive Reputation of Computers
    3. Computers Can Be Proactively Persistent
    4. Computers Control the Interactive Possibilities
    5. Computers Can Affect Emotions But Can't Be Affected by Them
    6. Computers Cannot Shoulder Responsibility
  Intentions, Methods, and Outcomes: Three Areas Worthy of Inquiry
    Intentions: Why Was the Product Created?
    Methods of Persuasion
      Using Emotions to Persuade
      Methods That Always Are Unethical
      Methods That Raise Red Flags
        Operant Conditioning
        Surveillance
    Outcomes: Intended and Unintended
      Responsibility for Unintended Outcomes
  When Persuasion Targets Vulnerable Groups
  Stakeholder Analysis: A Methodology for Analyzing Ethics
    Step 1: List All of the Stakeholders
    Step 2: List What Each Stakeholder Has to Gain
    Step 3: List What Each Stakeholder Has to Lose
    Step 4: Evaluate Which Stakeholder Has the Most to Gain
    Step 5: Evaluate Which Stakeholder Has the Most to Lose
    Step 6: Determine Ethics by Examining Gains and Losses in Terms of Values
    Step 7: Acknowledge the Values and Assumptions You Bring to Your Analysis
  Education Is Key
  Notes and References
Chapter 10 Captology: Looking Forward
  Five Future Trends in Captology
    Trend 1: Pervasive Persuasive Technologies
    Trend 2: Growth Beyond Buying and Branding
      Healthcare
      Education
    Trend 3: Increase in Specialized Persuasive Devices
    Trend 4: Increased Focus on Influence Strategies
    Trend 5: A New Focus on Influence Tactics
  Looking Forward Responsibly
  Notes and References
Appendix: Summary of Principles
Figure Credits
Index
About the Author