Usable Cyber Trust Indicators

  • Lorrie Faith Cranor | Carnegie Mellon University

When systems rely on a “human in the loop” to carry out a security-critical function, cyber trust indicators are often employed to communicate when and how to perform that function. Cyber trust indicators typically serve as warnings or status indicators that communicate information, remind users of information previously communicated, and influence user behavior. They include a variety of security- and privacy-related symbols in the operating system status bar or browser chrome, pop-up alerts, security control panels, and symbols embedded in web content. However, a growing body of literature has found the effectiveness of many of these indicators to be rather disappointing. It is becoming increasingly apparent that humans are a major cause of computer security failures, and that security warnings and other cyber trust indicators are doing little to prevent humans from making security errors. Indeed, engineers often fail to consider human behavior as they design secure systems. In some cases, it may be possible to redesign systems to minimize the need for humans to perform security-critical functions, thus reducing or eliminating the need for security warnings. In many cases, however, it may be too expensive or difficult to automate security-critical tasks, and systems may need to rely on human judgment. In these cases, it is important to situate cyber trust indicators both spatially and temporally to maximize their effectiveness, and to design them to communicate clearly to users. Our research has studied the effectiveness of cyber trust indicators and developed approaches to making these indicators more effective and usable. I will discuss our overall approach and describe our efforts to design and evaluate a better SSL certificate warning and a “nutrition label” for privacy.

Speaker Details

Lorrie Faith Cranor is an Associate Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University, where she is director of the CyLab Usable Privacy and Security Laboratory (CUPS). She is also Chief Scientist of Wombat Security Technologies, Inc. She has authored over 80 research papers on online privacy, phishing and semantic attacks, spam, electronic voting, anonymous publishing, usable access control, and other topics. She has played a key role in building the usable privacy and security research community, having co-edited the seminal book Security and Usability (O’Reilly 2005) and founded the Symposium On Usable Privacy and Security (SOUPS). She also chaired the Platform for Privacy Preferences Project (P3P) Specification Working Group at the W3C and authored the book Web Privacy with P3P (O’Reilly 2002). She has served on a number of boards, including the Electronic Frontier Foundation Board of Directors, and on the editorial boards of several journals. In 2003 she was named one of the top 100 innovators 35 or younger by Technology Review magazine. She was previously a researcher at AT&T Labs-Research and taught in the Stern School of Business at New York University.
