Pattern 15C: Reporting inappropriate content
Problem: User feedback is needed to identify problematic or inappropriate system outputs. Solution: Implement a user-feedback mechanism at each item or instance of system output, enabling the user to flag output that is problematic, wrong, offensive, biased, or otherwise inappropriate. Leverage user feedback to identify biased, offensive, or otherwise inappropriate system outputs. Use when… How… […]
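To make the excerpt concrete, here is a minimal Python sketch of a per-output reporting mechanism in the spirit of Pattern 15C: every output item carries a flag action that records a reason and queues the item for review. The Report/ReportStore names and the reason categories are illustrative assumptions, not part of the pattern library.

```python
# Minimal sketch of Pattern 15C: attach a "report" action to every system output.
# All names (Report, ReportStore, REASONS) are illustrative, not from the pattern library.
from dataclasses import dataclass, field
from datetime import datetime, timezone

REASONS = ("wrong", "offensive", "biased", "other")

@dataclass
class Report:
    output_id: str
    reason: str
    comment: str
    reported_at: datetime

@dataclass
class ReportStore:
    reports: list = field(default_factory=list)

    def flag(self, output_id: str, reason: str, comment: str = "") -> Report:
        if reason not in REASONS:
            raise ValueError(f"unknown reason: {reason}")
        report = Report(output_id, reason, comment, datetime.now(timezone.utc))
        self.reports.append(report)   # queued for human review / model auditing
        return report

store = ReportStore()
store.flag("rec-42", "biased", "Only recommends one demographic of authors")
print(len(store.reports))  # -> 1
```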
Pattern 15B: Request explicit feedback on selected system outputs
Problem: User feedback is needed to assess the system and help it improve over time. Solution: Implement a user-feedback mechanism that occasionally asks the user to provide explicit feedback for selected items or instances of system output. The system initiates the feedback interaction. Leverage user feedback for… Use when… How: Decide what type of feedback […]
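A rough sketch of how a system-initiated feedback request might be throttled so it only appears occasionally, as Pattern 15B suggests. The sampling rate and cooldown values are invented for illustration.

```python
# Sketch of Pattern 15B: the system, not the user, initiates the feedback request,
# and only occasionally. Sampling rate and cooldown are illustrative assumptions.
import random
import time

SAMPLE_RATE = 0.05          # ask about ~5% of outputs
COOLDOWN_SECONDS = 3600     # never prompt the same user more than once per hour

_last_prompt: dict[str, float] = {}

def should_request_feedback(user_id: str, now: float | None = None) -> bool:
    now = time.time() if now is None else now
    recently_prompted = now - _last_prompt.get(user_id, 0.0) < COOLDOWN_SECONDS
    if recently_prompted or random.random() > SAMPLE_RATE:
        return False
    _last_prompt[user_id] = now
    return True

if should_request_feedback("user-7"):
    print("Was this translation helpful? (1-5)")
```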
Pattern 15A: Encourage explicit feedback on individual system outputs
Problem: User feedback is needed to assess the system and help it improve over time. Solution: Implement a user-feedback mechanism that enables the user to provide explicit feedback at each item or instance of system output. The user initiates the feedback interaction. Leverage user feedback for… Use when… How: Decide what type of feedback to […]
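One way to let the user initiate feedback on any individual output, per Pattern 15A, is to attach a rating action to each item. The up/down vocabulary below is an assumption; the pattern also allows ratings, corrections, and other feedback types.

```python
# Sketch of Pattern 15A: every output carries its own feedback control, and the user
# decides whether to use it. Vote values are an illustrative choice.
from collections import defaultdict

feedback: dict[str, list[str]] = defaultdict(list)

def render_output(output_id: str, text: str) -> str:
    # In a real UI these would be buttons rendered next to the item.
    return f"{text}   [helpful? /feedback {output_id} up|down]"

def record_feedback(output_id: str, vote: str) -> None:
    if vote not in ("up", "down"):
        raise ValueError("vote must be 'up' or 'down'")
    feedback[output_id].append(vote)   # later joined with model logs for improvement

print(render_output("answer-3", "Paris is the capital of France."))
record_feedback("answer-3", "up")
```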
Pattern 14B: Immediate, partial, non-disruptive updates
Problem: The system adapts in response to user interaction and is at risk of disrupting or disorienting the user. Solution: The system makes an immediate but local update that largely maintains the previous state. Use when… How: Collaborate with an AI/ML practitioner to… When making the update, consider… User benefits… Common pitfalls […]
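A small sketch of an immediate but local update in the sense of Pattern 14B: when the user hides one recommendation, only that slot is refilled and the other items keep their positions. The single-slot replacement policy is one illustrative reading of the pattern, not the only one.

```python
# Sketch of Pattern 14B: react to the user's action right away, but change as little
# of the visible state as possible.
def hide_item(visible: list[str], backlog: list[str], item: str) -> list[str]:
    """Replace the hidden item in place; do not re-rank the other slots."""
    updated = list(visible)
    idx = updated.index(item)
    updated[idx] = backlog.pop(0) if backlog else None
    return [x for x in updated if x is not None]

visible = ["news-a", "news-b", "news-c"]
backlog = ["news-d", "news-e"]
print(hide_item(visible, backlog, "news-b"))   # ['news-a', 'news-d', 'news-c']
```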
Pattern 14A: Comprehensive updates
Problem: The system adapts as it learns over time and needs to update its outputs without disrupting or disorienting the user. Solution: The system makes a controlled, deliberate, comprehensive update in response to user behaviors or other additional data. Use when… How: Update system outputs in a way that permeates the whole user […]
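A sketch of a comprehensive update applied deliberately rather than mid-task, in the spirit of Pattern 14A: the new output set is staged in the background and swapped in at the next session boundary. Using the session start as the switch point is an assumption for illustration.

```python
# Sketch of Pattern 14A: compute the comprehensive update in the background and
# apply it at a deliberate, expected moment (here: the start of the next session).
class Recommendations:
    def __init__(self, items: list[str]):
        self.current = items
        self._staged: list[str] | None = None

    def stage_update(self, new_items: list[str]) -> None:
        self._staged = new_items          # nothing changes on screen yet

    def start_new_session(self) -> list[str]:
        if self._staged is not None:
            self.current, self._staged = self._staged, None
        return self.current

recs = Recommendations(["a", "b", "c"])
recs.stage_update(["c", "d", "a"])        # model relearned in the background
print(recs.current)                        # still ['a', 'b', 'c'] mid-session
print(recs.start_new_session())            # ['c', 'd', 'a'] at the next session
```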
Pattern 11G: “What if?” explanations
Problem: The user needs to understand how to change their input in order to achieve a specific system output (see G11-A: Local explanations). Solution: Enable users to simulate and experiment with alternative input values that might change the system's decision. Use when… How: Provide users the ability to simulate different system decisions by changing… When enabling […]
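A toy what-if simulation along the lines of Pattern 11G: the user supplies alternative input values and the system re-scores each one, so the user can see where the decision flips. The loan-approval rule is a stand-in, not a real model.

```python
# Sketch of Pattern 11G: let the user vary one input and re-run the (toy) model to
# see which alternative values would change the decision.
def approve_loan(income: float, debt: float) -> bool:
    return income - 0.5 * debt > 30_000          # illustrative decision rule

def what_if(debt: float, income_options: list[float]) -> dict[float, bool]:
    """Simulate the decision for alternative income values the user wants to try."""
    return {alt: approve_loan(alt, debt) for alt in income_options}

print(approve_loan(32_000, 10_000))                  # False
print(what_if(10_000, [32_000, 36_000, 40_000]))     # shows where the decision flips
```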
Pattern 11F: Example-based explanations
Problem: The user needs an explanation for why the system did what it did, and an explanation for a single instance may not be sufficient. Solution: Provide one or more examples of instances similar to the user's input, along with their corresponding explanations. Use when… How: Collaborate with an AI/ML practitioner to collect information about how […]
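A minimal example-based explanation in the spirit of Pattern 11F: retrieve the historical cases most similar to the user's input and show them with their outcomes. The two-feature Euclidean distance and the reference cases are invented for illustration.

```python
# Sketch of Pattern 11F: explain a decision by showing similar past instances
# and their outcomes.
import math

# (features, outcome) pairs from historical, already-explained cases
reference = [
    ((35_000, 5_000), "approved"),
    ((28_000, 12_000), "declined"),
    ((41_000, 9_000), "approved"),
]

def similar_examples(query: tuple[float, float], k: int = 2):
    return sorted(reference, key=lambda case: math.dist(case[0], query))[:k]

for features, outcome in similar_examples((33_000, 6_000)):
    print(f"A case with income/debt {features} was {outcome}")
```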
Pattern 11E: Map user behaviors to system outputs
Problem: The user needs an explanation for why the system did what it did. Solution: Provide an explanation that enables the user to infer a connection between user behaviors and the system's decision(s). Often used together with G11-D: Map system input attributes to system outputs. Use when… How: Collaborate with an AI/ML practitioner to collect information about […]
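A sketch that maps logged user behaviors to an output in "because you watched ..." form, as Pattern 11E describes. The behavior-to-output influence table is a stand-in for whatever attribution the real model provides.

```python
# Sketch of Pattern 11E: surface the user behaviors that most influenced an output.
watch_history = ["Alien", "Blade Runner", "Notting Hill"]

# behavior -> outputs the model associates with it (assumed to come from the ML side)
influences = {
    "Alien": ["Aliens", "Predator"],
    "Blade Runner": ["Dune", "Arrival"],
}

def explain_recommendation(item: str) -> str:
    causes = [b for b in watch_history if item in influences.get(b, [])]
    if not causes:
        return f"{item}: recommended based on overall popularity"
    return f"{item}: recommended because you watched {', '.join(causes)}"

print(explain_recommendation("Arrival"))   # ...because you watched Blade Runner
```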
Pattern 11D: Map system input attributes to system outputs
Problem: The user needs insight into why the system did what it did. Solution: Provide an explanation that enables the user to infer a connection between system input attributes and the system's decision(s). Often used together with G11-E: Map user behaviors to system outputs. Use when… How: Collaborate with an AI/ML practitioner to collect information about which input […]
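A sketch of mapping input attributes to an output via per-attribute contributions, as Pattern 11D suggests. The linear model and its weights are illustrative assumptions; in a real system the contributions would come from the AI/ML practitioner's attribution method.

```python
# Sketch of Pattern 11D: show which input attributes pushed the decision and by how much.
WEIGHTS = {"income": 0.8, "debt": -1.2, "years_employed": 0.5}   # assumed linear model

def attribute(inputs: dict[str, float]) -> list[tuple[str, float]]:
    contributions = [(name, WEIGHTS[name] * value) for name, value in inputs.items()]
    return sorted(contributions, key=lambda c: abs(c[1]), reverse=True)

for name, contribution in attribute({"income": 3.2, "debt": 1.0, "years_employed": 4.0}):
    direction = "raised" if contribution > 0 else "lowered"
    print(f"{name} {direction} the score by {abs(contribution):.1f}")
```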
Pattern 11C: Present properties of system outputs
Problem: The user needs an explanation for why the system did what it did. Solution: Provide users with a set of relevant system-output characteristics so they can understand what properties influence the system's behaviors. Use when… How: Collaborate with an AI/ML practitioner to… Show system outputs and their respective characteristics. Output characteristics may be independent […]
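A sketch of returning each output together with the characteristics that shaped it, per Pattern 11C. Which properties to surface (confidence, source, freshness) is an assumption to settle with the AI/ML practitioner.

```python
# Sketch of Pattern 11C: return each output with its characteristics, not a bare answer.
from dataclasses import dataclass

@dataclass
class ExplainedOutput:
    text: str
    confidence: float     # model's own estimate, 0..1
    source: str           # where the supporting data came from
    updated: str          # how fresh that data is

answer = ExplainedOutput(
    text="Expect light rain tomorrow afternoon.",
    confidence=0.72,
    source="regional forecast model",
    updated="2 hours ago",
)
print(f"{answer.text} (confidence {answer.confidence:.0%}, {answer.source}, {answer.updated})")
```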
Pattern 11B: Global explanations
Problem: The user needs an explanation for why the system did what it did, and it is important for the user to understand how the AI system works in general. Solution: Provide an explanation of how the AI system makes decisions in general. Use when… How: Get information about how the AI […]
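A sketch of a global explanation as Pattern 11B describes it: a single, instance-independent description of how the system decides, here rendered as global feature importances. The feature names and weights are illustrative; in practice they come from the AI/ML practitioner's global analysis.

```python
# Sketch of Pattern 11B: one general explanation of the system's decision logic.
GLOBAL_IMPORTANCE = {"price": 0.45, "distance_to_center": 0.30, "review_score": 0.25}

def describe_model() -> str:
    ranked = sorted(GLOBAL_IMPORTANCE.items(), key=lambda kv: kv[1], reverse=True)
    parts = [f"{name.replace('_', ' ')} ({weight:.0%})" for name, weight in ranked]
    return "Rankings are mostly driven by " + ", then ".join(parts) + "."

print(describe_model())
```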