Pattern G11-B: Global explanations

Problem

The user needs an explanation for why the system did what it did, and it is important for the user to understand how the AI system works in general.

Solution

Provide an explanation of how the AI system makes decisions in general.

Use when

  • The user wants transparency into how the system makes decisions in general, not just into a single action.
  • Policy or regulations require the system to provide an explanation.

How

Gather information about how the AI system makes decisions. See Patterns G11-B through G11-G for different explanation styles.

If a global explanation isn’t possible (e.g., when the AI system doesn’t pass information to the UI that’s useful for a global explanation), consider advocating with your team to pursue known methods for generating such explanations (see the sketch below).

Ensure that the presentation makes clear that the explanation is specific to this AI system.
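One known method is global feature importance. The sketch below is a minimal, hypothetical illustration, assuming a scikit-learn classifier and using permutation importance (a model-agnostic technique); neither the model nor the dataset is prescribed by this pattern. It shows how a team might derive a plain-language summary of what the model relies on in general:

```python
# Minimal sketch: derive a global explanation with permutation importance.
# The dataset, model, and wording are illustrative assumptions only.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0
)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffling a feature and measuring the drop in held-out accuracy gives a
# model-agnostic estimate of how much the system relies on that feature.
result = permutation_importance(
    model, X_test, y_test, n_repeats=10, random_state=0
)

# Surface the top features as a plain-language global explanation.
top = sorted(
    zip(data.feature_names, result.importances_mean),
    key=lambda pair: pair[1],
    reverse=True,
)[:3]
print("In general, this system's predictions rely most on:")
for name, score in top:
    print(f"  - {name} (importance {score:.3f})")
```

Ranking and truncating to the top few features keeps the explanation specific to this system without overwhelming the user.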

User benefits

  • Enables the user to update their understanding of how the AI system makes decisions.
  • Enables the user to understand the system’s overall reasoning.

Common pitfalls

  • Overly generic or vague explanations that don’t describe this specific AI system.
  • Explanations that don’t accurately represent how the AI system works. Consider consulting an expert who can assess the AI’s functionality and validate whether the explanation is sufficiently accurate.
  • Explanations that include so much information that they overwhelm the user.
  • Explanations that portray the AI system as more capable than it is, or as mysterious and impossible to understand (e.g., “magic”).

Examples

[Three example cards with thumbnails illustrating Pattern G11-B; images not reproduced]