Pattern G11-E: Map user behaviors to system outputs

Problem

The user needs an explanation for why the system did what it did.

Solution

Provide an explanation that enables the user to infer a connection between their behaviors and the system’s decision(s). This pattern is often used together with G11-D: Map system input attributes to system outputs.

Use when

  • It is possible to map user behaviors to system outputs. If system decisions depend on both input attributes and user behaviors, consider combining this pattern with G11-D: Map system input attributes to system outputs.
  • The user wants transparency into the system’s general reasoning or a specific system action.
  • Policy or regulations require the system to make an explanation available.

How

Collaborate with an AI/ML practitioner to collect information about how user behaviors inform system decisions:

  • Identify which user behaviors the system uses as inputs for determining its decisions.
  • Retrieve the user behaviors that contribute most to the system’s output (a sketch follows this list).
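
To ground the second step, here is a minimal sketch that ranks behavior-derived features by their influence on a model’s output, using scikit-learn’s permutation importance. The model, the feature names, and the synthetic data are hypothetical placeholders, not part of the pattern.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Hypothetical behavior-derived features logged per user.
BEHAVIOR_FEATURES = ["videos_watched_genre_x", "avg_session_minutes",
                     "items_saved", "searches_last_week"]

def top_behaviors(model, X_val, y_val, k=3):
    """Rank behavior features by how strongly each drives the model's output."""
    result = permutation_importance(model, X_val, y_val,
                                    n_repeats=10, random_state=0)
    ranked = sorted(zip(BEHAVIOR_FEATURES, result.importances_mean),
                    key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

# Synthetic data standing in for real behavior logs.
rng = np.random.default_rng(0)
X = rng.random((200, len(BEHAVIOR_FEATURES)))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # outcome driven by two behaviors
model = RandomForestClassifier(random_state=0).fit(X, y)
print(top_behaviors(model, X, y))  # the behaviors to surface in explanations
```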

The explanation might cover a specific system decision (see G11-A: Local explanations) or general system behavior (see G11-B: Global explanations).

The content of either local or global explanations can include:

  • Explicit and/or implicit past user behaviors.
  • Past behaviors of either this user or other users like them.
  • A general summary of behaviors.

The content of local explanations can also include specific examples of past behaviors (see G11-F: Example-based explanations).

The content of global explanations can also include types of user behavior the system uses as inputs for determining its decisions.
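
To make the local/global distinction concrete, the sketch below renders both kinds of explanation from behavior data. The wording templates, the behavior names, and the decision are illustrative assumptions, not prescribed by this pattern.

```python
def local_explanation(decision, example_behaviors):
    """Explain one specific decision by citing concrete past behaviors
    (see G11-A and G11-F)."""
    cited = ", ".join(example_behaviors)
    return f"We suggested {decision} because you recently: {cited}."

def global_explanation(behavior_types):
    """Summarize which types of behavior inform decisions in general
    (see G11-B)."""
    listed = ", ".join(behavior_types)
    return f"Suggestions are based on your activity, including: {listed}."

# Hypothetical decision and behaviors, for illustration only.
print(local_explanation(
    "the documentary 'Planet Blue'",
    ["watched three nature documentaries", "saved 'Ocean Life' to your list"]))
print(global_explanation(
    ["what you watch", "what you save", "what you search for"]))
```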

User benefits

  • Enables the user to understand how the system connects user behaviors to system decisions.
  • Facilitates user understanding by showing specific examples of past behaviors—showing rather than telling.
  • Gives the user insights into the system’s reasoning.
  • Enables the user to alter their behaviors to influence future system outputs.
  • Enhances user trust because of system transparency.

Common pitfalls

  • The explanation highlights important user behaviors without conveying why they matter.
  • General summaries of past behaviors are too vague.
  • It’s difficult for the user to infer the connection between user behaviors and system outputs.
  • The reasoning behind the system’s decision is invalid, even though the decision itself is relevant to the user.
  • Too much information in an explanation can overwhelm the user.

Note: When leveraging user behaviors to determine system outputs, do so in a privacy-aware way. Consider mitigations, such as limiting the amount of information collected and providing users the option to control which of their behaviors the system will monitor (see Guideline 17, Provide global controls).
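
As one way to apply this mitigation, the sketch below gates behavior logging behind per-category user settings. The event categories and the in-memory storage are hypothetical simplifications; a real system would persist the settings and honor them everywhere behaviors are collected.

```python
# Hypothetical behavior categories a user can toggle.
DEFAULT_MONITORING = {"watch_history": True, "searches": True, "saves": True}

class BehaviorCollector:
    """Records only the behavior categories the user has opted into
    (see Guideline 17, Provide global controls)."""

    def __init__(self, settings=None):
        self.settings = dict(DEFAULT_MONITORING if settings is None else settings)
        self.events = []

    def set_monitoring(self, category, enabled):
        # User-facing global control for one behavior category.
        self.settings[category] = enabled

    def record(self, category, event):
        # Drop events from opted-out categories; they are never stored.
        if self.settings.get(category, False):
            self.events.append((category, event))

collector = BehaviorCollector()
collector.set_monitoring("searches", False)   # user opts out of search logging
collector.record("searches", "hiking boots")  # dropped
collector.record("saves", "trail guide")      # recorded
print(collector.events)  # [('saves', 'trail guide')]
```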

Examples

Two example cards illustrate this pattern (thumbnails not reproduced).