Understanding and influencing decisions

Notice of upcoming action

A prompt on a mobile device reading “Your direct debit is due in 10 days”, with a button underneath reading “Update payment”.


People get a notification about an upcoming automated action, and can review, adjust or cancel it before it happens.

For example, if someone has set up automatic bill payments, they are alerted before the payment goes through, with enough time to review the payment details.
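The flow described above can be sketched in code: notify people a set period before an automated action, leave them a review window, and only execute actions that were notified and not cancelled. This is a minimal illustrative sketch, not a reference implementation; all names (`ScheduledAction`, `NOTICE_PERIOD`, `run_due_actions`) are hypothetical.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ScheduledAction:
    """A hypothetical automated action, e.g. a direct debit payment."""
    description: str
    due: date
    amount: float
    cancelled: bool = False   # person cancelled it during the review window
    notified: bool = False    # notice has been sent

# Assumed notice period: how far ahead to warn people before acting.
NOTICE_PERIOD = timedelta(days=10)

def send_notice(action: ScheduledAction, today: date) -> str:
    """Mark the action as notified and return the notification text."""
    action.notified = True
    days_left = (action.due - today).days
    return (f"{action.description} is due in {days_left} days. "
            f"Review, adjust or cancel before {action.due}.")

def run_due_actions(actions: list[ScheduledAction],
                    today: date) -> list[ScheduledAction]:
    """Notify ahead of time; execute only notified, uncancelled actions."""
    executed = []
    for action in actions:
        # Send the notice once the action enters the notice period.
        if not action.notified and action.due - today <= NOTICE_PERIOD:
            send_notice(action, today)
        # Execute only if the person had notice and did not cancel.
        if action.due <= today and action.notified and not action.cancelled:
            executed.append(action)
    return executed
```

Because silence is treated as consent in this sketch, it mirrors the limitation noted later: it is only appropriate for expected, low-risk actions.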

IF thinks this pattern is useful for keeping people informed of upcoming automated actions made on their behalf, and for giving them time to review or change those actions if they need to. It works best combined with other patterns that help people challenge an automated action, or appeal for a human review.

In addition, ICO guidance on GDPR compliance states that any patterns involving automated decision-making must “introduce simple ways for [people] to request human intervention or challenge a decision.”


Benefits

  • Gives people awareness of, and control over, automated decisions and actions.
  • Can be low friction for people, especially for a regular action or low-risk event.


Limitations

  • Relies on people seeing the notification in time to act on it.
  • People need to understand the reason for the automated action in order to give meaningful feedback.
  • Inappropriate if the action is unexpected or high risk, because a lack of response is treated as consent for the action to go ahead.


Examples

  • Chip, an automated savings service →

    Chip is a service that integrates with someone’s bank account to help them save money by actively pulling money out of their connected accounts. It sends an alert to customers before pulling money, giving them a few days’ notice to review and edit any of these transactions.

  • Open APIs for telecoms →

    IF collaborated with the Open Data Institute (ODI) to research how open APIs in the telecoms sector could lead to new classes of commercial products. In the context of these new use cases, IF identified and implemented new patterns for explaining data flows and upcoming automated actions.

  • Algorithmic transparency →

    IF presented recommendations to a parliamentary committee on how to make algorithms more transparent. In the context of a fictional benefits service, IF developed prototypes to show how and why organisations need to explain automated decisions, especially when vulnerable groups are likely to be affected.