Stop automated access by bots by setting challenges that people can solve but that are hard to automate. This pattern is also known as CAPTCHA: Completely Automated Public Turing test to tell Computers and Humans Apart.
For example, a service might ask someone to identify elements in a photo or audio clip.
IF thinks that preventing automated sign-ins will become more difficult, but also more critical, as machine learning algorithms improve.

When choosing how to implement this pattern, take accessibility and friction into account. Answers to image and audio challenges can be used to train machine learning algorithms without people knowing. This is increasingly controversial, as it's unclear what the data is being used for.

This pattern tends to be used alongside behaviour analysis to prevent automated sign-ins. Consider using other authentication patterns to prevent automated sign-ins, such as multi-factor authentication using text messages, or biometric authentication.
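The challenge-response flow behind this pattern can be sketched in a few lines. This is a minimal illustration, not a production CAPTCHA: it uses a trivial arithmetic question as a stand-in for an image or audio task, and an in-memory store as a stand-in for an expiring server-side one. All names here are hypothetical.

```python
import hmac
import secrets

# In-memory challenge store: challenge_id -> expected answer.
# A real deployment would use an expiring server-side store.
_challenges = {}

def issue_challenge():
    """Create a challenge and return (challenge_id, prompt).

    The arithmetic question stands in for a task that is easy for
    people but hard to automate, such as identifying a photo."""
    a, b = secrets.randbelow(10), secrets.randbelow(10)
    challenge_id = secrets.token_urlsafe(16)
    _challenges[challenge_id] = str(a + b)
    return challenge_id, f"What is {a} + {b}?"

def verify_answer(challenge_id, answer):
    """Check the person's answer; each challenge is single-use."""
    expected = _challenges.pop(challenge_id, None)
    if expected is None:
        return False  # unknown or already-used challenge
    # Constant-time comparison avoids leaking the expected answer.
    return hmac.compare_digest(expected, answer.strip())

cid, prompt = issue_challenge()
# The prompt is shown to the person; their reply is verified once:
print(verify_answer(cid, "wrong"))  # False
```

Making each challenge single-use matters: otherwise an attacker who learns one answer (for example, by relaying the challenge to someone else) could replay it indefinitely.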
Read more: The inaccessibilities of CAPTCHA.
- Reduces the impact of automated access to systems, such as denial of service, spam or fake users.
- Difficult to use for people who depend on assistive technologies.
- Can be too difficult for people to solve, increasing friction and preventing access.
- Some challenges can be solved by a machine.
- Attackers can relay challenges to people to solve via another channel, without those people knowing what the answers are for.
- You can pay companies to solve some text and image challenges in bulk.
Uses risk-based analysis to detect abusive traffic on websites
An alternative to reCAPTCHA that also provides data labelling services for training data sets
Protects users from fraud using challenge-response mechanisms that computer vision can’t solve