User Experience is a core part of what we do. Understanding how people behave online and designing around their needs and pains can make the difference between a successful website and one that fails.
Years of research and testing have gone into the development of user-centred design practices, giving us a valuable toolkit of options to help the end user. However, with knowledge comes responsibility, and some less ethical website owners have decided to subvert everything we know about user-centred design, turning it against the user’s best interests for their own gain.
There’s bad design, and then there’s intentionally bad design. These practices are called Dark Patterns.
Following our webinar “Lifting the lid on the dark side of on-line customer journeys” with our partners at Create Solutions, we thought it might be useful to document some of the dark patterns we see regularly, and how they can be avoided.
Labyrinth / Maze (aka Roach Motel)
The flow is designed to make it very easy to sign up or start using a product or service, but disproportionately difficult to leave.
Some social media platforms are famous for making it incredibly simple to sign up for an account, but bury their deactivation or deletion options within 10 layers of menus, and use every dirty trick and all the dark patterns in the book to get you to stay. Not naming any names, of course…
Even a simple option like the example above presents several issues.
- It frustrates users by depriving them of the ability to delete their accounts themselves.
- It is at odds with users who want to delete the account (the site uses the word “deactivate”, implying that the data may not actually be deleted).
- Users are presented with a dead end – they can’t proceed to delete the account because the website forces them to perform a different task, such as writing an email or making a phone call.
- It increases the amount of time the task will take, making users wait for an answer from the company in order to proceed.
Just like a real maze – it’s easy to enter the maze, but not as easy to find a way out.
Misdirection
The user’s attention is deliberately led to a part of the interface or flow that the site owner wants the user to see or interact with, so other information stays out of the viewing scope for a user or is easily overlooked.
In this dark pattern example, the option to continue to check in (without paying for additional upgrades) is knocked back and made secondary to the upgrade option, making it easier for users to accidentally add upgrades.
Sometimes the information is hidden below the “fold” of the screen, where users may need to scroll to find other options.
Bait and Switch
The user interacts with an element where they would expect an action to occur. Instead, a completely different and often undesirable action occurs upon interaction.
This example of a dark pattern (based on a real-world upgrade prompt for a commonly used operating system) subverts expected behaviour. Clicking ‘upgrade now’, ‘ok’ or closing the popup begins or schedules the upgrade. Everywhere else in the system, clicking the close button with ‘X’ should close the window, not confirm or engage an action.
When you break convention on expected behaviour, you break trust.
This can range from a close button that takes you to another webpage, to a do-not-download button that automatically starts a download.
Friend spam
Many sites now accept social media accounts as registration/login credentials, linking the two accounts together.
Some platforms will use this connection to automatically post to your social media or email people in your contact list, sometimes even without your knowledge.
These messages appear to come from you directly, or even actively pose as you.
Because friends think that you are personally endorsing a product or service, they are more likely to try it themselves.
Forced continuity
Often observed in subscription-based services, this pattern requires the user to enter their credit card details to access a free trial and agree to be automatically charged should they continue beyond it.
The desired outcomes of this dark pattern are:
- The user will forget to cancel in time and be charged full price every month for a service they thought they’d subscribed to for a specified duration, at a reduced cost or for free.
- As in the ‘Maze’, the user may find it next to impossible to delete their account/subscription in time.
Either way – whether the user forgets to cancel, or gives up on cancelling and accepts the charges – more money can be squeezed out of them.
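As an illustrative sketch (the trial length, price, and names below are all hypothetical), the billing logic behind forced continuity is simply a trial that converts into a recurring charge unless the user manages to cancel in time:

```typescript
// Hypothetical sketch of forced continuity billing. The "free" trial
// silently converts into a recurring charge unless cancellation succeeds.
interface Subscription {
  daysSinceSignup: number;
  cancelled: boolean;
}

const TRIAL_DAYS = 30;      // advertised "free" period (illustrative)
const MONTHLY_PENCE = 999;  // full price charged after the trial (illustrative)

// Returns the charge (in pence) for the current month.
function monthlyCharge(sub: Subscription): number {
  if (sub.cancelled) return 0;
  return sub.daysSinceSignup > TRIAL_DAYS ? MONTHLY_PENCE : 0;
}

// During the trial nothing is charged...
console.log(monthlyCharge({ daysSinceSignup: 10, cancelled: false })); // 0
// ...but forgetting to cancel means paying full price every month.
console.log(monthlyCharge({ daysSinceSignup: 45, cancelled: false })); // 999
```

The pattern relies on the conversion happening with no further confirmation step at the point the first real charge is taken.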
Zuckering
Named after Facebook/Meta founder Mark Zuckerberg, probably due to the deceptive and manipulative tactics used by Facebook to harvest and monetise user data, and the subsequent scandals (e.g. Cambridge Analytica Scandal). Facebook itself is generally considered to be a hive of dark patterns.
This is a tactic that takes place mostly behind the scenes.
Through complex and often obscure Terms and Conditions and Privacy Policies, users get “zuckered” (suckered) into giving their information away to various data brokers, via a third party (usually a service, such as Meta).
Users aren’t legal experts. In reality, unless you are one, you can’t really tell who will have access to your data, and for what purposes, if you agree. “I have read and agreed” may be one of the most frequent lies told on the web.
The phrase “TL;DR” exists for this reason. Standing for “Too Long; Didn’t Read”, it is increasingly used to introduce a summary of an otherwise unreadably long text.
Some services are starting to address this issue, with ToS;DR providing plain-language summaries for the Terms and Conditions of major industry services.
The Terms-of-service Labeling, Design and Readability Act (TLDR for short) also has bipartisan support in the United States Congress; it would require websites to provide a “summary statement” for users before they opt in to a terms-of-service agreement.
Disguised ads
These adverts are designed to blend seamlessly with the website, app or any other platform on which they appear.
One of the most common forms of the disguised ad dark pattern is a fake download button, placed inside the download section of a website and drawing the eye more than the genuine download link.
When a user clicks the button, they will not download the file in question, but will instead be redirected to another website or service.
Confirmshaming
Also sometimes referred to as a negative opt-out, confirmshaming is a passive-aggressive marketing strategy that implies that you are inferior just because you don’t want a particular product.
Other shamelessly transparent examples of this dark pattern include:
- No thanks, I REALLY hate saving money
- That’s ok, I don’t read
- No thanks, I’m fine with losing customers
Fear Of Missing Out (FOMO)
This dark pattern occurs when an interface induces apprehension or anxiety in users that they are either not in-the-know or missing out on information, events, experiences, or decisions that could make their life better.
In this example, fake scarcity pressures the user into purchasing because they are presented with a false indication of limited supply or popularity.
Other versions of this include fake urgency, with countdown timers offering a better rate that never actually expires, or simply refreshes at the end of the day.
Sneaking
Also called ‘sneak into basket’.
This group of dark patterns imposes additional purchases of goods or services on users without them initially noticing.
For example, an additional object unintentionally ends up in the shopping cart and must be deliberately removed, otherwise, the item will be purchased.
Until the introduction of more stringent regulation and the FCA Consumer Duty, it was common to find insurance add-ons in your basket at some retailers.
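A minimal sketch of the ‘sneak into basket’ flow (item names and prices are invented for illustration): the site appends an add-on the user never chose, so the total quietly increases unless they spot and remove it.

```typescript
// Hypothetical 'sneak into basket' sketch: an add-on the user never
// chose is appended to the basket, inflating the total until removed.
interface Item {
  name: string;
  pricePence: number; // integer pence avoids floating-point rounding
}

function createBasket(chosen: Item[]): Item[] {
  // Dark pattern: silently append an insurance add-on.
  return [...chosen, { name: "Damage protection", pricePence: 499 }];
}

function totalPence(basket: Item[]): number {
  return basket.reduce((sum, item) => sum + item.pricePence, 0);
}

const basket = createBasket([{ name: "Headphones", pricePence: 2999 }]);
// The user chose £29.99 of goods, but will pay £34.98 unless they
// notice the extra line item and deliberately remove it.
console.log(totalPence(basket)); // 3498
```

The burden of action is inverted: instead of opting in to the add-on, the user must opt out of a purchase they never made.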
Backwards Logic
These interfaces are deliberately designed in such a way that the desired user action is at odds with the nature of the interface.
In this example, the user must enable a switch (or set it to an enabled state) in order to disable notifications, causing uncertainty and in some cases, inaction.
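The confusion boils down to an inverted mapping between the control’s state and the user’s intent. A minimal sketch (function and parameter names are illustrative):

```typescript
// Backwards logic: the switch must be ON for notifications to be OFF,
// so the control's state is the opposite of what the user actually gets.
function notificationsEnabled(disableSwitchOn: boolean): boolean {
  return !disableSwitchOn; // inverted mapping between state and outcome
}

// The honest alternative maps the switch state directly to the outcome.
function notificationsEnabledClear(notificationsSwitchOn: boolean): boolean {
  return notificationsSwitchOn;
}

// Turning the backwards switch "on" silences notifications...
console.log(notificationsEnabled(true)); // false
// ...while the clear control does exactly what it says.
console.log(notificationsEnabledClear(true)); // true
```

The ethical fix is almost always the same: name the control after the outcome the user gets, so “on” means on.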
Prior to GDPR, a common example was the use of confusing language and checkboxes for marketing opt-ins.
Making ethical design choices
We illustrate these practices not because we agree with them, or want people to emulate them, but to raise awareness of their intent. We hope that this knowledge helps people make more ethical web design decisions when it comes to their own sites.
If you’re interested in ethical web design or digital consultancy, feel free to reach out for a chat.