
Dark Patterns in web design and how to avoid them

User Experience is a core part of what we do. Understanding how people behave online and designing around their needs and pains can make the difference between a successful website and one that fails. 

Years of research and testing have gone into the development of user-centred design practices, giving us a valuable toolkit of options to help the end user. However, with all knowledge comes responsibility, and some less ethical website owners have decided to subvert everything we know about user-centred design, turning it against the user’s best interests for their own gain.

There’s bad design, and then there’s intentionally bad design. These practices are called Dark Patterns.

Following our webinar “Lifting the lid on the dark side of on-line customer journeys” with our partners at Create Solutions, we thought it might be useful to document some of the dark patterns we see regularly, and how they can be avoided.

Labyrinth / Maze (aka Roach Motel)

Options are designed to make it very easy to sign up, or start using a product or service and disproportionately difficult to leave.

Some social media platforms are famous for making it incredibly simple to sign up for an account, but bury their deactivation or deletion options within 10 layers of menus, and use every dirty trick and all the dark patterns in the book to get you to stay. Not naming any names, of course…

Even a simple option like the example above presents several issues.

  • It frustrates users by depriving them of the possibility of deleting their accounts themselves.
  • It is at odds with users who want to delete the account (the site uses the word “deactivate”, implying that it may not delete the data).
  • Users are presented with a dead end – they can’t proceed to delete the account themselves because the website forces them to perform a different task, such as writing an email or making a phone call.
  • It increases the amount of time the task will take, making users wait for an answer from the company in order to proceed.

Just like a real maze – it’s easy to enter the maze, but not as easy to find a way out.

ℹ️ How to avoid: You don't have to lead users to the door, but don't intentionally hide your account closure option. If users are determined to leave, they will. You'll just leave them with a sour taste and chances are, they'll never give you another try.


Misdirection

The user’s attention is deliberately led to a part of the interface or flow that the site owner wants the user to see or interact with, so that other information falls outside the user’s field of view or is easily overlooked.

In this dark pattern example, the option to continue to check-in (without paying for additional upgrades) is visually knocked back and made secondary to the upgrade option, making it easier for users to accidentally add upgrades.

Sometimes the information is hidden below the “fold” of the screen, where users may need to scroll to find other options.

ℹ️ How to avoid: Keep your design consistent and clear, and leave ambiguous or misleading elements out. Users should not be in any doubt as to what an action will do.

Bait and Switch

The user interacts with an element where they would expect an action to occur. Instead, a completely different and often undesirable action occurs upon interaction. 

This example of a dark pattern (based on a real-world upgrade prompt for a commonly used operating system) subverts expected behaviour. Clicking ‘upgrade now’, ‘ok’ or closing the popup begins or schedules the upgrade. Everywhere else in the system, clicking the close button with ‘X’ should close the window, not confirm or engage an action.

When you break convention on expected behaviour, you break trust.

This can range from a close button that takes you to another webpage, to a do-not-download button that automatically starts a download.

ℹ️ How to avoid: At the risk of being repetitive, keep the interface consistent and the language clear. Based on established convention, users should be clear on the result of their action. It takes serious effort to mislead users in this way.

Friend spam

Many sites now accept social media accounts as registration/login credentials, linking the two accounts together.

Some platforms will use this connection to automatically post to your social media or email people in your contact list, sometimes even without your knowledge. 

These messages appear to come from you directly, or even actively pose as you. 

Because friends think that you are personally endorsing a product or service, they are more likely to try it themselves.

ℹ️ How to avoid: Users are wiser to spam than ever. Rather than automating this process, prompt users to recommend your product or service in their own words. Referral schemes that require user action won't send as many referrals, but might result in better quality leads.

Forced continuity

Often observed in some subscription-based services where the user is required to enter their credit card details to access a free trial, and must agree to be automatically charged should they continue beyond the trial. 

The desired outcomes of this dark pattern are: 

  1. The user will forget to cancel in time and be charged full price every month for a service they thought they’d signed up to for a limited period at a reduced cost, or for free.
  2. As in the ‘Maze’, the user may find it next to impossible to delete their account/subscription in time.

Either way – whether the user forgets to cancel, or gives up on cancelling and accepts the charges – more money can be squeezed out of that user.

ℹ️ How to avoid: Although it's tempting to lock in revenues, you could either not require a credit card for trials (no credit card required can be a draw for some) or be more proactive about reminding customers their trial is coming to an end. If your service has value, trials should convert to customers.
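The proactive-reminder approach above can be sketched in code. This is a minimal illustration, not a production billing system: the `Trial` shape, field names and the three-day reminder window are all assumptions for the example.

```typescript
// Sketch: proactively remind trial users before auto-billing kicks in.
// The Trial interface and the reminder window are illustrative assumptions.

interface Trial {
  email: string;
  endsAt: Date;      // when the free trial converts to a paid plan
  reminded: boolean; // whether a reminder email has already been sent
}

const MS_PER_DAY = 24 * 60 * 60 * 1000;

function daysRemaining(trial: Trial, now: Date): number {
  return Math.ceil((trial.endsAt.getTime() - now.getTime()) / MS_PER_DAY);
}

// Returns trials that should receive a "your trial ends soon" email:
// expiring within `windowDays` and not yet reminded.
function trialsNeedingReminder(
  trials: Trial[],
  windowDays: number,
  now: Date
): Trial[] {
  return trials.filter((t) => {
    const days = daysRemaining(t, now);
    return !t.reminded && days >= 0 && days <= windowDays;
  });
}
```

A scheduled job could run this daily and email each result, so customers are never surprised by the first charge.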


Privacy Zuckering

Named after Facebook/Meta founder Mark Zuckerberg, probably due to the deceptive and manipulative tactics Facebook has used to harvest and monetise user data, and the subsequent scandals (e.g. the Cambridge Analytica scandal). Facebook itself is generally considered to be a hive of dark patterns.

This is a tactic that takes place mostly behind the scenes. 

Behind complex and often obscure Terms and Conditions and Privacy Policies, users get “zuckered” (suckered) into giving away their information to data brokers, via a third party (usually a service, such as Meta).

Users aren’t legal experts. In reality, unless you are one, you can’t really tell who will have access to your data, and for what purposes, if you agree. “I have read and agreed” may be one of the most frequent lies told on the web.

The phrase “TL;DR” exists for this reason. Standing for Too Long; Didn’t Read, it is increasingly used to introduce the summary of an otherwise unreadable lengthy text.

Some services are starting to address this issue, with ToS;DR providing plain-language summaries for the Terms and Conditions of major industry services. 

The United States Congress also has bipartisan support for the Terms-of-service Labeling, Design and Readability Act (TLDR for short) which would require websites to provide a “summary statement” for users before they opt in to a terms-of-service agreement.

ℹ️ TLDR? Simplify your terms and conditions, or create a summary.

Disguised ads

These adverts are designed to blend seamlessly with the website, app or any other platform on which they appear.

One of the most common forms of the disguised ad dark pattern is a download button, which is placed inside the download section of a website and draws the eye more than the genuine download link. 

When a user clicks the button, they will not download the file in question but be redirected to another website or service. 

ℹ️ How to avoid: Except in specific circumstances, most websites are advertising their own product or service. If your site features third-party advertising, you should maintain guidelines for your advertisers to ensure adverts are obviously distinct from your site, or put them inside clearly demarcated areas.


Confirmshaming

Also sometimes referred to as a negative opt-out, confirmshaming is a passive-aggressive marketing strategy that implies you are inferior just because you don’t want a particular product.

Other shamelessly transparent examples of this dark pattern include:

  • No thanks, I REALLY hate saving money
  • That’s ok, I don’t read
  • No thanks, I’m fine with losing customers

ℹ️ How to avoid: Keep options simple, and get creative with positive language rather than negative. Don't be passive-aggressive.

Fear Of Missing Out (FOMO)

This dark pattern occurs when an interface induces apprehension or anxiety in users that they are either not in-the-know or missing out on information, events, experiences, or decisions that could make their life better.

In this example, fake scarcity pressures the user into purchasing because they are presented with a false indication of limited supply or popularity.

Other versions of this could include fake urgency, with countdown timers offering a better rate that never actually expires, or simply refreshes at the end of the day.

ℹ️ How to avoid: It is far more ethical, and more useful to the customer, if you can get real data into the interface. If there are genuinely only 3 left, that's useful information, not false scarcity. Otherwise, just rely on simple stock indication (in stock, out of stock).
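A scarcity message driven by real inventory might look like the sketch below. The threshold and wording are assumptions for illustration; the point is that the message is derived from actual stock data rather than invented.

```typescript
// Sketch: derive the scarcity message from real inventory data.
// The low-stock threshold and message wording are illustrative assumptions.

function stockIndicator(unitsAvailable: number, lowStockThreshold = 5): string {
  if (unitsAvailable <= 0) return "Out of stock";
  if (unitsAvailable <= lowStockThreshold) {
    // Only shown when scarcity is genuine.
    return `Only ${unitsAvailable} left`;
  }
  return "In stock";
}
```

If the count comes straight from the inventory system, "Only 3 left" is helpful information; hard-coding it would be the dark pattern.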


Sneaking

Also called ‘sneak into basket’.

This group of dark patterns imposes additional purchases of goods or services on users without them initially noticing. 

For example, an additional item ends up in the shopping cart without the user adding it, and must be deliberately removed, otherwise it will be purchased.

Until the introduction of more stringent regulation and the FCA Consumer Duty, it was common to find insurance add-ons in your basket at some retailers.

ℹ️ How to avoid: Don't put anything into the basket automatically that the user hasn't requested, unless it has zero cost. Promote or upsell additional items as opt-in, not opt-out.
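The opt-in principle can be made concrete in basket-building logic. This is a simplified sketch; the item shapes, names and prices are assumptions for the example, not a real checkout implementation.

```typescript
// Sketch: add-ons are only charged when the user explicitly opts in.
// The LineItem/AddOn shapes are illustrative assumptions.

interface LineItem {
  name: string;
  pricePence: number;
}

interface AddOn extends LineItem {
  optedIn: boolean; // should default to false in the UI
}

function buildBasket(items: LineItem[], addOns: AddOn[]): LineItem[] {
  // Never pre-add paid extras: include only add-ons the user actively ticked.
  return [...items, ...addOns.filter((a) => a.optedIn)];
}

function basketTotal(basket: LineItem[]): number {
  return basket.reduce((sum, item) => sum + item.pricePence, 0);
}
```

With `optedIn` defaulting to false, the add-on is promoted at checkout but never silently charged.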

Backwards Logic

These interfaces are deliberately designed in such a way that the desired user action is at odds with the nature of the interface.

In this example, the user must enable a switch (or set it to an enabled state) in order to disable notifications, causing uncertainty and in some cases, inaction.

Prior to GDPR, a common example was confusing language and double-negative checkboxes for marketing opt-ins.

ℹ️ How to avoid: Simply put, keep your language plain and the logic aligned with the interface. Switches should enable functionality when enabled. Boxes should be ticked to agree. And so on.
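Aligned logic means the state the user sees is the state the code checks, with no inversion in between. A minimal sketch (the `Preferences` shape is an assumption for illustration):

```typescript
// Sketch: the toggle state maps directly onto behaviour — on means on.
// The Preferences interface is an illustrative assumption.

interface Preferences {
  notificationsEnabled: boolean; // true = the user receives notifications
}

function shouldSendNotification(prefs: Preferences): boolean {
  // No inverted logic: the switch the user flipped is the value we check.
  return prefs.notificationsEnabled;
}
```

The backwards-logic version would store something like "disable notifications" as a switch the user must turn *on*, forcing a negation in both the UI copy and the code.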

Making ethical design choices

We illustrate these practices not because we agree with them, or want people to emulate them, but to raise awareness of their intent. We hope that knowledge helps people make more ethical web design decisions when it comes to their own sites.

If you’re interested in ethical web design or digital consultancy, feel free to reach out for a chat.

About the author

Larry Brangwyn

Larry is a published UX specialist with an extensive track record of creating award-winning online solutions.
