You sign up for a free trial. Fourteen days pass. On day fifteen, money leaves your account. You did not choose to subscribe. But buried in the sign-up flow was a pre-ticked box, set in light grey text, informing you that by proceeding you agreed to automatic renewal at the full rate. You agreed to something you did not read, using knowledge you did not have, in a process designed to make that outcome as likely as possible.

You try to cancel. The subscription settings page has five tabs. The cancel option is not in any of them. A Google search reveals that you have to call a number between 9am and 5pm on weekdays. You call. You wait. You are told to expect a callback within three to five business days.

None of this happened by accident. Every step of that experience was designed.

Dark patterns are user interface design choices that trick or manipulate users into making decisions they did not intend to make -- signing up for subscriptions they did not want, sharing data they did not mean to share, or buying extras they did not choose. Coined by UX researcher Harry Brignull in 2010, the term describes designs that weaponise the same psychological mechanisms as nudging -- but in the opposite direction, systematically favouring the organisation over the user.

What are dark patterns?

In 2010, British UX researcher Harry Brignull launched a website called darkpatterns.org (now deceptive.design). His project was simple and devastating: catalogue, name and publish examples of user interface designs that deceived or manipulated users. The catalogue filled rapidly. The examples came from the most recognisable brands in the world.

Brignull’s insight was that the design community had developed sophisticated expertise in shaping user behaviour through interface design -- and that this expertise was being systematically applied against users’ interests. The same UX toolkit that could make it easy to donate to charity could make it nearly impossible to cancel a subscription. The same understanding of how people scan pages and follow visual hierarchies could be used to hide important information or make the wrong button more prominent than the right one.

Dark patterns are not design mistakes. They are design choices. The distinction matters enormously, both ethically and legally.

A dark pattern is not bad design. It is effective design deployed against the interests of the person it is supposed to serve.

The concept has since moved from UX circles into mainstream regulatory discourse. The European Union’s Digital Services Act explicitly addresses dark patterns. GDPR enforcement actions have targeted cookie consent flows designed to make rejection harder than acceptance. The US Federal Trade Commission has issued guidance and enforcement actions. Regulators have moved from naming a problem to assigning liability for it.

How dark patterns work: the psychology

Dark patterns are effective for the same reason that nudges are effective: human decision-making relies heavily on automatic, fast, low-effort cognitive processing. We scan rather than read. We follow visual hierarchies rather than considering all options equally. We default to what is easiest. We trust that the organisations we interact with are not actively working against our interests.

Dark patterns exploit each of these tendencies. They are designed to be invisible -- to look like helpful interface design while delivering outcomes that serve only the organisation. The most sophisticated dark patterns are the ones users recognise, if at all, only after the decision has been made and the consequence has landed.

The main types of dark pattern

1. Roach motel

Easy to get in, hard to get out. The most structurally elegant dark pattern: the sign-up flow is frictionless, the cancellation flow is a labyrinth. Amazon Prime for many years required users to navigate through five confirmation screens to cancel a subscription that could be initiated in a single click. The FTC sued Amazon over this in 2023, alleging that the cancellation process -- internally called “Iliad Flow” after the epic poem’s length -- was deliberately designed to wear users down until they gave up.[1]

LinkedIn used a similar pattern in its email invitations: signing up was immediate; many users only discovered that the service had imported and emailed their entire contact list when the angry replies started arriving.

2. Forced continuity

Charging after a free trial without adequate notice or friction. The mechanism: ask for payment details to start a free trial, then convert automatically at the end of the trial period, relying on users forgetting, missing the notification, or finding cancellation too effortful. Gyms perfected this in the physical world. Software-as-a-service companies have industrialised it online.

The tell: a forced continuity dark pattern will always make cancelling harder than continuing. If the unsubscribe process requires more steps than the subscribe process, you are almost certainly looking at a dark pattern.

3. Confirmshaming

Making the opt-out option read as a self-insult. A newsletter sign-up popup with two buttons: “Yes, I want to save money” and “No, I prefer to pay full price.” A cookie banner with “I understand the risks and want a worse experience.” The technique exploits identity and loss aversion simultaneously: declining feels like an admission of something unflattering about yourself.

Confirmshaming is particularly widespread in email marketing, where it emerged as a response to declining opt-in rates. The irony is that users who feel tricked into subscribing are the least valuable subscribers: they have the highest unsubscribe rates, the lowest open rates, and the worst conversion behaviour. The dark pattern optimises for a metric (list size) that is inversely correlated with the actual goal (engaged audience).

4. Misdirection in cookie consent

Perhaps the most universally experienced dark pattern of the past decade. Under GDPR, websites in the EU are required to obtain genuine consent for non-essential tracking cookies. Genuine consent requires a real choice. Many websites respond to this requirement by making the “Accept all” button large, colourful and prominently placed, while making “Reject all” small, grey and buried inside a “Manage preferences” submenu that requires several additional clicks to navigate.

The French data protection authority CNIL has fined Google and Facebook hundreds of millions of euros for exactly this pattern. The Irish Data Protection Commission has issued enforcement actions against multiple major platforms. The legal position is now clear: a consent mechanism that makes acceptance systematically easier than rejection does not constitute genuine consent.
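To make the contrast concrete, here is a minimal sketch of a consent banner that treats acceptance and rejection symmetrically: the same element, the same styling, each resolvable in a single click. The function and class names are illustrative assumptions, not a real consent library or a statement of what any regulator requires.

```typescript
// Hedged sketch: a consent banner where "Accept all" and "Reject all"
// are rendered identically and each takes exactly one click.
// All names here are illustrative, not a real library or API.

type ConsentChoice = "accept_all" | "reject_all";

function renderConsentBanner(onChoice: (choice: ConsentChoice) => void): HTMLElement {
  const banner = document.createElement("div");
  banner.className = "consent-banner";

  const makeButton = (label: string, choice: ConsentChoice): HTMLButtonElement => {
    const button = document.createElement("button");
    button.textContent = label;
    button.className = "consent-button"; // identical class: same size, colour and weight for both options
    button.addEventListener("click", () => onChoice(choice));
    return button;
  };

  // Both options sit side by side; rejection is not hidden behind a
  // "Manage preferences" submenu and requires no extra clicks.
  banner.append(
    makeButton("Accept all", "accept_all"),
    makeButton("Reject all", "reject_all"),
  );
  return banner;
}

// Usage: record the choice before setting any non-essential cookies.
document.body.append(
  renderConsentBanner((choice) => {
    console.log(`User chose: ${choice}`);
  }),
);
```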

5. Hidden costs

Presenting a price throughout the purchase journey and then adding fees at the final step, after users have invested time, built commitment to the purchase and experienced the sunk cost of not wanting to start over. Ticketmaster’s booking fees, airline seat selection charges and hotel resort fees are the most infamous examples of an industry-wide practice. The FTC’s “junk fees” rule, introduced in 2024, specifically targets this pattern as deceptive.

6. Trick questions

Confusingly worded opt-in checkboxes designed to produce the opposite of the user’s intent. The classic form: “Uncheck this box if you do not wish to receive marketing communications.” The double negative requires active parsing to decode. Under time pressure, in the flow of completing a purchase, most users either guess wrong or avoid the question entirely. Either outcome suits the organisation.

Dark patterns versus nudging: the ethical line

This is the question I get asked most often in our training programmes, and it deserves a direct answer.

Both nudges and dark patterns use the same psychological mechanisms: defaults, framing, friction, social proof, ordering effects. The psychological toolkit is identical. What differs is whose interests the design serves.

A nudge is transparent in principle: if you explained exactly what you were doing and why, the person would recognise it as serving their interests. A default that enrols employees in pension schemes, a cafeteria that places healthy food first, a reminder letter that uses social norms to prompt tax payment -- all of these could be shown to the people they affect, and those people would generally recognise the intent as aligned with their own.

A dark pattern fails this test. The friction-laden cancellation flow, the pre-ticked marketing consent box, the cookie banner with the hidden reject button -- none of these could be shown to users with the explanation “we designed this to serve your interests.” The design intent is concealed precisely because it would not survive disclosure.

The test I use in practice: if you had to show a user exactly how this interface was designed and why, would they thank you? If the answer is yes, you have a nudge. If the answer is no, you have a dark pattern.

Dark patterns and behavioural design

At SUE, we teach behavioural design as a discipline with an explicit ethical commitment: the tools of behavioural science should be used to help people do what they genuinely want to do, not to extract value from them against their interests. This is not just idealism. It is also good business.

The evidence on dark patterns and long-term business outcomes is increasingly clear. Short-term metric gains from dark patterns are real. Long-term consequences include higher churn rates (users who feel tricked leave when they can), lower lifetime value, reputational damage as dark patterns are named and publicised, and growing regulatory exposure. The FTC, CNIL, ICO and European data protection authorities are actively investigating and fining companies for deceptive design at a scale that has moved from symbolic to genuinely material.

Organisations that deploy dark patterns are optimising for the wrong thing. They are treating behavioural science as a tool for extraction rather than a tool for genuine value creation. The SUE Influence Framework starts from the opposite premise: understand what people genuinely want, understand what is blocking them from getting it, and design to remove those barriers. That approach produces better outcomes for users and, over time, better outcomes for the organisation.

Common misconceptions about dark patterns

Misconception 1: “All pre-ticked boxes are dark patterns”

No. A default enrolment in a pension scheme is a pre-ticked box that serves the user’s interests. A pre-ticked box that signs you up for marketing emails you did not want is a dark pattern. The mechanism is the same. The ethics are determined by whose interests the default serves, not by the presence of a pre-tick.

Misconception 2: “Dark patterns are a digital problem”

Dark patterns existed long before the internet. The gym contract that requires written notice of cancellation three months before the annual renewal. The insurance policy renewal that arrives in an envelope designed to look like unimportant correspondence. The “sale” price on a product whose “original price” was never actually charged. Digital design simply made dark patterns faster to deploy, harder to avoid and easier to A/B test at scale.

Misconception 3: “Users who fall for dark patterns have themselves to blame”

This is the most dangerous misconception. It assumes that if users were simply more attentive or more sophisticated, dark patterns would not work. But dark patterns are specifically designed to exploit the cognitive limitations that all humans have under conditions of time pressure, information overload and trust. They work precisely because the human cognitive system they target is universal, not exceptional. Blaming the victim of a dark pattern for being human is not a useful framework for design or policy.

How to identify and avoid dark patterns in your own work

  1. The disclosure test. For every interface decision, ask: if I showed this to a user and explained exactly what I did and why, would they feel helped or deceived? Design only what you could disclose without shame.
  2. The symmetry test. If signing up takes three clicks, cancelling should take three clicks. If opting in is a single checkbox, opting out should be a single checkbox. Asymmetric effort between joining and leaving is the signature of a dark pattern (see the sketch after this list).
  3. Audit your cancellation flows. They reveal more about an organisation’s values than almost any other design decision. How long does it take to cancel? How many steps? How much friction? This is where dark patterns are most concentrated and most consequential.
  4. Read your cookie consent implementation. Is the “Accept all” button the same size and colour as “Reject all”? Is rejection achievable in a single click? If not, you have a regulatory risk as much as an ethical one.
  5. Track churn and complaints alongside conversion. Dark patterns improve short-term conversion metrics and worsen long-term retention and satisfaction metrics. If your optimisation process only looks at conversion, you are creating incentives for dark patterns without knowing it.
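As a sketch of what the symmetry test from point 2 could look like in practice, the snippet below compares the number of steps in each opt-in flow with its opt-out counterpart and flags the asymmetric pairs. The flow names, step counts and threshold are illustrative assumptions, not a standard audit tool.

```typescript
// Hedged sketch of a symmetry audit: compare opposite flows and flag
// cases where leaving takes more effort than joining.
// Flow names and numbers below are illustrative assumptions.

interface Flow {
  name: string;  // e.g. "subscribe" or "cancel"
  steps: number; // user actions required to complete the flow
}

interface SymmetryFinding {
  forward: Flow;   // the opt-in flow
  reverse: Flow;   // the matching opt-out flow
  ratio: number;   // reverse.steps / forward.steps
  flagged: boolean;
}

// A ratio above the threshold means the reverse flow is harder than the
// forward flow; a strict reading of the symmetry test sets the threshold to 1.
function auditSymmetry(pairs: Array<[Flow, Flow]>, threshold = 1): SymmetryFinding[] {
  return pairs.map(([forward, reverse]) => {
    const ratio = reverse.steps / forward.steps;
    return { forward, reverse, ratio, flagged: ratio > threshold };
  });
}

// Example: a one-click subscription paired with a five-screen cancellation.
const findings = auditSymmetry([
  [{ name: "subscribe", steps: 1 }, { name: "cancel", steps: 5 }],
  [{ name: "accept cookies", steps: 1 }, { name: "reject cookies", steps: 1 }],
]);

for (const f of findings) {
  const note = f.flagged ? " -- asymmetric, review for a dark pattern" : "";
  console.log(`${f.forward.name} vs ${f.reverse.name}: ratio ${f.ratio.toFixed(1)}${note}`);
}
```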

Frequently asked questions about dark patterns

What are dark patterns in UX?

Dark patterns are user interface design choices that trick or manipulate users into making decisions they did not intend to make -- signing up for subscriptions they did not want, sharing data they did not mean to share, or buying extras they did not choose. The term was coined by UX researcher Harry Brignull in 2010 to name and catalogue these practices.

Who invented the term dark patterns?

British UX researcher Harry Brignull coined the term in 2010 when he launched a website cataloguing examples of deceptive design and naming the most common types. His taxonomy -- roach motel, confirmshaming, trick questions, hidden costs, and others -- has since been adopted by regulators, academics and consumer protection organisations worldwide.

What is the difference between a nudge and a dark pattern?

Both use identical psychological mechanisms. The difference is whose interests they serve. A nudge helps people do what they genuinely want to do; a dark pattern steers them towards choices that benefit the designer at the user’s expense. The test: would the person thank you for the design if they understood it fully? If yes, nudge. If no, dark pattern.

Are dark patterns illegal?

Increasingly, yes. The EU’s Digital Services Act explicitly prohibits dark patterns on online platforms, and GDPR’s consent requirements are routinely enforced against deceptive cookie consent flows. Regulators in the US, UK and EU have issued fines and enforcement actions, including against obstructive subscription cancellation. Google and Meta have paid hundreds of millions of euros in fines related to deceptive cookie consent design. The legal landscape is tightening rapidly.

What are the most common types of dark pattern?

The most common types are: roach motel (easy to sign up, nearly impossible to cancel), forced continuity (automatic subscription renewal without adequate notice), confirmshaming (opt-out buttons worded as self-insults), misdirection in cookie consent (Accept prominent, Reject buried), hidden costs (fees added at the last checkout step), and trick questions (confusingly worded opt-in checkboxes).

Conclusion

Dark patterns are not a niche UX problem. They are a systemic consequence of optimising for short-term metrics without accounting for long-term trust, retention or the genuine interests of the people being served. They are, in the most precise sense, behavioural design deployed against users rather than for them.

The same expertise that makes it possible to design dark patterns makes it possible to design their opposite: interfaces that genuinely serve users, build trust, create real loyalty and produce sustainable business outcomes. That is the direction I believe the discipline of behavioural design should move in -- and the direction that regulation is increasingly enforcing.

If you want to learn how to use behavioural science ethically and effectively in your own work, the Behavioural Design Fundamentals Course gives you a structured method for doing exactly that. Rated 9.7 out of 10 by more than 10,000 professionals from 45 countries.

PS

At SUE, our mission is to use the superpower of behavioural science to help people make better choices -- for themselves and for society. That mission has a necessary counterpart: a commitment not to use that knowledge to exploit the very cognitive vulnerabilities it describes. The same understanding that tells us defaults are powerful tells us that using them against users is wrong. If your organisation’s growth depends on people being confused, that is not a growth strategy. It is a liability waiting to surface.