Dark patterns in UX design: How users are manipulated

Digital products are meant to support people, provide orientation and make decisions easier. To achieve this, designers rely on User Experience (UX) design. Good UX design creates digital interfaces that are transparent, understandable and fair. However, not every digital product follows this principle. Many applications contain deliberately implemented design patterns intended to push users towards certain actions – actions they might not agree to under different circumstances. These patterns are known as “dark patterns”. In this article, we explain the most common types of dark patterns, how they can be recognised in digital products, and what designers should consider when creating their own interfaces in order to avoid unintended manipulation.

March 12, 2026

Mechanisms

Dark UX patterns rarely appear as obvious deception. Instead, they usually take the form of seemingly harmless design decisions that subtly steer behaviour. In practice, they appear in recurring forms – from complicated cancellation processes and artificial scarcity to manipulative wording or hidden costs. The following examples illustrate how subtle these mechanisms can be and how strongly they can influence everyday digital decisions.

1) Roach Motel

A classic example is the so-called Roach Motel pattern. Signing up for a service is quick and simple, often requiring only a single click. Cancelling the service, however, takes considerable effort as the option may be hidden in complex menus or require contacting a hotline. Users can easily enter the system but struggle to leave it. A similar mechanism can be found in hidden charges during the checkout process. What initially appears to be an inexpensive offer becomes more expensive in the final step through fees that were not previously visible.

2) Fake urgency

In e-commerce, fake scarcity is often used to accelerate purchasing decisions. Techniques such as countdown timers, flash sales or notifications about other interested buyers create fear of missing out and increase the perceived value of a product. The problem is that these indicators are rarely true. Instead, they are made-up signals designed to put users under time pressure and encourage purchases.
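How such a fake countdown works behind the scenes can be made concrete with a minimal Python sketch (all names are illustrative, not taken from any real shop system): an honest timer counts down to a real deadline, while the deceptive variant simply shows every visitor the same "almost over" window, no matter when they arrive.

```python
from datetime import datetime, timedelta

def honest_seconds_left(deadline: datetime, now: datetime) -> int:
    """A real countdown: the offer genuinely ends at `deadline`."""
    return max(0, int((deadline - now).total_seconds()))

def fake_seconds_left(now: datetime) -> int:
    """A deceptive countdown: every visitor sees ~10 minutes left,
    regardless of when they open the page. Reloading resets it."""
    return int(timedelta(minutes=10).total_seconds())

# Two visitors arriving a day apart see the same "urgent" timer:
visitor_a = datetime(2026, 3, 12, 9, 0)
visitor_b = visitor_a + timedelta(days=1)
print(fake_seconds_left(visitor_a), fake_seconds_left(visitor_b))  # 600 600

# The honest timer, by contrast, reflects the actual deadline:
deadline = datetime(2026, 3, 12, 12, 0)
print(honest_seconds_left(deadline, visitor_a))  # 10800 (3 hours)
print(honest_seconds_left(deadline, visitor_b))  # 0 (offer is over)
```

The tell-tale sign is that the "deadline" is computed from the visitor's arrival time rather than from any real event, which is exactly why such timers reset on every page load.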

Example of artificial scarcity – a dark UX pattern
Left: Fake scarcity in an online shop, Right: Time pressure with discounts. Source: https://www.deceptive.design/types


Want advice on dark patterns?

Book a non-binding consultation now – but hurry, you only have 3 days left!! 5 people have already booked today – only 2 slots remain!!

Of course, that is not actually the case. If you are interested in learning more about dark patterns and how to avoid them, just book a free consultation with us whenever it suits you.


3) Hidden costs

Hidden costs are among the most common dark patterns in e-commerce. A product is initially presented at a particularly attractive price, while additional fees only appear in the final stage of the ordering process. Shipping costs, service charges or automatically selected add-ons may only become visible shortly before completion. At this point, users have already invested time, entered their details and mentally committed to the purchase. Many therefore continue with the transaction even though the final price is significantly higher than expected. This pattern exploits the sunk cost effect and leads to decisions that might be different under transparent conditions.

A related pattern is “basket sneaking”, where products are added to the shopping cart without the user’s explicit consent and are sometimes difficult to remove.

4) Confirmshaming

Dark patterns can also rely on emotional pressure. In confirmshaming, users are made to feel guilty if they decline an option. Instead of a neutral choice, users may see wording such as, “No, I do not want to save money.” or, “I prefer to stay uninformed.” These messages are not designed to inform but to influence.

5) Trick questions

Another common pattern involves confusing or double-negative wording. Checkboxes such as, “I do not wish to not receive emails” – especially when the box must be checked to decline – force users to spend additional time working out the meaning or to accept the default setting out of uncertainty. Another example is a question like, “Would you like to decline your free offer?” The preferred answer to such confusing questions is often visually highlighted. This falls under another dark pattern known as interface interference. It becomes particularly problematic when button labels deliberately do not match the actions behind them, resulting in users granting consent without fully understanding what they are agreeing to.

6) Forced continuity

Forced continuity describes a pattern in which an initially free or discounted trial period automatically turns into a paid subscription – often without a clear reminder or with a deliberately complicated cancellation process. While joining the service may take only a few clicks, leaving it can be complex, time-consuming or unclear. Users rely on being informed in time or being able to cancel easily. This trust is then exploited. The result is unwanted contract extensions that arise not from an active decision but from inertia or lack of transparency.

Why do these mechanisms work?

Dark patterns deliberately target psychological processes that influence human decision-making. People rarely act entirely rationally. Instead, they rely on mental shortcuts known as heuristics. Time pressure encourages impulsive decisions. Social signals such as “Others also bought this product” activate herd behaviour. Artificial scarcity increases the perceived value of an offer. Dark patterns exploit several cognitive biases:

  • Default bias: People tend to stick with pre-selected options. If seven cookie categories are pre-selected, users are more likely to accept them all than to deselect each one individually.
  • Inertia: The easier path tends to win. If accepting something requires one click but rejecting it requires five, most users will not make the extra effort.
  • Loss aversion: Messages such as “You may lose certain features” create anxiety and encourage users to agree.

These tactics do not persuade users of the value of a decision – they simply push them to comply.
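The interplay of default bias and inertia can be sketched in a few lines of Python (a purely illustrative model, not any real consent library): with every cookie category pre-checked, accepting costs a single click, while declining costs one click per optional category plus a confirmation.

```python
# Illustrative sketch of click asymmetry in a cookie banner.
# Category names and click counts are hypothetical.

COOKIE_CATEGORIES = [
    "necessary", "preferences", "statistics",
    "marketing", "personalisation", "social_media", "third_party",
]

def accept_all(defaults: dict) -> tuple[dict, int]:
    """One click: the pre-checked defaults are taken as-is."""
    return dict(defaults), 1

def decline_optional(defaults: dict) -> tuple[dict, int]:
    """Declining means unticking every optional box individually,
    then confirming – one click per pre-checked optional category."""
    choices = dict(defaults)
    clicks = 0
    for category in choices:
        if category != "necessary" and choices[category]:
            choices[category] = False
            clicks += 1
    return choices, clicks + 1  # plus one click to confirm

# Dark-pattern setup: every category is pre-checked (default bias).
defaults = {c: True for c in COOKIE_CATEGORIES}

_, accept_clicks = accept_all(defaults)
declined, decline_clicks = decline_optional(defaults)
print(accept_clicks, decline_clicks)  # 1 vs 7: inertia favours "accept"
```

A fair banner would make both paths symmetric, for example by offering a one-click “reject all optional” button next to “accept all”.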

Why companies still use dark patterns

Despite the obvious problems, some companies continue to rely on dark patterns. This is often not due to individual designers but to structural factors. When team performance is measured primarily through short-term metrics such as conversion rates or registration numbers, pressure arises to deliver immediate results. Manipulative patterns can help achieve this. Long-term effects such as declining trust or a damaged brand reputation are often underestimated or only become visible later.

The consequences: trust is more easily lost than gained

Users eventually realise when they have been manipulated. Perhaps not immediately, but certainly when they receive an unexpected invoice or struggle through a complicated process. The result is frustration, declining trust and, in the worst case, a permanent loss of customer loyalty. Negative experiences also spread quickly through reviews, social media and personal recommendations.

New risks through AI-supported design

With the increasing use of generative AI in UX design, the dynamics of interface creation are changing. AI-based tools can generate complete user interfaces based on only a few prompts – often optimised for metrics such as conversion or engagement. If such systems are trained on existing performance-driven designs, there is a risk that manipulative patterns will not only be reproduced but scaled. At the same time, a responsibility gap can emerge: decisions are increasingly prepared algorithmically, while the ethical evaluation of those suggestions may not always be considered. AI is therefore not a neutral design tool. Depending on its objectives, it can either accelerate dark patterns or help create more transparent and fair interfaces.

Ethical UX instead of manipulative tricks

Ethical UX, sometimes referred to as trust-centred design, describes an approach in which not only economic goals but also the interests, rights and well-being of users are placed at the centre of the design process. Rather than forcing decisions through pressure, deception or artificial scarcity, this approach emphasises transparency, genuine choice and clear communication. Users should be able to make informed and voluntary decisions without manipulation. This does not mean abandoning business objectives. Instead, it means achieving them through fair and understandable user journeys. Decisions should be clearly formulated, costs communicated transparently and exiting a service should be just as easy as joining it. Privacy options should not be hidden but openly presented. The goal is to build long-term trust and stable customer relationships instead of improving short-term metrics through manipulative design patterns.

Left: Checkout module with dark patterns, Right: Checkout module without dark patterns

Ethical perspectives in academic literature

Academic literature evaluates dark patterns from several ethical perspectives, with particular focus on the autonomy of users.
Deontological approaches argue that manipulative design restricts freedom of choice and reduces individuals to mere instruments for economic goals. The key issue is not the outcome but the nature of the influence.

Consequentialist approaches, such as utilitarianism, assess dark patterns based on their effects. Short-term economic benefits are weighed against long-term harm such as loss of trust or market distortion.

Responsibility-based ethics focus on the actors who possess design power – designers, companies and regulators – and emphasise the need to consider the long-term societal consequences of digital decision architectures.

The following illustration clarifies this connection using the example of user autonomy and shows how dark patterns can impair key dimensions such as independence, control and capacity to act.

Source: Dark Patterns and User Autonomy - An Overview of Ethical Considerations (Ahuja & Kumar, 2022, p.13)

Regulatory approaches

Within the European Union, dark patterns are increasingly addressed through regulation. One important instrument is the Digital Services Act (DSA), which explicitly prohibits online platforms from designing or operating their interfaces in ways that deceive or manipulate users or significantly impair their ability to make free and informed decisions (Article 25).

The General Data Protection Regulation (GDPR) also applies, as consent – for example for tracking or cookies – must be freely given, informed and unambiguous. The European Data Protection Board (EDPB) has published guidelines explaining how deceptive design patterns can be identified and avoided. Consumer protection law also plays a role. The Unfair Commercial Practices Directive (UCPD) targets misleading or aggressive business practices in the B2C sector, even though the term “dark patterns” is not explicitly used.

In practice, the topic is gaining importance as authorities and consumer organisations increasingly take action against manipulative design mechanisms such as artificial scarcity, pressure tactics or complicated opt-out processes.

Conclusion and self-check

Dark patterns are not a sign of particularly creative design but rather a symptom of short-term thinking. While they may temporarily improve certain metrics, in the long run they damage trust, brand value and legal certainty. A truly good user experience – and the sustainable success of digital products – is built not on manipulation but on transparency, fairness and a genuine focus on users’ needs. Companies that understand this not only create better interfaces but also build more reliable relationships with their customers.

A short self-check for your digital service or product

Companies can critically review their UX with a few simple questions:

  • Are all important options equally visible and easy to choose, or is one option visually favoured?
  • Is leaving the service (e.g. cancelling or opting out) just as easy as joining?
  • Are all costs, conditions and durations communicated early and transparently?
  • Is artificial time pressure or questionable scarcity being used?
  • Would a user make the same decision if all information were presented clearly and neutrally?

If several of these questions raise doubts, a closer look is worthwhile. Dark patterns are often not created intentionally but emerge from optimisation pressure and routine practices. For this reason, regular reflection is an important step towards fair and responsible UX.

So you now know about dark patterns – what next?

Dark patterns appear more frequently than one might expect. However, something can be done about them. Both designers and providers of digital products can ensure that interfaces are implemented in an ethically responsible way. Even people who do not work directly with digital products can raise awareness in their environment and help spread knowledge so that fewer users are exposed to manipulative design.

Would you like to review your digital product for dark patterns or receive expert advice on the topic? Contact us without obligation or book a free initial consultation. We look forward to getting to know you and your product.
