Feb 22, 2026 · 9 min read

Dark patterns in apps: what they are and how to spot them

Dark patterns are design tricks that manipulate you into doing things you did not intend. Here is how to recognize them, why they persist, and what responsible alternatives look like.

Dark patterns are user interface designs that deliberately trick, manipulate, or coerce people into actions they did not intend to take. The term was coined by UX researcher Harry Brignull in 2010 to describe deceptive design practices that prioritize business goals over user interests. Fifteen years later, dark patterns are more prevalent than when they were first named.

If you have used a smartphone for any length of time, you have encountered dark patterns. You have been guilt-tripped out of cancelling a subscription. You have been shown a price that turned out to be higher at checkout. You have tapped a button you thought said one thing but did another. You have tried to find the "unsubscribe" option and given up because it was buried behind five layers of menus.

These are not accidents. They are intentional design choices, and understanding how they work is the first step toward not falling for them.

The most common types of dark patterns

Dark patterns come in many forms, but most fall into a handful of well-documented categories. Recognizing these categories makes them much easier to spot in the wild.

Trick questions

The interface uses confusing language, double negatives, or misleading phrasing to get you to select an option you did not mean to choose.

Example: A settings screen asks "Would you like to not opt out of receiving marketing emails?" If you are reading quickly (and most people are), you will get the answer backwards. That is the point. The confusion is the design.

How to spot it: Slow down whenever an app presents you with a question that requires you to read it twice. If the phrasing is confusing, assume the confusing option benefits the company, not you.

Forced continuity

A free trial automatically converts to a paid subscription without clear warning. The trial signup is easy and prominent. The conversion notification, if it exists, is buried in an email you probably did not read. The cancellation process is deliberately more difficult than the signup process.

Example: You sign up for a seven-day free trial with your credit card. On day eight, you are charged the full annual rate. The confirmation email was sent at 3 AM with a subject line that looked like a receipt for the free trial. Cancellation requires calling a phone number that operates during limited hours in a different time zone.

How to spot it: Before starting any free trial, search for the cancellation process first. If it is harder to find than the signup button, that imbalance is intentional.

Confirm-shaming

The option to decline is worded in a way that makes you feel foolish, irresponsible, or morally deficient for saying no.

Example: A popup offers a discount code. The "yes" button says "Yes, I want to save money!" The "no" button says "No thanks, I prefer paying full price." Variations include "No, I don't care about my health," "I'll stay uninformed," or "I don't want to improve."

How to spot it: Look at how the "no" option is worded. If declining makes you feel bad about yourself, the language is doing that on purpose.

Roach motel

The design makes it easy to get into a situation (signing up, subscribing, providing information) but deliberately difficult to get out. Named after the pest traps where insects enter easily and cannot leave.

Example: Creating an account takes thirty seconds and two taps. Deleting that account requires navigating to a settings page that is not in the settings menu, filling out a form explaining why you want to leave, waiting for an email confirmation, clicking a link in that email within 24 hours, and then waiting an additional 30-day "cooling off" period during which the account remains active and continues collecting data.

How to spot it: Before creating any account, search "[service name] delete account" or "[service name] cancel subscription." The difficulty of leaving tells you exactly how the company views your relationship with them.

Hidden costs

The price shown during the browsing and selection phase is not the price you pay at checkout. Additional fees, taxes, service charges, "processing fees," and surcharges appear only at the final step, after you have invested time and mental energy in the purchase process.

Example: A concert ticket shows as $45. At checkout, you discover a $12 service fee, a $5 facility charge, a $3 processing fee, and a $2 email delivery fee. The actual cost is $67, nearly 50% more than advertised. But by the time you see the real price, you have already spent ten minutes selecting seats and entering your information.

How to spot it: If the final price is significantly higher than what was initially displayed, the gap is a dark pattern, not an honest mistake.

Misdirection

The design draws your attention toward one option (the one the company wants you to choose) and away from another (the one you probably want). This is done through visual hierarchy: making the preferred option larger, more colourful, or more prominently placed while making the alternative option small, grey, or positioned where your eye does not naturally go.

Example: A subscription renewal screen shows a large, bright "Renew now" button in the centre of the screen. The "Cancel subscription" text is small, grey, and positioned in the bottom corner, styled to look like a footnote rather than a clickable option.

How to spot it: When an interface presents you with a choice, look for the option the design is trying to hide. If one option is visually prominent and another is deliberately de-emphasized, ask yourself whose interest that visual hierarchy serves.

Privacy zuckering

Named after Mark Zuckerberg, this pattern tricks users into sharing more personal information than they intended by making privacy settings confusing, defaulting to maximum data sharing, or bundling data consent into unrelated actions.

Example: An app asks for permission to access your contacts "to help you find friends." You grant the permission, and the app uploads your entire contact list to its servers, where it is stored indefinitely and used for targeted advertising and "people you may know" suggestions, not just for the friend-finding feature you consented to.

How to spot it: Whenever an app requests a permission, ask yourself whether the stated reason actually requires that level of access. A messaging app needs access to your camera to send photos. A flashlight app does not.

Why dark patterns persist

Dark patterns persist for one reason: they work. Companies that deploy guilt trips, forced continuity, and hidden costs consistently outperform (in the short term) companies that do not. A/B testing reliably shows that manipulative designs increase conversion rates, reduce cancellations, and generate more revenue per user.

The incentive structure is the problem. When a company's success is measured by engagement metrics, conversion rates, and average revenue per user, any design that increases those numbers gets adopted, regardless of whether it respects the user. Dark patterns are the predictable result of optimizing for business metrics without regard for the people using them.

Regulation is catching up, slowly. The European Union's Digital Services Act includes provisions against dark patterns. The US Federal Trade Commission has brought enforcement actions against companies using deceptive design. Canada's proposed Consumer Privacy Protection Act, which died without becoming law, would have added restrictions of its own. But regulatory enforcement moves much slower than product design, and many dark patterns exist in a grey area that is technically legal but clearly manipulative.

How to protect yourself

Beyond recognizing specific dark pattern types, there are general habits that reduce your vulnerability:

Search for the exit before you enter. Before signing up for any service, search for how to cancel or delete your account. The difficulty of leaving is one of the clearest indicators of how a company views its users.

Read the decline option first. When presented with a popup, modal, or choice screen, read the "no" or "skip" option before reading the "yes" option. If the decline option is shaming, hidden, or confusing, the choice architecture is designed against your interests.

Check the total before the last step. Hidden cost patterns rely on you being too invested to back out at checkout. Make a habit of checking final prices early and abandoning purchases where the price inflated significantly.

Review permissions critically. When an app requests a device permission, consider whether the feature you want actually requires that access. If the connection between the permission and the feature is unclear, deny the permission and see if the app works without it.

Use app store privacy labels. Both Apple's App Store and Google Play now require developers to disclose data collection practices. Read these labels before downloading. If an app collects data categories that seem unrelated to its function, that is a signal worth paying attention to.

What responsible design looks like instead

Dark patterns are not the only way to build a profitable app. They are just the easiest way. Responsible design requires more thought, more restraint, and a genuine belief that you can build a sustainable business without manipulating the people who use your products.

At siasola, our products are designed around the opposite of every dark pattern described above.

Cancelling is exactly as easy as subscribing. In Siasola Tinnitus Masking Sounds and Siasola Cycling Beats, you subscribe with a tap, and you cancel with a tap. No phone calls. No retention flows. No waiting periods.

No guilt, no shame, no pressure. If you use the free tier, that is fine. If you choose not to subscribe, the interface does not editorialize about your decision. Your choice is respected, period.

The price is the price. Our subscription pricing is displayed clearly, with no hidden fees, processing charges, or surprise surcharges at checkout. What you see is what you pay.

No unnecessary permissions. Our apps request only the permissions required for core functionality. We do not ask for access to your contacts, your location, your microphone, or your camera unless a specific feature you initiated requires it.

Privacy settings are on by default. You do not need to navigate a maze of settings to protect your information. The default state of every siasola product is private.

This approach is not a competitive disadvantage. Users who feel respected by a product stay longer, recommend it more often, and cost less to support. Responsible design is not just the right thing to do. It is also good business.

The bigger picture

Dark patterns are a symptom of a deeper problem in the technology industry: the widespread belief that user manipulation is an acceptable business practice as long as it is legal and profitable. Changing this requires both informed consumers who can recognize and reject dark patterns, and companies willing to prove that respectful design is financially sustainable.

Every time you choose an app that does not manipulate you over one that does, you are sending a market signal. You are proving that there is demand for technology built on respect rather than exploitation.

That is the signal siasola is building for. And we think more companies will follow, not because manipulation stops working, but because enough users start choosing something better.


Read more about how siasola approaches responsible design: What 'Built for Good' Means at Siasola and Why We Don't Sell Your Data (And Never Will). Explore our products: Tinnitus Masking Sounds, Cycling Beats, and AI automation services.

Justin

Founder of siasola

BSc Computer Science, graduate studies in machine learning / AI, 12 years of music training. Building AI automation and apps for good.