
Dark Patterns: Past, Present, and Future

The evolution of tricky user interfaces.

Dark patterns are user interfaces that benefit an online service by leading users into making decisions they might not otherwise make. Some dark patterns deceive users while others covertly manipulate or coerce them into choices that are not in their best interests. A few egregious examples have led to public backlash recently: TurboTax hid its U.S. government-mandated free tax-file program for low-income users on its website to get them to use its paid program;9 Facebook asked users to enter phone numbers for two-factor authentication but then used those numbers to serve targeted ads;31 Match.com knowingly let scammers generate fake messages of interest in its online dating app to get users to sign up for its paid service.13 Many dark patterns have been adopted on a large scale across the Web. Figure 1 shows a deceptive countdown timer dark pattern on JustFab. The advertised offer remains valid even after the timer expires. This pattern is a common tactic—a recent study found such deceptive countdown timers on 140 shopping websites.20

Figure 1. A deceptive countdown timer on JustFab.

The research community has taken note. Recent efforts have catalogued dozens of problematic patterns such as nagging the user, obstructing the flow of a task, and setting privacy-intrusive defaults,1,18 building on an early effort by Harry Brignull (darkpatterns.org). Researchers have also explained how dark patterns operate by exploiting cognitive biases,4,20,33 uncovered dark patterns on more than 1,200 shopping websites,20 shown that more than 95% of popular Android apps contain dark patterns,8 and provided preliminary evidence that dark patterns are indeed effective at manipulating user behavior.19,30

Although they have only recently burst into mainstream awareness, dark patterns are the product of three trends, each decades in the making: one from the world of retail (deceptive practices), one from research and public policy (nudging), and a third from the design community (growth hacking).

Figure 2 illustrates how dark patterns stand at the confluence of these three trends. Understanding these trends, and how they have collided with each other, helps us appreciate what is actually new about dark patterns, demystifies their surprising effectiveness, and shows why they will be difficult to combat. We end this article with recommendations for ethically minded designers.

Figure 2. The origins of dark patterns.


Deception and Manipulation in Retail

The retail industry has a long history of deceptive and manipulative practices that range on a spectrum from normalized to unlawful (Figure 3). Some of these techniques, such as psychological pricing (that is, making the price slightly less than a round number), have become normalized. This is perfectly legal, and consumers have begrudgingly accepted it. Nonetheless, it remains effective: consumers underestimate prices when relying on memory if psychological pricing is employed.3

Figure 3. Examples of deceptive and manipulative retail practices.

More problematic are practices such as false claims of store closings, which are unlawful but rarely the target of enforcement actions. At the other extreme are bait-and-switch car ads such as the one by a Ford dealership in Cleveland that was the target of an FTC action.14


The Origins of Nudging

In the 1970s, the heuristics and biases literature in behavioral economics sought to understand irrational decisions and behaviors—for example, people who decide to drive because they perceive air travel as dangerous, even though driving is, in fact, orders of magnitude more dangerous per mile.29 Researchers uncovered a set of cognitive shortcuts used by people that make these irrational behaviors not just explainable but even predictable.

For example, in one experiment, researchers asked participants to write down an essentially random two-digit number (the last two digits of each participant's social security number), then asked if they would pay that number of dollars for a bottle of wine, and finally asked the participants to state the maximum amount they would pay for the bottle.2 They found that willingness to pay varied roughly threefold depending on this arbitrary number. This is the anchoring effect: lacking knowledge of the market value of the bottle of wine, participants' estimates became anchored to the arbitrary reference point. This study makes it easy to see how businesses might nudge customers into paying higher prices by anchoring their expectations to a high number. In general, however, research on psychological biases has not been driven by applications in retail or marketing. That would come later.


Nudging: The Turn to Paternalism

The early behavioral research on this topic focused on understanding rather than intervention. Some scholars, such as Cass Sunstein and Richard Thaler, authors of the book Nudge,28 went further to make a policy argument: Governments, employers, and other benevolent institutions should engineer “choice architectures” in a way that uses behavioral science for the benefit of those whom they serve or employ.

A famous example (Figure 4) is the striking difference in organ-donation consent rates between countries where people have to explicitly provide consent (red bars) versus those where consent is presumed (orange bars). Because most people tend not to change the default option, the latter leads to significantly higher consent rates.17

Figure 4. Organ-donation consent rates by countries.

Today, nudging has been enthusiastically adopted not only by governments and employers but also by businesses in the way they interact with their customers. The towel-reuse message you may have seen in hotel rooms ("75% of guests in this hotel usually use their towels more than once") is effective because it employs a descriptive social norm as a prescriptive rule to get people to change their behavior.16

With the benefit of hindsight, it is clear that neither the proponents nor the critics of nudging anticipated how readily and vigorously businesses would adopt these techniques in adversarial rather than paternalistic ways. In Nudge, Sunstein and Thaler briefly address the question of how to tell whether a nudge is ethical, but the discussion is perfunctory. The authors seem genuinely surprised by recent developments and have distanced themselves from dark patterns, which they label "sludges."27


Growth Hacking

The third trend—and the one that most directly evolved into dark patterns—is growth hacking. The best-known and arguably the earliest growth hack was implemented by Hotmail. When it launched in 1996, the founders first considered traditional marketing methods such as billboard advertising. Instead, they hit upon a viral marketing strategy: The service automatically added the signature, “Get your free email with Hotmail,” to every outgoing email, essentially getting users to advertise on its behalf, resulting in viral growth.21

Successes like these led to the emergence of growth hacking as a distinct community. Growth hackers are trained in design, programming, and marketing and use these skills to drive product adoption.

Growth hacking is not inherently deceptive or manipulative but often is in practice. For example, in two-sided markets such as vacation rentals, upstarts inevitably face a chicken-and-egg problem: no travelers without hosts and no hosts without travelers. So it became a common practice to “seed” such services with listings that were either fake or scraped from a competitor.22,23

Unsurprisingly, growth hacking has sometimes led to legal trouble. A hugely popular growth hack involved obtaining access to users’ contact books—often using deception—and then spamming those contacts with invitations to try a service. The invitations might themselves be deceptive by appearing to originate from the user, when in fact users were unaware of the emails being sent. LinkedIn settled a class action for exactly this practice, which it used from 2011 to 2014.25


From Growth Hacking to Dark Patterns

But why growth rather than revenue or some other goal? It is a reflection of Silicon Valley's growth-first mantra, in which revenue-generating activities are put aside until after market dominance has been achieved. Of course, every service eventually runs into limits on growth, because of either saturation or competition, so growth hackers began adapting their often-manipulative techniques to extracting and maximizing revenue from existing users.

In developing their battery of psychological tricks, growth hackers had two weapons that were not traditionally available in offline retail. The first was that the nudge movement had helped uncover the principles of behavior change. By contrast, the marketing literature that directly studied the impact of psychological tricks on sales was comparatively thin, because it did not get at those foundational principles and was confined to the domain of retail.

The second weapon was A/B testing (Figure 5). By serving variants of Web pages to two or more randomly selected subsets of users, designers began to discover that even seemingly trivial changes to design elements can result in substantial differences in behavior. The idea of data-driven optimization of user interfaces has become deeply ingrained in the design process of many companies. For large online services with millions of users, it is typical to have dozens of A/B tests running in parallel, as noted in 2009 by Douglas Bowman, once a top visual designer at Google:

Figure 5. Hypothetical illustration of A/B testing on a website.

Yes, it’s true that a team at Google couldn’t decide between two blues, so they’re testing 41 shades between each blue to see which one performs better. I had a recent debate over whether a border should be 3, 4, or 5 pixels wide, and was asked to prove my case. I can’t operate in an environment like that. I’ve grown tired of debating such minuscule design decisions. There are more exciting design problems in this world to tackle. —Douglas Bowman

A/B testing proved key to the development of dark patterns because it is far from obvious how to translate an abstract principle like social proof into a concrete nudge (“7 people are looking at this hotel right now!”). Another example: For how long should a fake countdown timer be set (“This deal expires in 15 minutes!” … “14:59” … “14:58” …), so the user acts with urgency but not panic? Online experiments allow designers to find the answers with just a few lines of code.
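To make the mechanics concrete, here is a minimal sketch, in TypeScript, of how such a countdown-timer experiment might be wired up. It is a hypothetical illustration, not any real service's code; the variant set and the functions assignVariant and logEvent are invented for this example.

    // Hypothetical A/B test of a deceptive countdown timer. The offer
    // never actually expires; only the displayed urgency varies.
    type Variant = { id: string; timerMinutes: number };

    const VARIANTS: Variant[] = [
      { id: "control", timerMinutes: 0 },  // no timer shown
      { id: "short",   timerMinutes: 5 },
      { id: "medium",  timerMinutes: 15 },
      { id: "long",    timerMinutes: 60 },
    ];

    // Deterministically bucket a user by hashing the user ID, so the
    // same user always sees the same variant.
    function assignVariant(userId: string): Variant {
      let hash = 0;
      for (const ch of userId) {
        hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
      }
      return VARIANTS[hash % VARIANTS.length];
    }

    // Record an event; a real deployment would send this to an
    // analytics backend rather than the console.
    function logEvent(userId: string, kind: string, variantId: string): void {
      console.log(JSON.stringify({ userId, kind, variantId, ts: Date.now() }));
    }

    function offerBanner(userId: string): string {
      const v = assignVariant(userId);
      logEvent(userId, "exposure", v.id);
      return v.timerMinutes === 0
        ? "Special offer on all items!"
        : `Hurry! This deal expires in ${v.timerMinutes}:00`;
    }

Purchases would be logged through the same channel, and whichever variant yields the highest conversion rate wins; no psychological theory is required.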


Money, Data, Attention

Let’s recap. As the online economy matured, services turned their attention from growth to revenue. They used the principles of behavioral influence but subverted the intent of the researchers who discovered those principles by using them in ways that undermined consumers’ autonomy and informed choice. They used A/B testing to turn behavioral insights into strikingly effective user interfaces. In some cases these were optimized versions of tricks that have long been used in retail, but in other cases they were entirely new.

How, exactly, do dark patterns help maximize a company’s ability to extract revenue from its users? The most obvious way is simply to nudge (or trick) consumers into spending more than they otherwise would.

A less obvious, yet equally pervasive, goal of dark patterns is to invade privacy. For example, cookie consent dialogs almost universally employ manipulative design to increase the likelihood of users consenting to tracking. In fact, a recent paper shows that when asked to opt in, well under 1% of users would provide informed consent.30 Regulations such as the GDPR (General Data Protection Regulation) require companies to get explicit consent for tracking, which poses an existential threat to many companies in the online tracking and advertising industry. In response, they appear to be turning to the wholesale use of dark patterns.30

A third goal of dark patterns is to make services addictive. This goal supports the other two, as users who stay on an app longer will buy more, yield more personal information, and see more ads. Apps like Uber use gamified nudges to keep drivers on the road longer (Figure 6). The needle suggests the driver is extremely close to the goal, but it is an arbitrary goal set by Uber when a driver wants to go offline.24 To summarize, dark patterns enable designers to extract three main resources from users: money, data, and attention.

Figure 6. One of Uber’s gamified nudges to keep drivers on the road.


Dark Patterns Are Here to Stay

Two years ago, few people had heard the term dark patterns. Now it’s everywhere. Does this mean dark patterns are a flash in the pan? Perhaps, as users figure out what’s going on, companies will realize that dark patterns are counterproductive and stop using them. The market could correct itself.

The history sketched here suggests that this optimistic scenario is unlikely. The antecedents of dark patterns are decades old. While public awareness of dark patterns is relatively new, the phenomenon itself has developed gradually; indeed, the darkpatterns.org website was established in 2010.

The history also helps explain what is new about dark patterns. It isn’t just tricky design or deceptive retail practices online. Rather, design has been weaponized using behavioral research to serve the aims of the surveillance economy. This broader context is important. It helps explain why the situation is as bad as it is and suggests that things will get worse before they can get better.

One worrying trend is the emergence of companies that offer dark patterns as a service, enabling websites to adopt them with a few lines of JavaScript.20 Another possible turn for the worse is personalized dark patterns that push each user's specific buttons.26 This has long been predicted5 but remains rare today (manipulative targeted advertising can arguably be viewed as a dark pattern, but ads are not user interfaces). The absence of personalized dark patterns is presumably because companies are still busy picking lower-hanging fruit, but this could change at any time.
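To illustrate how low that "few lines of JavaScript" barrier is, here is a hypothetical sketch of a fake social-proof widget of the kind documented on shopping websites.20 Everything in it, including the function name, is invented for illustration.

    // Hypothetical "social proof as a service" snippet. The activity it
    // reports is fabricated: a random number dressed up as live data.
    function fakeLiveViewers(elementId: string): void {
      const el = document.getElementById(elementId);
      if (el === null) return;
      window.setInterval(() => {
        const viewers = 3 + Math.floor(Math.random() * 12); // pure invention
        el.textContent = `${viewers} people are looking at this right now!`;
      }, 8000); // refresh every few seconds to look "live"
    }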


Recommendations for Designers

Designers should be concerned about the proliferation of dark patterns. They are unethical and reflect badly on the profession. But this article is not a doom-and-gloom story. There are steps you can take, both to hold yourself and your organization to a higher standard, and to push back against the pressure to deploy dark patterns in the industry.

Go beyond superficial A/B testing metrics. Earlier we discussed how designers use A/B tests to optimize dark patterns. But there’s a twist: a design process hyperfocused on A/B testing can result in dark patterns even if that is not the intent. That’s because most A/B tests are based on metrics that are relevant to the company’s bottom line, even if they result in harm to users. As a trivial example, an A/B test might reveal that reducing the size of a “Sponsored” label that identifies a search result as an advertisement causes an increase in the CTR (click-through rate). While a metric such as CTR can be measured instantaneously, it reveals nothing about the long-term effects of the design change. It is possible that users lose trust in the system over time when they realize they are being manipulated into clicking on ads.

In a real example similar to this hypothetical one, Google recently changed its ad labels in a way that made it difficult for users to distinguish ads from organic search results, and presumably increased CTR for ads (Figure 7). A backlash ensued, however, and Google rolled back this interface.32

Figure 7. Google’s recent change to its ad labels.

To avoid falling into this trap, evaluate A/B tests on at least one metric that measures long-term impacts. In addition to measuring the CTR, you could also measure user retention. That will tell you if a different-sized label results in more users abandoning the website.
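As a sketch of what this might look like at analysis time, the following compares each variant on CTR alongside a longer-term signal, seven-day retention. The log format, field names, and numbers are all invented for illustration.

    // Hypothetical per-variant summary that pairs a short-term metric
    // (CTR) with a longer-term one (7-day retention).
    interface VariantStats {
      impressions: number;
      clicks: number;
      usersExposed: number;
      usersReturned7d: number;
    }

    function summarize(stats: Map<string, VariantStats>): void {
      for (const [variant, s] of stats) {
        const ctr = (100 * s.clicks) / s.impressions;
        const retention = (100 * s.usersReturned7d) / s.usersExposed;
        // A variant that wins on CTR but loses on retention is a red
        // flag: the extra clicks may come at the cost of user trust.
        console.log(`${variant}: CTR ${ctr.toFixed(2)}%, ` +
                    `7-day retention ${retention.toFixed(2)}%`);
      }
    }

    // Invented numbers: the smaller label lifts CTR but hurts retention.
    summarize(new Map([
      ["large-label", { impressions: 100000, clicks: 3100, usersExposed: 20000, usersReturned7d: 9200 }],
      ["small-label", { impressions: 100000, clicks: 4400, usersExposed: 20000, usersReturned7d: 8100 }],
    ]));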

Still, many attributes that matter in the long term, such as trust, are not straightforward to observe and measure, especially in the online context. Think critically about the designs you choose to test, and when you find that a certain design performs better, try to understand why.

While the overreliance on A/B testing is a critical issue to be addressed, let’s next turn to a much broader and longer-term concern.

Incorporate ethics into the design process. While dark patterns are a highly visible consequence of the ethical crisis in design, resolving the crisis entails far more than avoiding a simple list of patterns. It requires structural changes to the design process.

Start by articulating the values that matter to you and that will guide your design.15 Not every organization will have an identical set of values, but these values must be broadly aligned with what society considers important.

In fact, much of the present crisis can be traced to a misalignment of values between society and companies. Autonomy and privacy are two values where this misalignment is particularly stark. Consider frictionless design, a bedrock value in the tech industry. Unfortunately, it robs users of precisely those moments of pause that create opportunities for reflection and allow them to resist their baser impulses. Frictionlessness is antithetical to autonomy. Similarly, designing for pleasure and fun is a common design value, but when does fun cross the line into addiction?

Once you have articulated your values, continue to debate them internally. Publicize them externally, seek input from users, and, most importantly, hold yourself accountable to them. Effective accountability is challenging, however. For example, advisory boards established by technology companies have been criticized for not being sufficiently independent.

Everyday design decisions should be guided by these established values. In many cases it is intuitively obvious whether a design choice conforms to a design value, but not always. Fortunately, research has revealed a lot about the factors that make a design pattern dark, such as exploiting known cognitive biases and withholding crucial information.4,20 Stay abreast of this research, evaluate the impact of designs on your users, and engage in critical debate about where to draw the line, based both on the company's values and on your own sense of ethics. Rolling back a change should always be an option if it turns out not to live up to those values.

As you gain experience making these decisions in a particular context, higher-level principles can be codified into design guidelines. There is a long tradition of usability guidelines in the design community. There are also privacy-by-design guidelines, but they are not yet widely adopted.10 There is relatively little in the way of guidelines for respecting user autonomy.

All of this is beyond the scope of what individual designers can usually accomplish; the responsibility for incorporating ethics into the design process rests with organizations. As an individual, you can start by raising awareness within your organization.

Self-regulate or get regulated. Dark patterns are an abuse of the tremendous power that designers hold in their hands. As public awareness of dark patterns grows, so does the potential fallout. Journalists and academics have been scrutinizing dark patterns, and the backlash from these exposés can destroy brand reputations and bring companies under the lens of regulators.

Many dark patterns are already unlawful. In the U.S., the Federal Trade Commission (FTC) Act prohibits "unfair or deceptive" commercial practices.11 In a recent example, the FTC reached a settlement with Unroll.Me—a service that unsubscribed users' email addresses from newsletters and subscriptions—because it was in fact selling information it read from their inboxes to third parties.12 European Union authorities have tended to be stricter: French regulator CNIL (Commission Nationale de l'Informatique et des Libertés) fined Google 50 million euros for hiding important information about privacy and ad personalization behind five to six screens.6

There is also a growing sense that existing regulation is not enough, and new legislative proposals aim to curb dark patterns.7 While policymakers should act—whether by introducing new laws or by broadening and strengthening the enforcement of existing ones—relying on regulation is not sufficient and comes with compliance burdens.

Let’s urge the design community to set standards for itself, both to avoid onerous regulation and because it’s the right thing to do. A first step would be to rectify the misalignment of values between the industry and society, and develop guidelines for ethical design. It may also be valuable to partner with neutral third-party consumer advocacy agencies to develop processes to certify apps that are free of known dark patterns. Self-regulation also requires cultural change. When hiring designers, ask about the ethics of their past work. Similarly, when deciding between jobs, use design ethics as one criterion for evaluating a company and the quality of its work environment.

Design is power. In the past decade, software engineers have had to confront the fact that the power they hold comes with responsibilities to users and to society. In this decade, it is time for designers to learn this lesson as well.

Related articles on queue.acm.org

User Interface Designers, Slaves of Fashion
Jef Raskin
https://queue.acm.org/detail.cfm?id=945161

The Case Against Data Lock-in
Brian W. Fitzpatrick and J.J. Lueck
https://queue.acm.org/detail.cfm?id=1868432

Bitcoin’s Academic Pedigree
Arvind Narayanan and Jeremy Clark
https://queue.acm.org/detail.cfm?id=3136559

    1. Acquisti, A. et al. Nudges for privacy and security: understanding and assisting users' choices online. ACM Computing Surveys 50, 3 (2017), 1–41; https://dl.acm.org/doi/10.1145/3054926.

    2. Ariely, D. Predictably Irrational. Harper Audio, New York, NY, 2008.

    3. Bizer, G.Y. and Schindler, R.M. Direct evidence of ending-digit drop-off in price information processing. Psychology & Marketing 22, 10 (2005), 771–783.

    4. Bösch, C., Erb, B., Kargl, F., Kopp, H. and Pfattheicher, S. Tales from the dark side: privacy dark strategies and privacy dark patterns. Proceedings on Privacy Enhancing Technologies 4 (2016), 237–254.

    5. Calo, R. Digital market manipulation. George Washington Law Review 82, 4 (2014), 995–1051; http://www.gwlr.org/wp-content/uploads/2014/10/Calo_82_41.pdf.

    6. Commission Nationale de l'Informatique et des Libertés. The CNIL's restricted committee imposes a financial penalty of 50 million euros against Google LLC, 2019; https://bit.ly/3dtTcoS.

    7. Fischer, D. United States Senator for Nebraska. Senators introduce bipartisan legislation to ban manipulative dark patterns, 2019; https://bit.ly/3eQO4LT.

    8. Di Geronimo, L., Braz, L., Fregnan, E., Palomba, F. and Bacchelli, A. UI dark patterns and where to find them: a study on mobile applications and user perception. In Proceedings of the 2020 ACM Conference on Human Factors in Computing Systems.

    9. Elliott, J. and Waldron, L. Here's how TurboTax just tricked you into paying to file your taxes. ProPublica (April 22, 2019); https://www.propublica.org/article/turbotax-just-tricked-you-into-paying-to-file-your-taxes.

    10. European Data Protection Board. Guidelines 4/2019 on Article 25, Data Protection by Design and by Default; https://bit.ly/3710o9s.

    11. Federal Trade Commission. A brief overview of the Federal Trade Commission's investigative, law enforcement, and rulemaking authority, 2019; https://www.ftc.gov/about-ftc/what-we-do/enforcement-authority.

    12. Federal Trade Commission. FTC finalizes settlement with company that misled consumers about how it accesses and uses their email, 2019; https://bit.ly/3h0l7PF.

    13. Federal Trade Commission. FTC sues owner of online dating service Match.com for using fake love interest ads to trick consumers into paying for a Match.com subscription. Sept. 25, 2019; https://bit.ly/3dArhUb.

    14. Federal Trade Commission. Ganley Ford, 2013; https://bit.ly/2XxHwft.

    15. Friedman, B., Kahn, P. H., Borning, A. and Huldtgren, A. Value-sensitive design and information systems. Early Engagement and New Technologies: Opening Up the Laboratory. N. Doorn, D. Schuurbiers, I. van de Poel, M.E. Gorman, Eds. Springer, Dordrecht, Germany, 2013, 55–95; https://link.springer.com/book/10.1007/978-94-007-7844-3.

    16. Goldstein, N.J., Cialdini, R.B. and Griskevicius, V. A room with a viewpoint: using social norms to motivate environmental conservation in hotels. J. Consumer Research 35, 3 (2008), 472–482.

    17. Goldstein, D. and Johnson, E.J. Do defaults save lives? Science 302, 5649 (2003), 1338–1339; https://science.sciencemag.org/content/302/5649/1338.

    18. Gray, C.M., Kou, Y., Battles, B., Hoggatt, J., Toombs, A.L. The dark (patterns) side of UX design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (April), 1–14; https://dl.acm.org/doi/10.1145/3173574.3174108.

    19. Luguri, J. and Strahilevitz, L. Shining a light on dark patterns. University of Chicago, Public Law Working Paper. No. 719, 2019.

    20. Mathur, A., Acar, G., Friedman, M.J., Lucherini, E., Mayer, J., Chetty, M., Narayanan, A. Dark patterns at scale: findings from a crawl of 11K shopping websites. In Proceedings of the ACM on Human-Computer Interaction 3 (2019), 1–32; https://dl.acm.org/doi/10.1145/3359183.

    21. McLaughlin, J. 9 iconic growth hacks tech companies used to boost their user bases. The Next Web, 2014; https://bit.ly/2MtX0L1.

    22. Mead, D. How Reddit got huge: tons of fake accounts. Vice, 2012; https://www.vice.com/en_us/article/z4444w/how-reddit-got-huge-tons-of-fake-accounts--2.

    23. Rosoff, M. Airbnb farmed Craigslist to grow its listings, says competitor. Business Insider, 2011; https://bit.ly/2Mv23L7.

    24. Scheiber, N. How Uber uses psychological tricks to push its drivers' buttons. New York Times (Apr. 22, 2017); https://nyti.ms/3h3RuNk.

    25. Strange, A. LinkedIn pays big after class action lawsuit over user emails. Mashable, 2015; https://mashable.com/2015/10/03/linkedin-class-action.

    26. Susser, D., Roessler, B. and Nissenbaum, H. Online manipulation: hidden influences in a digital world. Georgetown Law Technology Review 4.1 (2019), 1–45; https://philarchive.org/archive/SUSOMHv1.

    27. Thaler, R.H. Nudge, not sludge. Science 361 (2018), 431.

    28. Thaler, R.H., Sunstein, C.R. Nudge: Improving Decisions About Health, Wealth, and Happiness. Penguin Books, 2009.

    29. Tversky, A. and Kahneman, D. Judgment under uncertainty: Heuristics and biases. Science 185, 4157 (1974), 1124–1131.

    30. Utz, C., Degeling, M., Fahl, S., Schaub, F. and Holz, T. (Un)informed consent: studying GDPR consent notices in the field. In Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security, 973–990; https://dl.acm.org/doi/10.1145/3319535.3354212.

    31. Venkatadri, G., Lucherini, E., Sapiezynski, P. and Mislove, A. Investigating sources of PII used in Facebook's targeted advertising. In Proceedings on Privacy Enhancing Technologies 1 (2019), 227–244; https://content.sciendo.com/view/journals/popets/2019/1/article-p227.xml?lang=en.

    32. Wakabayashi, D. and Hsu, T. Why Google backtracked on its new search results look. New York Times (Jan. 31, 2020); https://nyti.ms/2XYvg6I.

    33. Waldman, A.E. Cognitive biases, dark patterns, and the 'privacy paradox.' SSRN, 2019.
