Dark patterns are user interfaces that benefit an online service by leading users into making decisions they might not otherwise make. Some dark patterns deceive users while others covertly manipulate or coerce them into choices that are not in their best interests. A few egregious examples have led to public backlash recently: TurboTax hid its U.S. government-mandated free tax-file program for low-income users on its website to get them to use its paid program;9 Facebook asked users to enter phone numbers for two-factor authentication but then used those numbers to serve targeted ads;31 Match.com knowingly let scammers generate fake messages of interest in its online dating app to get users to sign up for its paid service.13 Many dark patterns have been adopted on a large scale across the Web. Figure 1 shows a deceptive countdown timer dark pattern on JustFab. The advertised offer remains valid even after the timer expires. This pattern is a common tactic—a recent study found such deceptive countdown timers on 140 shopping websites.20
Figure 1. A deceptive countdown timer on JustFab.
The research community has taken note. Recent efforts have catalogued dozens of problematic patterns such as nagging the user, obstructing the flow of a task, and setting privacy-intrusive defaults,1,18 building on an early effort by Harry Brignull (darkpatterns.org). Researchers have also explained how dark patterns operate by exploiting cognitive biases,4,20,33 uncovered dark patterns on more than 1,200 shopping websites,20 shown that more than 95% of popular Android apps contain dark patterns,8 and provided preliminary evidence that dark patterns are indeed effective at manipulating user behavior.19,30
Although they have recently burst into mainstream awareness, dark patterns are the result of three decades-long trends: one from the world of retail (deceptive practices), one from research and public policy (nudging), and the third from the design community (growth hacking).
Figure 2 illustrates how dark patterns stand at the confluence of these three trends. Understanding these trends—and how they have collided with each other—helps us appreciate what is actually new about dark patterns, demystifies their surprising effectiveness, and shows why it will be difficult to combat them. We end this article with recommendations for ethically minded designers.
Figure 2. The origins of dark patterns.
Deception and Manipulation in Retail
The retail industry has a long history of deceptive and manipulative practices that fall on a spectrum from normalized to unlawful (Figure 3). Some of these techniques, such as psychological pricing (that is, making the price slightly less than a round number), have become normalized. Psychological pricing is perfectly legal, and consumers have begrudgingly accepted it. Nonetheless, it remains effective: consumers who recall prices from memory underestimate them when psychological pricing is employed.3
Figure 3. Examples of deceptive and manipulative retail practices.
More problematic are practices such as false claims of store closings, which are unlawful but rarely the target of enforcement actions. At the other extreme are bait-and-switch car ads such as the one by a Ford dealership in Cleveland that was the target of an FTC action.14
The Origins of Nudging
In the 1970s, the heuristics and biases literature in behavioral economics sought to understand irrational decisions and behaviors—for example, people who decide to drive because they perceive air travel as dangerous, even though driving is, in fact, orders of magnitude more dangerous per mile.29 Researchers uncovered a set of cognitive shortcuts that make these irrational behaviors not just explainable but predictable.
For example, in one experiment, researchers asked participants to write down an essentially random two-digit number (the last two digits of each participant’s social security number), then asked if they would pay that number of dollars for a bottle of wine, and finally asked the participants to state the maximum amount they would pay for the bottle.2 They found that willingness to pay varied approximately threefold based on the arbitrary number. This is the anchoring effect: lacking knowledge of the market value of the bottle of wine, participants’ estimates become anchored to the arbitrary reference point. This study makes it easy to see how businesses might be able to nudge customers to pay higher prices by anchoring their expectations to a high number. In general, however, research on psychological biases has not been driven by applications in retail or marketing. That would come later.
Nudging: The Turn to Paternalism
The early behavioral research on this topic focused on understanding rather than intervention. Some scholars, such as Cass Sunstein and Richard Thaler, authors of the book Nudge,28 went further to make a policy argument: Governments, employers, and other benevolent institutions should engineer “choice architectures” in a way that uses behavioral science for the benefit of those whom they serve or employ.
A famous example (Figure 4) is the striking difference in organ-donation consent rates between countries where people have to explicitly provide consent (red bars) versus those where consent is presumed (orange bars). Because most people tend not to change the default option, the latter leads to significantly higher consent rates.17
Figure 4. Organ-donation consent rates by countries.
Today, nudging has been enthusiastically adopted not only by governments and employers but also by businesses in the way they interact with their customers. The towel-reuse message you may have seen in hotel rooms (“75% of guests in this hotel usually use their towels more than once”) is effective because it turns a descriptive social norm into a prescriptive rule to get people to change their behavior.16
With the benefit of hindsight, neither the proponents nor the critics of nudging anticipated how readily and vigorously businesses would adopt these techniques in adversarial rather than paternalistic ways. In Nudge, Sunstein and Thaler briefly address the question of how to tell if a nudge is ethical, but the discussion is perfunctory. The authors seem genuinely surprised by recent developments and have distanced themselves from dark patterns, which they label “sludges.”27
Growth Hacking
The third trend—and the one that most directly evolved into dark patterns—is growth hacking. The best-known and arguably the earliest growth hack was implemented by Hotmail. When the service launched in 1996, its founders first considered traditional marketing methods such as billboard advertising. Instead, they hit upon a viral marketing strategy: The service automatically appended the signature, “Get your free email with Hotmail,” to every outgoing email, essentially getting users to advertise on its behalf, resulting in viral growth.21
Successes like these led to the emergence of growth hacking as a distinct community. Growth hackers are trained in design, programming, and marketing and use these skills to drive product adoption.
Growth hacking is not inherently deceptive or manipulative but often is in practice. For example, in two-sided markets such as vacation rentals, upstarts inevitably face a chicken-and-egg problem: no travelers without hosts and no hosts without travelers. So it became a common practice to “seed” such services with listings that were either fake or scraped from a competitor.22,23
Unsurprisingly, growth hacking has sometimes led to legal trouble. A hugely popular growth hack involved obtaining access to users’ contact books—often using deception—and then spamming those contacts with invitations to try a service. The invitations were often themselves deceptive, appearing to originate from the user even though the user was unaware they were being sent. LinkedIn settled a class action for exactly this practice, which it used from 2011 to 2014.25
From Growth Hacking to Dark Patterns
But why growth rather than revenue or some other goal? It is a reflection of Silicon Valley’s growth-first mantra, in which revenue-generating activities are put aside until after market dominance has been achieved. Of course, eventually every service runs into limits on growth, because of either saturation or competition, so growth hackers began to adapt their often-manipulative techniques to extracting and maximizing revenue from existing users.
In developing their battery of psychological tricks, growth hackers had two weapons that were not traditionally available in offline retail. The first was that the nudge movement had helped uncover the principles of behavior change. The marketing literature that directly studied the impact of psychological tricks on sales was, by contrast, of limited use: it didn’t get at the foundational principles and was confined to the domain of retail.
The second weapon was A/B testing (Figure 5). By serving variants of Web pages to two or more randomly selected subsets of users, designers began to discover that even seemingly trivial changes to design elements can result in substantial differences in behavior. The idea of data-driven optimization of user interfaces has become deeply ingrained in the design process of many companies. For large online services with millions of users, it is typical to have dozens of A/B tests running in parallel, as noted in 2009 by Douglas Bowman, once a top visual designer at Google:
Figure 5. Hypothetical illustration of A/B testing on a website.
Yes, it’s true that a team at Google couldn’t decide between two blues, so they’re testing 41 shades between each blue to see which one performs better. I had a recent debate over whether a border should be 3, 4, or 5 pixels wide, and was asked to prove my case. I can’t operate in an environment like that. I’ve grown tired of debating such minuscule design decisions. There are more exciting design problems in this world to tackle. —Douglas Bowman
A/B testing proved key to the development of dark patterns because it is far from obvious how to translate an abstract principle like social proof into a concrete nudge (“7 people are looking at this hotel right now!”). Another example: For how long should a fake countdown timer be set (“This deal expires in 15 minutes!” … “14:59” … “14:58” …), so the user acts with urgency but not panic? Online experiments allow designers to find the answers with just a few lines of code.
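To see just how little code such an experiment takes, here is a minimal sketch in TypeScript of an A/B test over countdown-timer durations. The variant values, bucketing scheme, and logging are hypothetical, not drawn from any real service.

```typescript
// Minimal sketch of an A/B test over countdown-timer durations.
// The variants, hash function, and logging are hypothetical.

const VARIANTS = [5 * 60, 15 * 60, 30 * 60]; // candidate durations in seconds

// Deterministically bucket each user so they always see the same variant.
function assignVariant(userId: string): number {
  let hash = 0;
  for (const ch of userId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple unsigned string hash
  }
  return VARIANTS[hash % VARIANTS.length];
}

// Record the outcome for later comparison across variants.
function logOutcome(userId: string, durationSec: number, purchased: boolean): void {
  // In practice this would be sent to an analytics service.
  console.log(JSON.stringify({ userId, durationSec, purchased }));
}

const userId = "user-42";
const duration = assignVariant(userId);
console.log(`This deal expires in ${Math.floor(duration / 60)} minutes!`);
logOutcome(userId, duration, false); // this user did not purchase
```

Comparing purchase rates across the buckets then tells the designer which duration “performs” best, completing the optimization loop that turns an abstract nudge into a tuned dark pattern.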
Money, Data, Attention
Let’s recap. As the online economy matured, services turned their attention from growth to revenue. They used the principles of behavioral influence but subverted the intent of the researchers who discovered those principles by using them in ways that undermined consumers’ autonomy and informed choice. They used A/B testing to turn behavioral insights into strikingly effective user interfaces. In some cases these were optimized versions of tricks that have long been used in retail, but in other cases they were entirely new.
How, exactly, do dark patterns help maximize a company’s ability to extract revenue from its users? The most obvious way is simply to nudge (or trick) consumers into spending more than they otherwise would.
A less obvious, yet equally pervasive, goal of dark patterns is to invade privacy. For example, cookie consent dialogs almost universally employ manipulative design to increase the likelihood of users consenting to tracking. In fact, a recent paper shows that when asked to opt in, well under 1% of users would provide informed consent.30 Regulations such as the GDPR (General Data Protection Regulation) require companies to get explicit consent for tracking, which poses an existential threat to many companies in the online tracking and advertising industry. In response, they appear to be turning to the wholesale use of dark patterns.30
A third goal of dark patterns is to make services addictive. This goal supports the other two, as users who stay on an app longer will buy more, yield more personal information, and see more ads. Apps like Uber use gamified nudges to keep drivers on the road longer (Figure 6). The needle suggests the driver is extremely close to the goal, but it is an arbitrary goal set by Uber when a driver wants to go offline.24 To summarize, dark patterns enable designers to extract three main resources from users: money, data, and attention.
Figure 6. One of Uber’s gamified nudges to keep drivers on the road.
Dark Patterns Are Here to Stay
Two years ago, few people had heard the term dark patterns. Now it’s everywhere. Does this mean dark patterns are a flash in the pan? Perhaps, as users figure out what’s going on, companies will realize that dark patterns are counterproductive and stop using them. The market could correct itself.
The history sketched here suggests that this optimistic view is misplaced. The antecedents of dark patterns are decades old. While public awareness of dark patterns is relatively new, the phenomenon itself developed gradually. In fact, the darkpatterns.org website was established in 2010.
The history also helps explain what is new about dark patterns. It isn’t just tricky design or deceptive retail practices online. Rather, design has been weaponized using behavioral research to serve the aims of the surveillance economy. This broader context is important. It helps explain why the situation is as bad as it is and suggests that things will get worse before they can get better.
One worrying trend is the emergence of companies that offer dark patterns as a service, enabling websites to adopt them with a few lines of JavaScript.20 Another possible turn for the worse is personalized dark patterns that push each user’s specific buttons.26 This has long been predicted5 but remains rare today (manipulative targeted advertising can arguably be viewed as a dark pattern, but ads are not user interfaces). Personalized dark patterns are presumably absent because companies are busy picking lower-hanging fruit, but this could change at any time.
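To illustrate how cheaply the dark-patterns-as-a-service model works, here is an invented sketch (not any real vendor’s code) of the kind of third-party script described above: it fabricates a social-proof message out of thin air.

```typescript
// Hypothetical sketch of a "dark patterns as a service" widget: a third-party
// script that injects a fabricated social-proof banner into the host page.
// The message and the random "viewer count" are invented for illustration.

function injectFakeSocialProof(): void {
  const fakeViewers = 3 + Math.floor(Math.random() * 12); // random, not measured
  const banner = document.createElement("div");
  banner.textContent = `${fakeViewers} people are looking at this item right now!`;
  document.body.prepend(banner);
}

injectFakeSocialProof();
```

Note that the “viewer count” here is just a random number, precisely the kind of deception researchers documented in deployed third-party widgets.20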
Recommendations for Designers
Designers should be concerned about the proliferation of dark patterns. They are unethical and reflect badly on the profession. But this article is not a doom-and-gloom story. There are steps you can take, both to hold yourself and your organization to a higher standard, and to push back against the pressure to deploy dark patterns in the industry.
Go beyond superficial A/B testing metrics. Earlier we discussed how designers use A/B tests to optimize dark patterns. But there’s a twist: a design process hyperfocused on A/B testing can produce dark patterns even when that is not the intent. That’s because most A/B tests optimize metrics tied to the company’s bottom line, even when improving those metrics harms users. As a trivial example, an A/B test might reveal that reducing the size of a “Sponsored” label that identifies a search result as an advertisement causes an increase in the CTR (click-through rate). While a metric such as CTR can be measured instantaneously, it reveals nothing about the long-term effects of the design change. Users may lose trust in the system over time when they realize they are being manipulated into clicking on ads.
In a real example similar to this hypothetical one, Google recently changed its ad labels in a way that made it difficult for users to distinguish ads from organic search results, and presumably increased CTR for ads (Figure 7). A backlash ensued, however, and Google rolled back this interface.32
Figure 7. Google’s recent change to its ad labels.
To avoid falling into this trap, evaluate A/B tests on at least one metric that measures long-term impacts. In addition to measuring CTR, you could also measure user retention, which will tell you whether a different-sized label results in more users abandoning the website.
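As a minimal sketch of what that dual-metric evaluation might look like (the event fields and the 30-day retention window are invented for illustration):

```typescript
// Sketch: score an A/B test on a short-term metric (CTR) and a longer-term
// one (30-day retention). The fields and window are illustrative assumptions.

interface Impression {
  userId: string;
  variant: "small-label" | "large-label";
  clickedAd: boolean;
  returnedWithin30Days: boolean;
}

function summarize(events: Impression[], variant: Impression["variant"]) {
  const group = events.filter((e) => e.variant === variant);
  if (group.length === 0) return { variant, ctr: NaN, retention: NaN };
  const rate = (pred: (e: Impression) => boolean) =>
    group.filter(pred).length / group.length;
  return {
    variant,
    ctr: rate((e) => e.clickedAd), // short-term: did the label change lift clicks?
    retention: rate((e) => e.returnedWithin30Days), // long-term: did users come back?
  };
}
```

A variant that wins on CTR but loses on retention should be read as a warning sign, not shipped as a win.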
Still, many attributes that matter in the long term, such as trust, are not straightforward to observe and measure, especially in the online context. Think critically about the designs you choose to test, and when you find that a certain design performs better, try to understand why.
Overreliance on A/B testing is a critical issue, but let’s turn next to a much broader and longer-term concern.
Incorporate ethics into the design process. While dark patterns are a highly visible consequence of the ethical crisis in design, resolving the crisis entails far more than avoiding a simple list of patterns. It requires structural changes to the design process.
Start by articulating the values that matter to you and that will guide your design.15 Not every organization will have an identical set of values, but these values must be broadly aligned with what society considers important.
In fact, much of the present crisis can be traced to a misalignment of values between society and companies. Autonomy and privacy are two values where this is particularly stark. Consider frictionless design, a bedrock value in the tech industry. Unfortunately, it robs users of precisely those moments of pause that allow for reflection and enable them to reject their baser impulses. Frictionlessness is antithetical to autonomy. Similarly, designing for pleasure and fun is a common design value, but when does fun cross the line into addiction?
Once you have articulated your values, continue to debate them internally. Publicize them externally, seek input from users, and, most importantly, hold yourself accountable to them. Effective accountability is challenging, however. For example, advisory boards established by technology companies have been criticized for not being sufficiently independent.
Everyday design decisions should be guided by referring to established values. In many cases it is intuitively obvious whether a design choice does or does not conform to a design value, but this is not always so. Fortunately, research has revealed a lot about the factors that make a design pattern dark, such as exploiting known cognitive biases and withholding crucial information.4,20 Stay abreast of this research, evaluate the impact of design on your users, and engage in critical debate about where to draw the line based on the company’s values and your own sense of ethics. Rolling back a change should always be an option if it turns out that it didn’t live up to your values.
As you gain experience making these decisions in a particular context, higher-level principles can be codified into design guidelines. There is a long tradition of usability guidelines in the design community. There are also privacy-by-design guidelines, but they are not yet widely adopted.10 There is relatively little in the way of guidelines for respecting user autonomy.
All of this is beyond the scope of what individual designers can usually accomplish; the responsibility for incorporating ethics into the design process rests with organizations. As an individual, you can start by raising awareness within your organization.
Self-regulate or get regulated. Dark patterns are an abuse of the tremendous power that designers hold in their hands. As public awareness of dark patterns grows, so does the potential fallout. Journalists and academics have been scrutinizing dark patterns, and the backlash from these exposés can destroy brand reputations and bring companies under the lenses of regulators.
Many dark patterns are already unlawful. In the U.S., the Federal Trade Commission (FTC) Act prohibits “unfair or deceptive” commercial practices.11 In a recent example, the FTC reached a settlement with Unroll.me—a service that unsubscribed users’ email addresses from newsletters and subscriptions—because it was in fact selling information it read from their inboxes to third parties.12 European Union authorities have tended to be stricter: French regulator CNIL (Commission Nationale de l’Informatique et des Libertés) fined Google 50 million euros for hiding important information about privacy and ad personalization behind five to six screens.6
There is also a growing sense that existing regulation is not enough, and new legislative proposals aim to curb dark patterns.7 While policymakers should act—whether by introducing new laws or by broadening and strengthening the enforcement of existing ones—relying on regulation is not sufficient and comes with compliance burdens.
Let’s urge the design community to set standards for itself, both to avoid onerous regulation and because it’s the right thing to do. A first step would be to rectify the misalignment of values between the industry and society, and develop guidelines for ethical design. It may also be valuable to partner with neutral third-party consumer advocacy agencies to develop processes to certify apps that are free of known dark patterns. Self-regulation also requires cultural change. When hiring designers, ask about the ethics of their past work. Similarly, when deciding between jobs, use design ethics as one criterion for evaluating a company and the quality of its work environment.
Design is power. In the past decade, software engineers have had to confront the fact that the power they hold comes with responsibilities to users and to society. In this decade, it is time for designers to learn this lesson as well.
Related articles on queue.acm.org
User Interface Designers, Slaves of Fashion
Jef Raskin
https://queue.acm.org/detail.cfm?id=945161
The Case Against Data Lock-in
Brian W. Fitzpatrick and J.J. Lueck
https://queue.acm.org/detail.cfm?id=1868432
Bitcoin’s Academic Pedigree
Arvind Narayanan and Jeremy Clark
https://queue.acm.org/detail.cfm?id=3136559