An analysis of hundreds of popular smartphone apps found that 95% of them have user interfaces that use "Dark Patterns": maliciously crafted menus, buttons, and sliders designed to deceive users into buying goods or services they do not want, or into unknowingly selecting risky privacy settings.
Dark Patterns are a well-known vector for skulduggery on websites, and have been studied and called out over the last decade, most notably by the name-and-shame website darkpatterns.org, founded by Brighton, U.K.-based user experience (UX) specialist and cognitive scientist Harry Brignull. It was Brignull who coined the term Dark Pattern (DP).
The chief types of malicious activity DPs enable include sneaking unwanted items (like insurance for other goods being bought) into online shopping baskets, signing users up to expensive recurring magazine subscriptions, and misleading people with preselected checkboxes labeled with baffling double-negative instructions (like "uncheck here not to download the add-on").
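To make the last of those tricks concrete, here is a minimal sketch of a double-negative preselection, as HTML markup generated from TypeScript. The markup, names, and wording are hypothetical illustrations, not taken from any app or site mentioned in this article:

```typescript
// Hypothetical sketch of the double-negative preselection trick.
// The box arrives pre-checked, so doing nothing silently opts the
// user in, and the label inverts the meaning of unchecking it.
function doubleNegativeOptOut(): string {
  return `
    <label>
      <!-- Pre-checked: inaction is treated as consent -->
      <input type="checkbox" name="bundled-addon" checked>
      Uncheck here not to download the add-on
    </label>`;
}

console.log(doubleNegativeOptOut());
```

A neutral design would leave the box unchecked and label it with a plain affirmative ("Also download the add-on"), so the user's inaction changes nothing.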
How prevalent Dark Patterns are in smartphone apps, as opposed to regular websites, was not known, so a team of human-computer interaction experts led by Linda di Geronimo, formerly of the University of Zurich, Switzerland (and now a technical cooperation manager in Huawei's Zurich Research Center), set out to find out.
To do this, three of the five-strong team painstakingly tested 240 mobile apps, all of which had been trending as popular downloads on Google's Play Store in mid-July 2019. To ensure their results were generally applicable, they examined a spread of app types from a variety of categories, including photography, shopping, social media, music and audio, entertainment, and communications.
"Statistically speaking, 240 apps might not be the right amount of applications to represent millions of apps that we can find on the Google store," di Geronimo says. "However, the apps we picked are amongst the most popular and, therefore, they are intended to be a good representation of the users' daily usage of mobile media consumption."
After assessing similar features in each app, such as account creation, login, privacy settings, and purchase checkout behaviors, the team found that 10% of the apps contained up to two Dark Patterns, 37% incorporated between three and six of them, and 49% contained seven or more.
Commenting on their findings (summarized on GitHub), di Geronimo says she and her colleagues had suspected they might find at least one Dark Pattern per app, because so many of the features baked into them are linked to Internet services known to use such tricks. "For instance, we knew that most apps that allow users to log into their system through Gmail or Facebook do not allow the disconnection of this account from within the app. To remove the link, you need to go to a specific Web page provided by Google or Facebook," she says.
This is a kind of Dark Pattern known to user interface analysts as a "Roach Motel" ("easy to get into, but hard to get out of"), one of the most regularly used tricks identified by researchers like Brignull. Some e-ticketing firms, for instance, like to slip hard-to-cancel music magazine subscriptions into gig ticket purchases using the Roach Motel trick; making accounts hard to log out of, or hard to delete, falls into the same category.
In their mobile analysis, di Geronimo says, the top three DP tricks played by the authors of the 240 Android apps were:
1. Instituting a "false hierarchy": The option the app favors is given a big, colorful "OK" button, while the other options get tiny, grayed-out "No" buttons. This trick is also known in user interface (UI) circles as "visual interference" (a minimal sketch of the asymmetry follows this list).
2. Preselection of options: The researchers found most apps preselected push and email notifications by default, sometimes dozens of them, leaving users unnecessarily pinged by push services every few minutes. Worse, some apps preselect privacy settings, too.
3. Nagging: This hugely distracting practice constantly interrupts users with obstructive popups about unrelated tasks, such as prompts to add unrequested items to the shopping cart, or out-of-the-blue requests to rate the app.
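The sketch below illustrates the first of those tricks, a "false hierarchy" dialog. The markup and styling are hypothetical, drawn from the description above rather than from any of the 240 apps: both buttons carry equal logical weight, but the styling steers the eye toward "OK."

```typescript
// Hypothetical "false hierarchy" dialog rendered as HTML: the
// consenting choice is large and colorful, the refusal is visually
// demoted so users skip past it.
function falseHierarchyDialog(question: string): string {
  return `
    <div class="dialog">
      <p>${question}</p>
      <!-- The choice the app wants: prominent, colorful, inviting -->
      <button style="background:#1a73e8; color:#fff; font-size:1.2em;
                     padding:12px 48px; border-radius:8px;">OK</button>
      <!-- The choice the app discourages: tiny and grayed out -->
      <button style="background:none; border:none; color:#bbb;
                     font-size:0.7em;">No thanks</button>
    </div>`;
}

console.log(falseHierarchyDialog("Enable personalized ads?"));
```

An even-handed design would give both choices the same size, color, and prominence, leaving the decision to the user rather than to the layout.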
Why are users being treated so deceptively? Some companies clearly operate with ill intent, says di Geronimo, after undertaking "statistical usage analysis on their users, and finding the best design option that suits them." That may trickle down as pressure on UI designers to meet specific goals, she says, like achieving a certain number of clicks or a target level of user engagement. Other firms unthinkingly copy such designs, believing they are acceptable software design. "Dark Patterns might not even be seen as such, but just as normal patterns that get used and reused," she says.
It may be, the Zurich team notes in its paper, that the human-computer interaction (HCI) community's focus on usability is leaving ethics to one side: "A usable application does not necessarily imply an ethical one," the researchers write.
Brignull is not convinced that UI designers are to blame, however. "The design and HCI community has been concerned about deceptive design practices for over a decade now, if not longer. We've tried to fix things by appealing to our peers in industry to opt into ethical codes of practice, or simply by saying 'no'. But that approach just isn't working. The industry is just too big, and strategic decision-making often takes place before the designers even get briefed by their managers."
Action nonetheless needs to be taken at the app-creation end, because the Zurich team also found it was not possible simply to educate users to spot Dark Patterns. After playing back videos of their app tests to 589 online volunteers, they found most were still unable to spot DPs. "We are blind to Dark Patterns. The fact that DPs are used almost everywhere does not help this; we have just become accustomed to them, and we are not able to spot them," says di Geronimo.
"In light of the ubiquity of Dark Patterns, we cannot expect users to protect themselves from the sophisticated and predatory tricks of online services. Rather, we, as a society, must focus our efforts collectively to hold companies that use dark patterns accountable," says Arunesh Mathur, an HCI researcher at Princeton University who last year investigated Dark Patterns on shopping websites, and even found some firms offering code implementing DPs as a service.
"Designers should think twice before implementing and using Dark Patterns in their products. Even if they lead to profit in the short term, the backlash from users in the long term might not be worth it," Mathur adds.
There are early signs of a regulatory fightback. In late December, the Netherlands Authority for Consumers and Markets, with help from the European Commission, convinced travel giant Booking.com to quit running nagging popups on its website, including "last room available!" claims and notices that discounts were about to expire (when they were not). Booking.com now wants to see such rules applied to all vendors, which has to be a good thing all round.
There's an even bigger piece of anti-DP ammunition waiting in the wings: Europe's General Data Protection Regulation (GDPR), the privacy-protecting law that is emerging as a template for similar measures worldwide. Under GDPR, people have the right to read, modify, or delete any data an organization holds on them.
An HCI team from the Massachusetts Institute of Technology and Denmark's Aarhus University has shown how Dark Patterns breach GDPR in major ways, with DP tricks used to obtain user consent for everything from cookies to personal data storage. The team found only 11.8% of the popups shown by the consent management platforms that provide these options comply with GDPR rules.
That will almost certainly lead to a regulatory clampdown on DPs, as measures like GDPR take hold worldwide. It can't come soon enough, says Brignull.
"Better regulation is what's needed. Tech companies are commercially motivated. They won't use Dark Patterns if they know Dark Patterns carry a substantial legal risk. That's why rigorous, well-reported academic research like this on Dark Patterns is so important. It paves the way for serious consideration by policy-makers and the legal community."
Paul Marks is a technology journalist, writer, and editor based in London, U.K.