Opinion
Security and Privacy

Notice and Choice Cannot Stand Alone

Privacy notice and choice has largely failed us so far because we are not giving it the legal and technical support it needs.

“Notice and Choice” is the much-criticized approach to privacy regulation and self-regulation that has been in widespread use for about three decades. However, despite its failures as a regime, the concept of notice and choice should not be abandoned. It remains an important component of a broader privacy arsenal, one that should be combined with strong privacy laws and automated tools to provide customized privacy protections for individuals.

The idea behind notice and choice is that data collectors will provide transparent notices about their collection and use of personal information and allow individuals to make informed choices about whether and for what purpose their information will be used. In theory, this approach should allow data subjects to choose for themselves which uses of their personal information to permit (since these preferences are often context-dependent and vary by individual), while also encouraging a market for privacy in which data collectors improve their data practices to be more competitive.

In practice, notice and choice is a fantasy that has largely failed because notices take a long time to read,[12] are difficult to understand, and the number of decisions individuals face about the use of their data is overwhelming. Furthermore, it is often hard for people to understand the potential consequences of their privacy-related choices because they lack a detailed understanding of relevant technologies, data flows, and downstream data uses. To make matters worse, notices and choice interfaces are often difficult to find[8] and designed to manipulate people into making the most privacy-invasive selections (deceptive patterns, also known as dark patterns[13]). Given its poor track record, legal scholars,[14] privacy advocates, and even regulators[10] have been calling for the end of the notice-and-choice regime.

The failure of notice and choice is due in part to the fact that it has largely been left to stand on its own, with minimal legal teeth or technical support, and (in many jurisdictions) without strong baseline privacy laws. In short, the notice and choice regime was set up for failure. For notice and choice to be effective, it must be mandatory, with requirements that regulators have the resources to enforce, and it needs to be embedded in a standardized technology framework that allows people to readily automate their privacy decisions without being constantly bombarded with privacy choices. It is also critical to have baseline legal protections that do not allow data collectors to ask people to consent to data practices that are fundamentally unfair or about which they are unable to make informed decisions.

The idea of automating privacy decisions is not new. After the U.S. Federal Trade Commission began encouraging website operators to post privacy policies in the 1990s, privacy advocates complained that these policies were too long to be useful to users. In response, the World Wide Web Consortium (W3C) developed the Platform for Privacy Preferences Project (P3P), a protocol for encoding privacy policies in a computer-readable XML format and allowing web browsers and other user agents to retrieve these policies, parse them automatically, and use them to inform users or make automated decisions on their behalf. The most widely adopted P3P implementation was built into the Microsoft Internet Explorer 6 web browser in 2001 and used to automate third-party cookie-blocking decisions. I led the P3P working group at W3C and also worked on a research prototype P3P user agent called Privacy Bird, which displayed a colored bird icon and a brief digest of a website's privacy policy, highlighting where the policy conflicted with a user's pre-set privacy preferences. Unfortunately, P3P never saw widespread adoption after its release in 2002 because it lacked incentives for adoption and mechanisms for enforcement.[4] Indeed, in 2010, when thousands of websites were found to have circumvented browser P3P controls,[9] not a single regulator stepped in.[5]
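The flavor of P3P's approach can be sketched in a few lines: a site publishes its data practices as XML, and a user agent compares the declared purposes against the user's pre-set preferences. The fragment below is a simplified illustration loosely modeled on the P3P 1.0 vocabulary, not the full specification, and the function name is hypothetical.

```python
import xml.etree.ElementTree as ET

# A simplified policy fragment loosely modeled on P3P 1.0's
# purpose vocabulary (illustrative only, not the full spec).
POLICY_XML = """
<POLICY>
  <STATEMENT>
    <PURPOSE><current/><telemarketing/></PURPOSE>
    <DATA-GROUP>
      <DATA ref="#user.home-info.telecom.telephone"/>
    </DATA-GROUP>
  </STATEMENT>
</POLICY>
"""

# A user agent like Privacy Bird compared declared purposes
# against the user's pre-set preferences and flagged conflicts.
USER_DISALLOWED_PURPOSES = {"telemarketing", "contact"}

def find_conflicts(policy_xml, disallowed):
    """Return the declared purposes that conflict with user preferences."""
    root = ET.fromstring(policy_xml)
    conflicts = set()
    for purpose in root.iter("PURPOSE"):
        for child in purpose:
            if child.tag in disallowed:
                conflicts.add(child.tag)
    return conflicts

print(find_conflicts(POLICY_XML, USER_DISALLOWED_PURPOSES))  # {'telemarketing'}
```

A real P3P user agent did considerably more (fetching policy reference files, handling compact policies in HTTP headers), but the core idea was exactly this kind of automated policy-versus-preference comparison.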

After P3P had come and gone, a simpler approach to automating privacy choice was proposed: Do Not Track (DNT). Rather than creating a computer-readable privacy policy and sending it to web browsers, DNT sought to transmit a single header from web browsers to websites to request that a site and any third-party sites that load with it refrain from tracking an individual user. The W3C spent nearly a decade trying to reach consensus on a DNT standard that web browsers and websites would adopt. Although some web browsers implemented DNT, in practice it was meaningless, as few websites paid attention to the DNT headers.[7] In the absence of adoption incentives or legal mandates, DNT ultimately failed.
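DNT's simplicity, and its Achilles' heel, can both be seen in a few lines of code. The browser's side of the protocol was a single `DNT: 1` request header; the server side, sketched below with a hypothetical function name, was a check that sites were free to skip, which is exactly what most did.

```python
def tracking_permitted(request_headers):
    """Return False if the browser sent DNT: 1.

    Honoring the header was voluntary -- the failure mode described
    above -- so this check only mattered if a site chose to run it.
    """
    return request_headers.get("DNT") != "1"

# The browser side was equally minimal: one extra request header.
opted_out = {"DNT": "1", "User-Agent": "ExampleBrowser/1.0"}
no_signal = {"User-Agent": "ExampleBrowser/1.0"}

print(tracking_permitted(opted_out))  # False: user asked not to be tracked
print(tracking_permitted(no_signal))  # True: no preference expressed
```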

The latest automated privacy choice approach is Global Privacy Control (GPC), which allows users to turn on a setting in their browser (or browser extension) that transmits a GPC signal to automatically opt out of websites selling or sharing their personal information. What is particularly exciting about GPC is that, for the first time, privacy laws are requiring websites to respect automated privacy signals such as GPC. Under the California Consumer Privacy Act (CCPA), websites are required to act on GPC opt-out requests, and in 2022 the California Attorney General began enforcing compliance.[2] Six other U.S. states already require websites to honor opt-out preferences transmitted through Universal Opt-Out Mechanisms (UOOMs).[1] GPC is designed to be compatible with privacy laws around the world. Although GPC currently provides only a single signal, it could be extended to offer multiple signals and provide for more fine-grained choices. The fact that an automated choice mechanism is now enforceable by law is potentially the game changer needed for automated choice mechanisms to have a chance at success. However, GPC is not yet a settings option in the most popular web browsers, although there are privacy-focused browsers and plugins that offer this option or enable it by default.
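Like DNT, GPC travels as an HTTP request header (`Sec-GPC: 1` in the GPC proposal); the difference is that acting on it is now a legal obligation for covered sites rather than a courtesy. A minimal server-side sketch, with a hypothetical function name, might look like this; consult the GPC specification and applicable law before relying on anything this simple.

```python
def should_opt_out_of_sale(request_headers):
    """Treat Sec-GPC: 1 as a do-not-sell/share opt-out request.

    The header name follows the GPC proposal; under laws such as
    the CCPA, covered websites must act on this signal.
    """
    return request_headers.get("Sec-GPC") == "1"

headers = {"Sec-GPC": "1"}
if should_opt_out_of_sale(headers):
    # A real site would durably record the opt-out for this
    # user or device and propagate it to downstream partners.
    print("opt-out recorded")
```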

With the rapid proliferation of mobile apps, smart homes, and Internet of Things (IoT) devices, websites are just the tip of the iceberg when it comes to the collection and use of personal data. Thus, there is an even greater need for automated tools that can help users signal their preferences about sharing and using their data without being bombarded by requests from every smart device they walk by throughout the day. Researchers have explored mobile app privacy agents that automate app permissions settings[11] and IoT privacy agents that allow users to manage privacy settings for IoT devices in their environment.[6] However, we currently lack incentives for companies to build automated privacy choice frameworks into their products.
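The agents described above share a common shape: devices announce their data practices, and software answers on the user's behalf from stored preferences, prompting the user only when no preference covers the case. The sketch below is entirely hypothetical (the message fields and decisions are invented for illustration), but it shows how such an agent can absorb most requests silently.

```python
# Hypothetical IoT privacy assistant: answer devices'
# data-collection announcements from stored preferences,
# prompting the user only for uncovered cases.
PREFERENCES = {
    ("camera", "facial-recognition"): "deny",
    ("thermostat", "temperature"): "allow",
}

def respond(device_type, data_practice):
    """Decide on a (device, practice) announcement from stored preferences."""
    decision = PREFERENCES.get((device_type, data_practice))
    if decision is None:
        return "ask-user"  # fall back to an explicit prompt
    return decision

print(respond("camera", "facial-recognition"))  # deny
print(respond("speaker", "voice-recording"))    # ask-user
```

The design point is the fallback: the user is interrupted only for genuinely new situations, rather than for every device encountered throughout the day.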

The current generation of deployed notice and choice tools are simple but lack flexibility. More research is needed on how to build tools that can operate with minimal user input after their initial quick and easy configuration, perhaps driven by machine-learning approaches that learn a user's preferences over time and can extrapolate based on the user's current context or the preferences of similar users.[15] These tools should allow users to occasionally grant exceptions to allow the use of their data when they find it beneficial. However, these exceptions should be granted because users want to provide their data (for example, I want to provide my location when I use a mapping service because I want to view my location on the map) rather than because services break when data is withheld (for example, some websites exhibit strange behavior or stop working when third-party cookies are blocked, encouraging users to override their cookie blockers) or because users are constantly bombarded with requests that they swat away without thinking (for example, cookie banners, another notice and choice mechanism that has been largely ineffective as a privacy tool[3]). And when exceptions are granted, data should be used only for the purposes the user requests (for example, the map service should not also use my location to serve me location-targeted ads unless I have specifically requested them). Recent research has demonstrated the utility of “generalizable active privacy choice” interfaces for GPC that allow users to send GPC signals automatically to websites a user visits according to criteria such as type of website, type of data collected, user's self-described privacy profile, or user's privacy profile learned by the system.[16]
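One way such a tool could work is sketched below: the user states a few rules once (by website category), and decisions for unseen categories are borrowed from the recorded choices of users with a similar privacy profile. The category names, data, and the privacy-protective default are all invented for illustration; this is a sketch of the general approach, not any particular system.

```python
# Hypothetical category-driven GPC agent in the spirit of
# "generalizable active privacy choice": explicit rules first,
# then extrapolation from similar users, then a safe default.
from collections import Counter

USER_RULES = {"news": "send-gpc", "maps": "allow"}  # category -> decision

# Choices observed from users with a similar privacy profile (assumed data).
SIMILAR_USERS = {
    "shopping": ["send-gpc", "send-gpc", "allow"],
    "social": ["send-gpc", "send-gpc", "send-gpc"],
}

def decide(category):
    """Pick a decision for a website category with minimal user input."""
    if category in USER_RULES:
        return USER_RULES[category]
    votes = SIMILAR_USERS.get(category)
    if votes:
        return Counter(votes).most_common(1)[0][0]  # majority vote
    return "send-gpc"  # privacy-protective default for unknown categories

print(decide("news"))      # send-gpc (explicit rule)
print(decide("shopping"))  # send-gpc (majority of similar users)
```

A deployed agent would replace the majority vote with a learned model and let users review or override its extrapolations, but the layered structure, explicit rules over inferred preferences over a safe default, is the essential idea.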

Even if we design fantastic tools, we will need incentives or legal mandates for them to be made readily available to end users and their signals respected by data collectors. We need UOOMs built into every web browser and their signals respected by every website. We need IoT devices that send and receive standardized privacy signals to well-designed user agents. We need enforceable penalties for data collectors that fail to honor automated signals or manipulate users into consenting to data practices. And, importantly, we need strong baseline privacy regulations that restrict the use of personal information without individual consent and prohibit some personal information uses altogether.

While notice and choice as a regime has largely failed to live up to its promises to date, if bolstered by appropriate laws, technology standards, and easy-to-use interfaces, the notice and choice concept could be a useful tool in our future privacy toolbox.

References

1. Adams, S. and Gray, S. Survey of current universal opt-out mechanisms. Future of Privacy Forum (Oct. 12, 2023); https://bit.ly/3ZSVo4s
2. AG Press Office. Attorney General Bonta Announces Settlement with Sephora as Part of Ongoing Enforcement of California Consumer Privacy Act. (Aug. 24, 2022); https://bit.ly/4dApSv6
3. Cranor, L.F. Cookie monster. Commun. ACM 65, 7 (July 2022); 10.1145/3538639
4. Cranor, L.F. Necessary but not sufficient: Standardized mechanisms for privacy notice and choice. J. Telecommun. High Technol. Law 10 (2012); https://bit.ly/3Yb0BDs
5. Cranor, L.F. P3P is dead, long live P3P! This Thing blog (Dec. 3, 2012); https://bit.ly/480Bw19
6. Das, A. et al. Personalized privacy assistants for the Internet of Things: Providing users with notice and choice. IEEE Pervasive Computing 17, 3 (July 2018).
7. Fleishman, G. How the tragic death of Do Not Track ruined the web for everyone. Fast Company (Mar. 7, 2019); https://bit.ly/3ZUn6Ob
8. Habib, H. et al. "It's a scavenger hunt": Usability of websites' opt-out and data deletion choices. In Proceedings of CHI 2020 (2020); 10.1145/3313831.3376511
9. Leon, P.G. et al. Token attempt: The misrepresentation of website privacy policies through the misuse of P3P compact policy tokens. In Proceedings of the 9th Annual ACM Workshop on Privacy in the Electronic Society (WPES '10) (2010), 93–104; 10.1145/1866919.1866932
10. Levine, S. Toward a Safer, Freer, and Fairer Digital Economy: How Proactive Consumer Protection Can Make the Internet Less Terrible. Fourth Annual Reidenberg Lecture, Fordham Law School (Apr. 17, 2024); https://bit.ly/4dGZWOl
11. Liu, B. et al. Follow my recommendations: A personalized privacy assistant for mobile app permissions. In Proceedings of the 12th Symposium on Usable Privacy and Security (SOUPS 2016) (2016); https://bit.ly/3U4yF1C
12. McDonald, A.M. and Cranor, L.F. The cost of reading privacy policies. I/S: A Journal of Law and Policy for the Information Society 4, 3 (Winter 2008–09); https://bit.ly/3XYth10
13. Narayanan, A. et al. Dark patterns: Past, present, and future. Commun. ACM 63, 9 (Sept. 2020); 10.1145/3397884
14. Rothchild, J.A. Against notice and choice: The manifest failure of the proceduralist paradigm to protect privacy online (or anywhere else). Cleveland State Law Review (2018); https://ssrn.com/abstract=3126869
15. Wijesekera, P. et al. Contextualizing privacy decisions for better prediction. In Proceedings of CHI 2018 (2018); 10.1145/3173574.3173842
16. Zimmeck, S. et al. Generalizable active privacy choice: Designing a graphical user interface for global privacy control. In Proceedings on Privacy Enhancing Technologies Symposium (PoPETS) (2024); 10.56553/popets-2024-0015
