
Forgetting Made (Too) Easy

Considering the implications of digital data removal implementations.

Technology policy history was made on May 13, 2014, when the Court of Justice of the European Union (CJEU) found a right to be forgotten in the aging 1995 European Union Data Protection Directive. The decision made waves because it granted EU citizens the right to request the removal of links from Google’s search index that point to identifying content on the Web. Of course, Google and opponents of the right are not happy, but neither are proponents of the right, myself included, who see potential in a nuanced, well-constructed right to be forgotten. The problem outlined here (there are others) is that the digital right to be forgotten has yet to be developed by common or civil law. It is a generic right with generic exceptions, which is the way all rights are born, but defining and balancing the right to be forgotten has now been delegated to Google—which is in no position to raise a right and certainly has no interest in doing so.


Background

Although the CJEU interpreted the Directive to grant a right to be forgotten against search engines, it had not done so at any point in the law’s nearly 20-year history. The 1995 Directive required EU member states to put certain protections in place for their citizens within their own national legal systems. The Spanish Data Protection Agency (AEPD) had granted many citizens the right to be forgotten, finding the retention of and access to personal data in these instances to compromise the fundamental right to data protection and the dignity of persons. In 2010, the AEPD upheld a complaint filed against Google Spain and Google, Inc. by Mario Costeja González involving a 1998 real-estate auction notice related to his social-security debts, published by and housed on a newspaper website. The agency dismissed González’s claim against the newspaper, explaining that its continued publication was "legally justified," but upheld the complaint against Google. The company appealed the AEPD’s decision to Spain’s National High Court, which punted the case to the highest court in the EU, the CJEU.

The CJEU first determined that Google is a "data controller," because the company determines the purpose and means of processing personal data by finding, indexing, temporarily storing, and making available Web content. If Google’s activities qualify it as a data controller, it is unlikely that social-media sites and services, as well as third-party data brokers and services, can avoid being labeled data controllers. The CJEU upheld the right as interpreted and applied by the AEPD under Article 12 of the Directive, which guarantees every data subject the right to rectify, erase, or block the processing of data when it does not comply with the Directive, and under Article 14, which guarantees the right to object at any time on compelling legitimate grounds to the processing of data. As a data controller, Google must comply with the Directive and the 28 member states’ various rights to be forgotten.


The Good News

First, the good news is that countries around the world are engaged in an important debate about what kind of legal structure for communication will reinforce their distinct cultural values. Which legal balance will make their populations freer, fairer, and more prosperous: one where information may be obscured or even deleted, or one where information may linger for a very long time? More good news is that the contested data was not deleted, merely obscured, and that the Directive is being updated with a Regulation that seeks to harmonize data protection across the EU member states. The draft Regulation includes a more explicit right to erasure in Article 17. So, there is still some (albeit very little) hope for something to be done about the bad news.


The Bad News

The bad news is worse than the good news. Implementing the decision effectively creates a user-request takedown system for personal information, given that no intermediary administrative agency or court must determine that an individual’s right to be forgotten has been infringed. Individuals enforce their right to be forgotten directly against a data controller. Although this is expeditious for users, it is overly burdensome for data controllers and will lead to an exorbitant amount of obscured and deleted information.a


The right to be forgotten has few established rules and little guidance for handling rival interests like expression and historical records. Although some member states have analog, paper-era versions of the right (generally invoked by those convicted of a crime seeking to prevent others from referencing that criminal past), none has actively taken up adapting the right to be forgotten to a digital world—certainly not to the extent that a data controller can know who has a right to retain or delete what data, based on what justifications.

The tragic news is that Google (and other data controllers) has been handed a gavel and ordered to decide what the right to be forgotten is, without any guidance as to which interests should trump others, when, and why. Google is far removed from the actual information source and has no reason to know why someone posted or maintained access to certain content. Under this system, the information source is not even involved when access to its content is obstructed. Not only does Google now carry the huge burden of guessing what the right to be forgotten means as it is flooded with link-removal requests, it must also restructure its operations to bear immense compliance costs.

Prior to the ruling, Google directed unhappy users to a help page that told them to contact the site or service operator to get their problematic content removed. The page then explained that Google might consider removing links only under rare circumstances, such as those that create identity-theft threats, but that it generally required a court or administrative order to edit search results. When the company receives a takedown order from a governmental body, it simply verifies its legitimacy and complies. This is true for all legal domains except copyright: when Google is notified of allegedly copyright-infringing content, it must remove the content immediately to avoid secondary liability.

Now, Google will have to consider each and every user request for removal to determine whether it is a valid right to be forgotten claim, with the CJEU’s only guidance being that it must take into account amorphous, jurisdiction-specific values like "irrelevant information" and the "public interest."


Forgetting Made Too Easy

There is no such thing as a frivolous right to be forgotten request—every single takedown notice could legitimately be taken to court. Although some kind of jurisprudence will eventually develop within member states as right to be forgotten determinations occasionally make their way through various courts, data controllers that lack the resources or conviction to fight for their right to retain will, in the meantime, simply delete content. Threats of copyright liability have led to a great deal of non-infringing content being removed from the Internet, yet copyright has the luxury of centuries of fair-use jurisprudence to offer guidance to those making removal decisions. Not so with the right to be forgotten. We will see data controllers removing content upon request, not wanting the bother of defending in court decisions they never wanted to make in the first place.

The CJEU decision makes forgetting easy. Forgetting should be possible but not easy.

Footnotes

    a. Within the first 31 days, Google received a reported 70,000 requests to remove 250,000 search results linking individuals in the EU to information the data subjects preferred to be disassociated from. As of April 20, 2015, Google had received 241,963 requests to address 877,322 URLs and had removed 308,401 URLs (a removal rate of 41.5%).
