
PRINTED FROM the OXFORD RESEARCH ENCYCLOPEDIA, COMMUNICATION. © Oxford University Press USA, 2020. All Rights Reserved. Personal use only; commercial use is strictly prohibited (for details see Privacy Policy and Legal Notice).

date: 29 March 2020

The Right To Be Forgotten

Summary and Keywords

The right to be forgotten is an emerging legal concept allowing individuals control over their online identities by demanding that Internet search engines remove certain results. The right has been supported by the European Court of Justice, some judges in Argentina, and data-protection regulators in several European countries, among others. The right is primarily grounded in notions of privacy and data protection but also relates to intellectual property, reputation, and right of publicity. Scholars and courts cite, as an intellectual if not legal root for the right to be forgotten, the legal principle that convicted criminals whose sentences are completed should not continually be publicly linked with their crimes.

Critics contend that the right to be forgotten stands in conflict with freedom of expression and can lead to revisionist history. Scholars and others in the southern cone of South America, in particular, have decried the right to be forgotten because it could allow perpetrators of mass human rights abuses to cover up or obscure their atrocities. On the other hand, those in favor of the right to be forgotten say that digital technology preserves memory unnaturally and can impede forgiveness and individual progress. The right to be forgotten debate is far from resolved and poses difficult questions about access to, and control of, large amounts of digital information across national borders. Given the global nature of the Internet and the ubiquity of certain powerful search engines, the questions at issue are universal, but solutions thus far have been piecemeal.

Although a 2014 decision by the Court of Justice of the European Union (EU) garnered much attention, the right to be forgotten has been largely shaped by a 1995 European Union Directive on Data Protection. In 2016, the EU adopted a new General Data Protection Regulation that will take effect in 2018 and could have a major impact because it contains an explicit right to be forgotten (also called right to erasure). The new regulation does not focus on the theoretical or philosophical justification for a right to be forgotten, and it appears likely the debate over the right in the EU and beyond will not be resolved even when the new rule takes effect.

Keywords: Google, Internet search engine, data protection, privacy, freedom of expression, memory

Google Spain SL and Google Inc. v. AEPD and Mario Costeja González

The most prominent judicial decision on the right to be forgotten was issued by the Court of Justice of the European Union in May 2014. The Court held that Google was obligated by European Union law to remove from its search-engine results links to two newspaper articles referencing a real-estate auction to satisfy public debts ostensibly owed by a Spanish lawyer and calligrapher named Mario Costeja González. The case is important as an interpretation and application of Directive 95/46/EC, commonly known as the Data Protection Directive, issued by the European Parliament and the Council of the European Union in 1995.

The Data Protection Directive sought to establish broad and uniform parameters across the European Union (EU) for the processing of personal data, which includes “collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction” of information relating to identifiable persons (European Commission, 1995). The Directive requires that data controllers ensure their use of personal information is “adequate, relevant, and not excessive in relation to the purposes for which they are collected” (Article 6) and that they allow individuals to rectify, erase, or block data about themselves that is inaccurate or incomplete (Article 12).

Costeja González approached Spain’s La Vanguardia newspaper in 2009, requesting that the two articles about the real-estate auction be removed from the newspaper’s website. Costeja González said that the debts were not actually his and, in any case, had long since been satisfied. The publisher declined to remove the articles, stating that the notices had been published in 1998 in print, and later on the Internet, at the order of the Spanish Ministry of Labour and Social Affairs. Costeja González next filed a complaint with the Agencia Española de Protección de Datos (AEPD), which concluded in 2010 that the newspaper publisher was legally justified in keeping the notices online, but that Google should remove references and links to the articles from its search-engine results. Google then appealed to Spain’s National High Court, which referred a series of questions to the Court of Justice of the European Union.

In 2013, a designee of the Court of Justice, the Advocate General, issued an advisory opinion that recommended the Court of Justice hold against Costeja González. The Advocate General concluded that the Data Protection Directive, which was written and adopted at the dawn of the Internet age, should not apply to Internet search engines in the way requested by Costeja González because it would unduly restrict freedom of expression under the European Convention for the Protection of Human Rights and Fundamental Freedoms. To reach this conclusion, the Advocate General drew on jurisprudence from the European Court of Human Rights, among other sources.

The Advocate General concluded that the Data Protection Directive did not create a general right to be forgotten but only narrowly granted a right to rectify, erase, or block data that was inaccurate or incomplete. The Advocate General acknowledged that the European Commission had proposed in 2012 a General Data Protection Regulation that included a right to be forgotten in its Article 17. However, the Advocate General noted, that proposal was controversial and had not yet been implemented.

In May 2014, the Court of Justice surprised many observers by rejecting the Advocate General’s recommendation and instead answering the Spanish National High Court’s questions in favor of Costeja González. The Court concluded that the Advocate General’s focus on Article 12 of the Data Protection Directive was incomplete, and that equal attention needed to be paid to Article 6. Thus the Court stated that, not only do individuals have a right to rectify, erase, or block data that is inaccurate or incomplete, but they also have a right to ensure that their personal data used by controllers is adequate, relevant, and not excessive. The Court concluded that the links to articles about Costeja González’s satisfied debts were inadequate, no longer relevant, or excessive, and therefore had to be removed by Google, even though the news articles themselves could remain online. In an important caveat, the Court also stated that if Costeja González had been a prominent person in public life, then the public’s interest in having access to the information through the search-engine links may have outweighed the individual’s right to have the links removed.

Reaction to the Google v. Costeja González Case

Following the European Court of Justice decision, Costeja González was interviewed and quoted extensively in various media. He told an interviewer that he had always been a supporter of freedom of expression and that he actually did not favor a broad right to be forgotten (Lorenzo, 2014). Instead, he said, he only asked that information on the Internet be suppressed when it affected the honor, dignity, reputation, or privacy of a person, and the information was irrelevant and personally or professionally harmful. Costeja González added that he was not in favor of a right to be forgotten for public persons, such as politicians, who committed misdeeds and then wanted them to be erased from the public consciousness.

He also noted the irony that he and his supposed debts had become known around the world when his original intention was to make that information obscure. This phenomenon has come to be known as the Streisand Effect, named for the American actress, Barbra Streisand, whose attempts to suppress photographs of her California mansion ended up drawing much more attention to the images than otherwise might have happened.

In the wake of the Court of Justice decision, many commentators criticized the judgment and its consequences. A common contention was that the decision transferred too much authority to Google to determine which links to information should be suppressed. The British House of Lords made this criticism, as did others (European Union Committee, 2014). Meanwhile, the Harvard Law Review argued that the decision was correctly made based on the text and strong privacy values of the Data Protection Directive, and that critics should focus not on contending that the Court was wrong but rather on what the proper balance should be between privacy and other values, including freedom of expression, in future data protection rules (Harvard Law Review Association, 2014).

Several British media outlets, including the Telegraph and the BBC, objected to the requirement that Google remove certain of their links from search results. As a form of opposition to the right to be forgotten ruling by the Court of Justice, the Telegraph and the BBC began to maintain and publish lists of their own articles removed from Google search results. While the BBC listed only the links by month, the Telegraph included short summaries of the articles along with the links. In 2015, Google requested that the BBC clarify that, under the right to be forgotten, links were not entirely removed from Google’s indexes, but only from search results for specific individuals’ names. A de-listed article might still appear in search results if the search query were a term other than the requester’s name.

In France, the National Commission on Computing and Liberty interpreted the Court of Justice’s decision to mean that Google should remove requested information from all of its various country search engines, not just those designated for Europe. Google objected, saying that such an extension of the rule was beyond the National Commission’s power because it would extend the Court of Justice’s opinion to the entire world. Google’s argument about jurisdiction, as well as other arguments about freedom of information and censorship, was rejected. The potential for the data protection regimes of individual nations to apply the right to be forgotten in different ways has spurred interest in a single data protection rule for all of the EU.

The Costeja González decision has had ripple effects. In 2015, the U.K. Information Commissioner’s Office ordered Google to remove from its search-engine results links to nine articles that discussed previously delisted content. Journalists reporting on successful delisting requests have thus had their own reports delisted, on the regulator’s reasoning that reports about delisted content themselves violated the right to be forgotten because they repeated the originally objectionable information.

A non-profit group, Consumer Watchdog, petitioned the U.S. Federal Trade Commission (FTC) to guarantee U.S. Internet users a right to be forgotten similar to that crafted by the Court of Justice. But the FTC has not acted on the request, and the prevailing view in the United States is that the right to be forgotten is precluded by the free speech clause of the First Amendment to the U.S. Constitution and legal principles relating to access to information.

Article 29 Working Party Recommendations

The European Commission’s Article 29 Data Protection Working Party, which was established by the Data Protection Directive, published a set of guidelines for implementing the Court of Justice opinion. The Working Party recommendations are aimed at national data protection authorities, which would hear appeals from decisions by data controllers (i.e., search engines), and fall into 13 categories:

  1. Successful delisting requests should be based on the impact that an Internet search for a specific individual’s name, including established pseudonyms and nicknames, can have on that individual’s private life.

  2. Delisting requests from individuals who play a role in public life are disfavored if there is a public interest in access to the information. Playing a role in public life can be determined by examining whether the individual is a politician, senior public official, businessperson, or member of a regulated profession. Public figures may also be included. Determinations about delisting could hinge on whether having access to the information in question would protect the public from improper conduct.

  3. Delisting is more likely if the requester was a minor at the time of publication of the information, and determinations should take into account the best interest of the child.

  4. Delisting is more likely if the information is inaccurate, inadequate, or misleading. If the accuracy of the information is unclear and is the subject of a court process, for example, decision makers should wait for a resolution. Opinions should be distinguished from factual information.

  5. Relevance and proportionality are key considerations. Information that is 15 years old is less relevant than information that is 1 year old. Information relating to private life is less relevant than information relating to public or professional activity. Data protection authorities are not empowered to resolve complaints of hate speech or libel, and concerns about those issues should be referred to the police and courts.

  6. Delisting is more likely in the case of sensitive data, such as information about a person’s health, sexuality, or religious beliefs.

  7. Delisting is more likely if the information is out of date and thus has become inaccurate.

  8. While there is no requirement for a data subject to demonstrate prejudice or harm, delisting is more likely when such harm is demonstrated. For example, a minor crime or misconduct that was not the subject of public debate might have a disproportionately negative impact, and thus delisting would be favored.

  9. Delisting is more likely if the information would put the data subject at risk, such as of theft or stalking.

  10. Delisting is appropriate if the only legal basis for publishing the original information was consent, and the consent has since been withdrawn.

  11. Data protection authorities should take into consideration whether the original information was posted for a journalistic purpose, though journalistic purpose alone is not a basis for denying a delisting request.

  12. Delisting is generally not appropriate where a public authority has a legal obligation to publish certain information.

  13. Delisting is more likely for minor crimes that happened long ago and less likely for more recent and more serious crimes.

The Article 29 Working Party took a broad view of the scope of the Court of Justice’s opinion, suggesting not only that search engines should delist certain information globally, but also that even non-EU citizens might be eligible to request delisting. The Working Party suggested that search engines make available their criteria for delisting decisions and also publish data about the results of their decisions.

General Data Protection Regulation and New Directive

In April 2016, the European Council and the European Parliament adopted Regulation 2016/679, known as the “General Data Protection Regulation,” and its associated Directive 2016/680. The General Data Protection Regulation will take effect, without the need for any implementing action by member states, on May 25, 2018. The new directive, meanwhile, must be incorporated into the national law of EU member states by May 6, 2018. Together, the regulation and new directive will replace the 1995 Data Protection Directive.

The General Data Protection Regulation and Directive resulted from four years of work. In early 2012, the European Commission, which has executive and administrative roles within the EU system, proposed a new General Data Protection Regulation that would replace Directive 95/46/EC. As a regulation rather than a directive, the proposal would not merely set an expectation for nations in adopting data protection rules, but it would establish a uniform law across the EU. The regulation contains, in its Article 16, a right to rectification of inaccurate personal data. But the provision that has drawn the most attention is Article 17, which establishes a “Right to erasure” (right to be forgotten).

Article 17 states that individuals have the right to require data controllers to erase information if the data subject objects or withdraws consent, or if the information is no longer necessary for its original purpose. The provision also requires that data controllers inform other controllers of the request for erasure. Article 17 includes exceptions from the right if processing the information is necessary for freedom of expression, the public interest, or for historical, scientific, and statistical purposes (European Commission, 2016).

Meanwhile, Article 18 of the General Data Protection Regulation grants a right to restrict processing of personal data if the data subject contests its accuracy. The restriction lasts only until the data controller can verify the data’s accuracy. Article 18 also creates a right of restriction if the processing of personal data is unlawful, or if the data are no longer needed by the data controller but are still required by the data subject for a legal claim. Article 19 requires that, in the case of erasure or rectification of personal data, data controllers inform recipients of the personal data unless this proves impossible or “involves disproportionate effort.” Under Article 19, data subjects may also request that data controllers inform them of the notice given to recipients of the information.

In the face of both criticism and praise, EU Justice Commissioner Viviane Reding, in early 2012, outlined why she believed the right to be forgotten in Article 17 was necessary. She contended that individuals should have the right to control their own online identities because “even tiny scraps of personal information can have a huge impact, even years after they were shared or made public” (Reding, 2012).

The initial 2012 proposal placed special emphasis on the right to erase information that was published when an individual was a minor. But, in a 2014 amendment, the European Parliament removed the reference to information published when one was a minor. The final version of the regulation does include, in its preliminary recitals (paragraph 65), the statement that the right to be forgotten “is relevant in particular where the data subject has given his or her consent as a child and is not fully aware of the risks involved by the processing, and later wants to remove such personal data, especially on the internet” (European Commission, 2016).

The European Parliament, at one point during the negotiations, removed reference to the right to be forgotten and instead proposed calling it, simply, a right to erasure. However, the phrase “right to be forgotten” was restored in the final version of Article 17, along with the right to erasure. The European Parliament would have limited the right in the case of old technology, where erasure would not be possible. The 2014 proposal would have required the data subject’s identity to be verified (European Parliament, 2014). Those changes were removed or watered down in the final version.

Even though the General Data Protection Regulation underwent much discussion and scrutiny, some questions remain unanswered. For example, it remains unclear whether social media platforms such as Facebook and Twitter are considered data controllers, and whether they would be obligated to remove content posted by one of their users about another person at the other person’s request. Additionally, the regulation purports to apply globally to companies that operate in the EU and control data about individuals there. However, the practical balance between that broad reach and the regulation’s attempts to protect freedom of expression remains to be defined in detail. A key question is whether existing EU rules for eCommerce, which are procedurally protective of free expression, apply to right-to-be-forgotten requests under Article 17 of the General Data Protection Regulation (Keller, 2015).

Google’s Response

After the Court of Justice decision, Google published a form for requesting removal of URLs from search results. It was reported that, from May through November 2014, Google received 174,000 requests for removal of more than 600,000 links. Google removed approximately 41% of the requested links, but specified that it would not remove links to information about scams, criminal conduct, professional malpractice, or conduct by public officials. Other search engines, including Yahoo and Bing, also published forms to be used for requesting removal of links from search results (Clark, 2014).

Google solicited a group of experts to advise it on issues relating to the right to be forgotten. The group, which included Frank LaRue, United Nations Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, published a report in February 2015. After a series of public and private meetings around Europe, the Advisory Committee recommended that Google consider four factors when making determinations about “delisting” requests. First, Google should consider the individual requester’s role in public life. Persons who play a prominent role, such as politicians, corporate executives, athletes, and entertainers, should be less likely to have their delisting requests granted. Conversely, persons who play no role in public life should be more likely to have their delisting requests granted. Meanwhile, delisting requests by persons who play a limited role in public life, including public employees, school leaders, and individuals who are thrust into the public eye unwillingly, must be considered in light of the overall context.

Second, according to the Advisory Committee’s report, Google should generally favor delisting in cases of information about an individual’s intimate or sex life, personal financial information, private contact or identification information, private information about minors, information that is false or poses a risk of harm, or images and videos that might pose particular privacy invasion concerns. On the other hand, the Committee recommended that Google should generally disfavor delisting requests when the information relates to political discourse or governance, religious or philosophical discourse, public health and consumer protection, criminal activity, matters of general interest, information that is true, information that is key to the historical record, and information related to scientific inquiry or artistic expression.

Third, the Advisory Committee recommended that Google consider the source of the information requested for delisting. When the information comes from a journalistic or government source, delisting should be disfavored. Similarly, delisting is disfavored when the source of the information is one to which the subject has consented or exercises some control.

Fourth, the Committee advised Google to take into consideration the passage of time. The Committee stated that time, combined with fading from public view, might favor delisting in the case of a former public persona. However, the Committee specifically mentioned that commission of crime would remain relevant even with the passage of a long period of time.

The Committee had further procedural recommendations, including that the request form should be simple and the search engine should notify media upon, or even prior to, delisting. A majority of Committee members asserted that Google should not be required to delist links from all its search engines around the globe based on a European request (Google, 2015b).

Google took one step toward meeting interests related to the right to be forgotten when it set up a form for individuals to request delisting of links to revenge porn. Google stated that this option would be available to individuals who are the subjects of sexually oriented content posted online without their consent. Google noted that such material sometimes ends up on sextortion websites that charge a fee for removal. Google also said it would generally remove links, upon request, to national identification numbers such as U.S. Social Security numbers, bank account numbers, credit card numbers, and images of signatures, but not to addresses, telephone numbers, or birthdates.

Results of Right to Be Forgotten Requests

In its Transparency Report, Google reported that, as of October 2015, it had reviewed 1.16 million URLs for possible delisting based on 327,234 requests. Google removed links to 41.8% of the requested URLs from results in its Google Search, Image Search, Video Search and Google News services. The top 10 sites from which links were delisted include Facebook, YouTube, Google Groups, Google+, Twitter and several directories.

As examples, Google highlighted the following summaries for links that were removed:

  • A news story about a minor crime in the United Kingdom, followed by a second story about the removal.

  • Information about an individual in Belgium who was convicted of a serious crime in the last five years, but whose conviction was quashed on appeal.

  • An article about a political activist who was stabbed at a protest in Latvia.

  • An article about the conviction of a teacher in Germany for a minor crime more than 10 years ago.

  • A page showing the address of a woman in Sweden (even though Google states elsewhere that addresses are the kind of information generally not removed).

  • An article about the decades-old murder of an Italian woman’s husband; the article included the woman’s name.

  • An article about the rape of a person in Germany.

  • Pages that discuss the victim of a crime in Italy that occurred decades ago.

  • A page that reposted an image originally posted by a woman in Italy.

  • An article reporting on a contest in which a man in Belgium participated when he was a minor.

Meanwhile, Google also included examples of links that it did not remove despite requests to do so:

  • Recent articles about a decades-old conviction of a public official in Hungary.

  • Articles about a lawsuit brought by a prominent business person in Poland against a newspaper.

  • Articles reporting on the conviction of a priest in France for possession of child sexual abuse images and the priest’s banishment from the church.

  • Articles about a couple in Austria accused of business fraud.

  • Multiple articles about the arrest of a professional in Italy for financial crimes.

  • Articles reporting on embarrassing content posted on the Internet by a media professional in the United Kingdom.

  • Articles about an individual who was fired from a job in the United Kingdom for sexual misconduct.

  • Articles and blog posts about an individual in the Netherlands accused of abusing welfare services.

  • A copy of an official state document reporting on fraud committed by an individual in Italy.

  • A petition by a student organization demanding removal of a public official.

  • Articles reporting on an investigation for sexual abuse by a former clergyman in the United Kingdom (Google, 2015a).

The Guardian reported in mid-2015 that it had discovered hidden data in the source code of archived versions of the Google Transparency Report that allowed access to more detail about delisting requests. The Guardian asserted that the data covered three-fourths of all removal requests made to Google to that point and had not previously been released publicly.

The Guardian reported that 95.6% of requests sought removal of private personal information and that 48% of these requests were granted. Grant rates were much lower in other categories: child protection (17%), political (23%), public figure (22%), and serious crime (18%). The Guardian asserted that, contrary to the fears of some, the overwhelming majority of granted requests did not concern public officials or other matters of public concern (Tippmann & Powles, 2015).

The data inadvertently released by Google were among the items of information requested by 80 scholars in an open letter to Google on the one-year anniversary of the Court of Justice’s decision in the Costeja González case. The scholars argued that, although Google had been required by the Court of Justice to make decisions about privacy and access to information that concerned the public, there was no public scrutiny of its decision-making process. The scholars sought detailed information in 13 specific categories touching on the rates of delisting for various types of requests, the identities of requesters, and the reasoning behind those decisions.

Requesters who are not satisfied with Google’s denial of a delisting request can appeal the decision to their national data protection authority, though that reportedly happens in only 1% of cases.

The Case of Argentina

Various national and regional jurisdictions have addressed issues surrounding the right to be forgotten. A prominent and drawn-out battle took place in Argentina, where lawyers brought scores of lawsuits against Google and Yahoo on behalf of celebrity clients, including numerous female models and entertainers who objected to search engine results for their names that appeared to link them to sexually oriented websites.

Led by the Argentine lawyer Adolfo Martín Leguizamón Peña, litigation against Google and Yahoo reached a high-water mark between 2009 and 2012, with several trial and appellate court opinions in favor of celebrity models, actresses, and entertainers. These victories, however, were overwhelmingly based on rights of publicity, copyright, privacy, reputation, and data protection rather than an explicit right to be forgotten. In one of the most prominent cases, involving the pop singer Virginia Da Cunha, an appellate court judge did explicitly discuss the right to be forgotten, but ultimately Da Cunha lost the appeal.

Da Cunha and her musical group, Bandana, became well known in Argentina through the television talent competition “Popstars” in 2001. Later, Da Cunha performed with another band, Virgin Pancakes, and she also appeared as a television personality and DJ. In 2009, she prevailed in trial court against Google and Yahoo on her claim that they inappropriately linked her name and photographs, in search-engine results, to sites for pornography and sex trafficking. The Buenos Aires trial judge awarded Da Cunha a modest monetary judgment and ordered the search-engine results to be modified so that Da Cunha’s name and photograph would not appear linked to sexually oriented websites. For a time, the search engines blocked all results for Da Cunha’s name on their Argentine sites. The trial judge based the determination on the conclusion that Da Cunha had a right to control the use of her own image.

In 2010, however, a three-judge appellate court voted 2–1 to reverse the trial court decision in favor of Da Cunha. The appellate panel ultimately determined that search engines could not be held responsible for third parties’ use of the names and images of celebrities like Da Cunha. One of the judges who held against Da Cunha relied heavily on the immunity principle embodied in Section 230 of the U.S. Communications Decency Act and a similar provision in the EU’s 2000 Electronic Commerce Directive.

Another judge also voted against Da Cunha but favorably expounded on the right to be forgotten. The judge, Ana María R. Brilla de Serrat, invoked legal concepts and scholarship from Italy in concluding that the right to be left alone includes a right to control the use of one’s image, and that a right to be forgotten exists. She argued that, in Italy, convicted criminals who have served their sentences may not be linked forever with their crimes, and that publication of the details of a conviction after a sentence is served is permitted only if there is a new and legitimate reason of public interest. Because Brilla de Serrat voted against Da Cunha, however, her discussion of the right to be forgotten was nonbinding dicta.

Although celebrity clients achieved several victories over Google and Yahoo, the Supreme Court of Argentina, in late 2014, struck a blow to the right to be forgotten when it decided against the model María Belén Rodriguez. The court held that search engines do not have a general obligation to obscure or hide search results linking individuals such as Rodriguez to objectionable websites. However, the court did allow that a search engine may be obligated, upon a specific request, to remove results that include child pornography or information that would facilitate criminal conduct. Leguizamón and his partner, Alejandro Arauz Castex, did not concede that the battle over the right to be forgotten was over, even after the Supreme Court decision.

Some scholars contend that Latin America has been forging a third way that strikes a moderate balance between the extraordinary free-speech protections of the United States and the stringent privacy protections of Europe.

Related Efforts in the United States

Colloquially called the online eraser law, California’s S. B. 568 took effect in January 2015 and requires social media and other websites to remove content posted by minors upon the minors’ request. The statute also requires that minors be notified of their right to request removal of any content or information they themselves posted, although free speech proponents claim the law violates the First Amendment. The law does not apply to content posted by a third party or content for which the minor was willingly compensated, and it does not require removal of content that has been anonymized.

Several U.S. states also have enacted legislation to combat mugshot websites. Although jail booking mugshots are a staple of news reporting and are generally publicly available, some state legislatures have objected to websites that post mugshots for entertainment or charge the subjects of the mugshots a fee to have them removed. The laws try to walk a fine line between preserving free speech about legitimate news of crimes, on the one hand, and cutting off profiteering that can approach extortion, on the other. At least seven states have adopted laws targeting the mugshot website industry, but questions remain about the constitutionality of those laws.

In addition, a cottage industry of reputation management services has sprung up. For a fee, these businesses will assist individuals in improving their online profiles by taking measures to drive negative results lower in search-engine lists related to their names.

The microblogging site Twitter instituted its own version of the right to be forgotten with respect to politicians’ deleted tweets. Twitter shut down Politwoops, a popular account that preserved politicians’ deleted tweets. Twitter’s rationale was that politicians’ free speech rights included the right to delete their own messages and not have those tweets still appear on Twitter under another account. But public commentary sharply criticized Twitter’s action. Some websites continued to preserve politicians’ tweets, including an infamous accidental tweet from former Congressman Anthony Weiner, but Twitter prevented certain accounts from using its application programming interface (API).

In response to their own fears about the permanence of digital content, the Stanford University students Evan Spiegel, Bobby Murphy, and Reggie Brown in 2011 started the forerunner to Snapchat, which promised to delete users’ posts within a few seconds. On May 9, 2012, Spiegel wrote on Snapchat’s blog about the mindset behind the service: “And after hearing hilarious stories about emergency detagging of Facebook photos before job interviews and photo-shopping blemishes out of candid shots before they hit the Internet (because your world would crumble if anyone found out you had a pimple on the 38th day of 9th grade), there had to be a better solution.” By 2015, Snapchat reportedly had 100 million daily active users and was valued at $16 billion. But critics have accused Snapchat of being a haven for sexually explicit content shared among minors, and in 2014, Snapchat settled a case brought by the FTC alleging that the company deceived customers by promising their content would be ephemeral when, in fact, some content could be preserved. In addition, third-party apps could be used to retain Snapchat content indefinitely (Federal Trade Commission, 2014).

Alternative Bases for the Right to Be Forgotten

Although the right to be forgotten thus far has primarily been based in notions of privacy, some scholars and policymakers have argued that a better approach would be a right to define one’s online identity that would be based in fundamental economic, social, and cultural rights recognized by international human rights law. Article 15 of the International Covenant on Economic, Social, and Cultural Rights recognizes the right to enjoy the benefits of scientific progress and its applications (REBSPA). REBSPA is closely tied to the right to take part in cultural life, which emphasizes each individual’s prerogative to control his or her identity. REBSPA seeks to make the benefits of technology available to all, including disadvantaged individuals.

Article 15 seeks to balance public and private interests. Thus a right to define one’s online identity based on Article 15 would recognize individuals’ needs for certain information to be made obscure, but would also account for public interests in access to information, freedom of speech, and preservation of history and culture. A human-rights-based approach would bring to bear international obligations to respect, protect, and fulfill REBSPA. Article 15 and other provisions of the International Covenant also ask developed nations to respect and protect the interests of developing nations.

REBSPA itself remains to be defined in detail, but some experts have suggested that the right to manage one’s online identity fits squarely within Article 15.

A related concept—and possible avenue to serve the interests of the right to be forgotten in the United States notwithstanding the First Amendment—is practical obscurity. The phrase practical obscurity came to prominence in a 1989 U.S. Supreme Court decision, Department of Justice v. Reporters Committee for Freedom of the Press. The case stemmed from a journalism organization’s challenge to the government’s decision not to release the criminal identification record, or rap sheet, of Charles Medico.

Examining the Freedom of Information Act, the Supreme Court concluded that, while individual pieces of information contained in the rap sheet might well be public records, release of the compilation of all the information into a rap sheet posed a risk of an unwarranted invasion of personal privacy for the subject of the records. Thus, the Court denied the journalists’ claim that the rap sheet should be released publicly. Practical obscurity referred to the notion that the individual pieces of information could be accessed publicly, but that an individual would not likely go to the trouble to compile a comprehensive rap sheet, and the government’s compilation should not be made easily accessible.

In the context of the Internet, privacy proponents argue that practical obscurity should apply to prevent easy access to information that may well be public but that poses a risk of harm. Meanwhile, others contend that the notion of practical obscurity is in direct conflict with transparency, access to information, and free expression.

Many U.S. states have a well-developed concept of the right of publicity, which protects an individual’s name or likeness from unauthorized use in commerce. But there is an exception for information that is newsworthy. Newsworthiness, or public interest, generally trumps privacy in the United States. This fact was recognized as early as 1890 by Samuel Warren and Louis Brandeis in their famous Harvard Law Review article, “The Right to Privacy.” The principle was further reinforced in 1940, when the U.S. Court of Appeals for the Second Circuit held that the former child prodigy William James Sidis, who had made great efforts to become a private citizen again after having received extensive news coverage as a young boy, could not prevail in a privacy action against a magazine that featured him in a “Where Are They Now?” section. The court held that the public retained a legitimate interest in knowing whether Sidis had lived up to the intellectual promise of his youth (Sidis v. F-R Publishing Corp., 1940).

National Court Perspectives

Courts in various European nations have started to apply the Court of Justice’s opinion within the context of their own jurisdictions. For example, a trial court in the Netherlands rejected a claim by a convicted criminal who wanted Google to delist references to his having attempted to incite an assassination. The trial court stated that the Court of Justice opinion did not intend to prevent all negative online references to an individual, but only sought to prevent individuals from being hounded for long periods of time by irrelevant or excessive publications. Other Dutch opinions generally have favored the freedom of information and expression over delisting requests for information about contract disputes, among other issues.

An Israeli judge in Tel Aviv rejected a delisting request even though the information was conceded to be defamatory. Instead of focusing on the search engine, the judge suggested that the complainant take it up with the third party that originally published the content.

A judge in Japan ordered Google to remove some but not all search results linking a man to a 10-year-old crime he did not commit. Meanwhile, proponents of the right to be forgotten in South Korea cited the Court of Justice opinion in proposing a new law that would require search engines to remove links to articles that were found by a court or the Press Arbitration Commission to be subject to deletion or correction (Youm & Park, 2016, p. 273).

In the United States, the U.S. Court of Appeals for the Ninth Circuit stated dismissively, “[S]uch a ‘right to be forgotten,’ although recently affirmed by the Court of Justice for the European Union, is not recognized in the United States” (Garcia v. Google, Inc., 2015).

Discussion of the Literature

The right to be forgotten is part of a larger global discussion about privacy and communications media, and many books and articles address various aspects of that discussion. Focusing specifically on the right to be forgotten, however, an early and influential book is Viktor Mayer-Schönberger’s Delete (2009). Mayer-Schönberger traces the causes and effects of what he calls the “demise of forgetting” due to ubiquitous communications technology. He recommended allowing individuals to set expiration dates for files, photographs, blog entries, and other pieces of digital information they created and transmitted electronically. Although Mayer-Schönberger did not explicitly endorse the right to be forgotten as it has developed in Europe and elsewhere, his book is widely cited in associated literature.

In the United States, the scholar, lawyer, and media commentator Jeffrey Rosen focused attention on the right to be forgotten in a series of articles in legal journals and the popular media. For example, in a New York Times Magazine article in 2010, Rosen recounted the story of Stacy Snyder, a high school teacher trainee in Pennsylvania who was denied a university degree after she posted a photograph on the social media site MySpace that depicted her as a drunken pirate. Rosen (2012a) cited a blog post by Google’s chief privacy counsel, Peter Fleischer, in which Fleischer discussed a three-tiered system of decreasing control over information about oneself based on whether the information was (a) posted by the subject, (b) posted by the subject and then copied and distributed elsewhere by another person, or (c) created and posted by another person (Rosen, 2012b, p. 91). Rosen, among others, has argued that a right to be forgotten conflicts with the constitutional guarantee of freedom of expression.

Some U.S. scholars who decry the free speech problems posed by the right to be forgotten also call upon European jurists and policymakers to revise their understanding of privacy. The international nature of digital communication technology can mean that a pro-privacy regime in one nation that favors the right to be forgotten effectively sets the standard for other nations. This phenomenon has sparked scholarly criticism. Scholars also have noted that right to be forgotten regulation, particularly in the EU, restricts the collection and use of consumer data by advertisers. Another fear, expressed elegantly by a prominent Argentine legal scholar, Eduardo Bertoni, is that the right to be forgotten could allow government officials to cover up their misdeeds and alter the historical record. Bertoni and others in Argentina and Chile contend this could undermine the truth-telling aspects of transitional justice following human-rights abuses by military regimes (Bertoni, 2014).

Scholarly discussion also has focused on the rationale for the right to be forgotten. Among the potential bases for the right are privacy, intellectual property, reputation, data protection, and a right to enjoy the benefits of scientific progress and its applications. The reluctance to embrace the right to be forgotten in some quarters is related to the uncertainty regarding its origin and justification.

Although the scholarly debate over the right to privacy has been around for more than a century, at least since Warren and Brandeis wrote in the Harvard Law Review, academic research focused on the right to be forgotten is relatively recent and, thus, much remains to be done. A fuller assessment of the direction of academic research on the topic will become feasible as the body of scholarship grows.

Primary Sources

The primary sources on the right to be forgotten consist of legislative, regulatory, and judicial materials. The right is a relatively new concept, and thus some of the primary source documents are still evolving. For example, the General Data Protection Regulation in Europe is finalized but will not take effect until 2018. The Data Protection Directive of 1995 is still important, though widely recognized to be out of date due to changes in technology, among other reasons.

Judicial decisions in Argentina, among other places, are important but often difficult to locate and in need of translation. The general principles of privacy have long been established in both civil and common law jurisdictions through codes and judicial opinions, but these do not directly address the right to be forgotten in the context of digital media.

A number of online commercial databases are available to gain access to judicial opinions as well as certain legislative and regulatory materials. The websites of the European Union, the Court of Justice, the European Commission, and the European Parliament contain primary source documents on the right to be forgotten.


Bertoni, E. (2014). The right to be forgotten: An insult to Latin American history. Huffington Post.

Carter, E. L. (2013). Argentina’s right to be forgotten. Emory International Law Review, 27(1), 23–39.

Clark, L. (2014). Google’s “Right to Be Forgotten” response is “disappointingly clever.”

Dredge, S. (2014). Microsoft and Yahoo respond to European “Right to Be Forgotten” requests.

European Commission. (2016). Reform of EU data protection rules.

European Commission. (1995). Directive 95/46/EC of the European Parliament and of the Council, article 2.

European Parliament. (2014). Legislative resolution of 12 March 2014 on the proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).

European Union Committee. (2014). EU Data Protection Law: “A Right to Be Forgotten”?

Federal Trade Commission. (2014). Snapchat settles FTC charges that promises of disappearing messages were false.

Garcia v. Google, Inc., 786 F.3d 733, 745 (9th Cir. 2015).

Ghezzi, A., Pereira, Â., & Vesnić-Alujević, L. (2014). The ethics of memory in a digital age: Interrogating the right to be forgotten. Basingstoke, U.K.: Palgrave Macmillan.

Google. (2015a). Transparency report.

Google. (2015b). The Advisory Council to Google on the Right to be Forgotten: Final report.

Gutwirth, S., Leenes, R., & de Hert, P. (2015). Reforming European data protection law. Dordrecht, Netherlands: Springer.

Harvard Law Review Association. (2014). Internet law—Protection of personal data—Court of Justice of the European Union creates presumption that Google must remove links to personal data upon request—Case C-131/12, Google Spain SL v. Agencia Española de Protección de Datos. Harvard Law Review, 128, 735–742.

Keller, D. (2015). The final draft of Europe’s “Right to Be Forgotten” law. Stanford Law School Center for Internet and Society Blog.

Kirtley, J. (2015). “Misguided in principle and unworkable in practice”: It is time to discard the Reporters Committee doctrine of practical obscurity (and its evil twin, the right to be forgotten). Communication Law and Policy, 20(2), 91–115.

Lorenzo, A. (2014). Mario Costeja González: “Yo nunca he defendido el derecho al olvido en Internet” [“I have never defended the right to be forgotten on the Internet”]. La Voz de Galicia.

Mayer-Schönberger, V. (2009). Delete: The virtue of forgetting in the digital age. Princeton, NJ: Princeton University Press.

Mills, J. L. (2015). Privacy in the new media age. Gainesville: University Press of Florida.

Reding, V. (2012). The EU data protection reform 2012: Making Europe the standard setter for modern data protection rules in the digital age.

Rosen, J. (2010, July 25). The web means the end of forgetting. The New York Times Magazine.

Rosen, J. (2011). Free speech, privacy, and the web that never forgets. Journal on Telecommunications & High Technology Law, 9, 345–356.

Rosen, J. (2012a). The deciders: The future of privacy and free speech in the age of Facebook and Google. Fordham Law Review, 80, 1525–1538.

Rosen, J. (2012b). The right to be forgotten. Stanford Law Review Online, 64, 88.

Sidis v. F-R Publishing Corp., 113 F.2d 806 (2d Cir. 1940).

Solove, D. J. (2008). The future of reputation: Gossip, rumor, and privacy on the Internet. New Haven, CT: Yale University Press.

Tippmann, S., & Powles, J. (2015). Google accidentally reveals data on “right to be forgotten” requests.

Warren, S. D., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193–220.

Youm, K. H., & Park, A. (2016). The “right to be forgotten” in European Union law: Data protection balanced with free speech? Journalism & Mass Communication Quarterly, 93(2), 273.