In 2010, a Spaniard by the name of Mario Costeja González searched for his name on Google and discovered a link to a Catalonian newspaper's website containing an announcement about a real-estate auction from 1998 held to pay off some of his outstanding debts. In response, he filed a complaint with the Spanish Data Protection Agency (AEPD) against both the newspaper and Google. González asked that the newspaper be ordered to delete the information from its site and Google be required to remove the link from its index so that these pages would not turn up in a search result. Notably, González did not dispute the accuracy of the information; he only argued that it was about an issue that was resolved and now entirely irrelevant.
The AEPD rightfully dismissed the claim against the newspaper but then shockingly sided with González in his dispute with Google. The government reasoned that as a member of the media the newspaper acted lawfully in publishing truthful information, but that as a "data processor" (as defined by European law) Google had a greater obligation to protect an individual's right to privacy. Google appealed the case, and last week the European Court of Justice (ECJ) upheld the ruling, thereby firmly establishing that individuals have the right to have information about themselves removed from a search engine if they do not think it is relevant.
Through this ruling, the ECJ has rejected the compelling logic of the maxim "you are entitled to your own opinions, but not your own facts." Instead, the ECJ has ruled that truthful, lawfully published information (i.e., facts) should be hidden from society based on the whims of the individual. For those disturbed by the idea of "filter bubbles," this legally mandated form of digital ignorance raises the stakes to a whole new level. While some may claim this is not censorship since the information is still available in the hidden corners of the Internet, that would be like arguing that a newspaper can print whatever it wants, so long as it does not tell its readers where to find a copy. After all, search engines are the tools that allow users to make sense of the vast quantity of information on the Internet and find what they are looking for. Without the ability to organize all this information, it is just useless noise.
In addition, the ECJ has single-handedly dismantled society's "right to know" and replaced it with an individual's "right to be forgotten." Coming out of a region of the world with a vested interest in the mantra "never forget," this is particularly disheartening. Moreover, this ruling reflects the tendency among privacy advocates to claim that individual rights trump collective benefits and demonstrates the harmful results of pursuing the type of "privacy-at-any-cost" policies that reign supreme in Europe and have slowly penetrated thinking in the United States. After all, who are the winners and losers here? It is the convicted sex offenders, disgraced politicians and board-sanctioned doctors of the world who stand to benefit the most as their past indiscretions are scrubbed from society's collective memory. Average citizens, on the other hand, must stand by and watch their access to publicly available information be greatly diminished.
It is unfortunate that privacy laws have degenerated from upholding the "right to be left alone" to an overbearing attempt at obscuring reality. And where will this end? If individuals have the right to erase public data about themselves, why stop with search engines? Did someone say something true about you on Facebook or Twitter? Time to file a complaint. Did you write something you regret in an email? Just require the email provider to track down and delete all copies of your message. You will never again need to worry about learning from your mistakes since you can just forget them.
Rather than trying to obscure the public record, a better approach for privacy regulators is to identify specific harms that individuals may face and craft targeted policies to mitigate those harms. For example, in the United States, the Fair Credit Reporting Act generally requires black marks on an individual's credit report, such as late payments and foreclosures, to be removed after seven years. Similarly, many states put time limits on how long certain public information can be used for setting insurance rates. California, for example, stipulates that DUI offenses count against good driver auto insurance discounts for 10 years.
The European Union is in the midst of updating its privacy laws, so this ruling will certainly not be the last word on the subject. But as policymakers both in the United States and abroad continue to refine privacy laws and regulations in the coming years, they should consider who exactly it is they are trying to protect. In this case, it is hard to see how rules designed to protect people like Donald Sterling, Anthony Weiner and Mel Gibson serve the common good. Since privacy laws almost always involve a trade-off between different values, policymakers should be aware of what they are giving up when they make these decisions and strive to find a more balanced approach.
Castro is a senior policy analyst with the Information Technology and Innovation Foundation.