The one-sentence summary
Google will have to remove some search results upon request because the European Union's highest court believes that old information about a person can be not only irrelevant but misleading.
What did the court say?
The ECJ ruled that a search engine like Google must, upon request, delete links to a person's information, as long as that information is no longer relevant and there is no public interest in it.
The search engine must remove the links even if the original source of the information, in this case a newspaper, is under no obligation to do the same.
Why this distinction between newspapers and search engines? Because search engines make information easy to find, the court said, they bear responsibility when that information is outdated or inaccurate.
"Without the search engine," reads the summary of the court's full decision, "the information could not have been interconnected or could have been only with great difficulty."
The case was based on a complaint by Mario Costeja González, who argued that an auction note on his repossessed home damaged his reputation. The Spanish newspaper La Vanguardia published that information in 1998.
So what is this "right to be forgotten"?
The right to be forgotten is a legal concept in Europe that maintains that a person has a right to leave his or her past behind. The concept originated from the French "droit à l’oubli" (right of oblivion), which allows a convicted criminal to avoid having the details of a conviction and incarceration published after serving prison time.
Applied to the Internet, this concept would give individuals the right to ask Internet companies to erase their personal data when it is outdated, inaccurate or simply not relevant anymore, if there is no public interest in preserving the data.
"That does not mean that everybody has the right to be forgotten whenever they'd like to have their data forgotten,"Douwe Korff, a law professor at the London Metropolitan University who specializes in data privacy, tells Mashable. "The right to be forgotten, it's really a balance. It doesn't say you have the right to have your transgressions forgotten. It says if there is no public interests in those transgressions being exposed, then they shouldn't be exposed."
The right to be forgotten is not regulated in detail in European law, although the EU is working on a comprehensive new law that would regulate it explicitly. Today's decision, however, is based on a 1995 directive that says EU member states have an obligation to make sure personal data is accurate and up to date. If there's "inaccurate or incomplete" data, countries have to provide citizens with a way to have that data "erased or rectified."
The directive doesn't specifically refer to Internet data, but applies to all kinds of data. The ECJ ruled today that the obligation to erase or rectify inaccurate or outdated information amounts to a right to be forgotten.
Why are critics wary about this ruling?
That might all sound reasonable, as we all have parts of our past we'd like the Internet to forget, but there's a problem here: Who decides what information can be deleted and what information should be preserved? Where's the limit?
The ECJ ruled that a person who wants some of his or her personal data erased can submit a request to a search engine, which in turn must decide whether the request is reasonable, balancing the person's privacy rights against freedom of expression and the public interest. And that gets tricky.
"That's not a very easy judgment to make," says Jens-Henrik Jeppesen, the director of European Affairs for the Center for Democracy & Technology, a digital rights organization. "There is all the risk in the world that this will be used by all kinds of people to demand that various information that they find inconvenient should be taken down."
The risk, Jeppesen says, is that such requests might lead Internet companies to be overly conservative and accept most of the demands to avoid litigation.
In other words, the risk is pre-emptive self-censorship. And the result of that might be a situation in which the same search term gives completely different results in Europe and the United States.
"It almost sounds like China," Jeppesen says. "You get one sets of results if you search from China and another if you search from outside."
But not everyone is alarmed. Viktor Mayer-Schönberger, a professor of Internet governance and regulation at the Oxford Internet Institute, wrote in an email that concerns about censorship are "BS," since this is about information already out there and published many years prior.
Furthermore, he adds, "practical implications will be limited because it still requires individuals to make claims and vigorously pursue them not shying away from spending time and money. Only few will do so."
But the ruling opens up many other questions.