Fear, Uncertainty, Doubt and Google Corporation

In recent months more and more attention has been directed towards Google’s data retention policies. In May of 2007 Peter Fleischer, Google’s Global Privacy Counsel, set out three key reasons why his company had to maintain search records:

  1. To improve their services. Specifically, he writes “Search companies like Google are constantly trying to improve the quality of their search services. Analyzing logs data is an important tool to help our engineers refine search quality and build helpful new services . . . The ability of a search company to continue to improve its services is essential, and represents a normal and expected use of such data.”
  2. To maintain security and prevent fraud and abuse. “Data protection laws around the world require Internet companies to maintain adequate security measures to protect the personal data of their users. Immediate deletion of IP addresses from our logs would make our systems more vulnerable to security attacks, putting the personal data of our users at greater risk. Historical logs information can also be a useful tool to help us detect and prevent phishing, scripting attacks, and spam, including query click spam and ads click spam.”
  3. To comply with legal obligations to retrieve data. “Search companies like Google are also subject to laws that sometimes conflict with data protection regulations, like data retention for law enforcement purposes.” (Source)


Since posting on this topic, Fleischer’s positions have come under fire: why can’t Google use anonymized data to determine the accuracy of search results? Why should threats to corporate profits count as a reason to override personal data privacy? Why is Google subjecting itself to laws that are not yet in place and will not be applied retroactively?
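The anonymization question is not merely rhetorical. One common approach to anonymizing search logs is simply to truncate stored IP addresses so that they no longer identify a single connection while still supporting coarse aggregate analysis. As a minimal sketch (assuming IPv4 addresses; the function name and the choice of how many octets to keep are illustrative, not Google’s actual method):

```python
def anonymize_ip(ip: str, keep_octets: int = 3) -> str:
    """Zero out the trailing octets of an IPv4 address so that stored
    logs can still support coarse geographic or quality analysis
    without pinpointing an individual connection."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError(f"not an IPv4 address: {ip!r}")
    kept = octets[:keep_octets]
    zeroed = ["0"] * (4 - keep_octets)
    return ".".join(kept + zeroed)

print(anonymize_ip("203.0.113.57"))  # -> 203.0.113.0
```

Whether truncation of this sort is “anonymous enough” is itself contested, but it shows that retaining full IP addresses is not the only way to analyze log quality.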

Google has since attempted to clarify its position on its public policy blog by reflecting on the balance between privacy (specifically digital privacy, though the post does not explicitly distinguish between digital and analogue privacy) and security. The post is particularly concerned with the European Union’s Data Retention Directive, which imposes retention obligations on accessible data that is processed or generated using communication services. Fleischer indulges in a bit of Euro-bashing in his analysis of the directive that may appeal to some readers but that ultimately displays his insensitivity to the complicated process of EU politics.

He begins by calling the directive, which calls for harmonization of data retention policies across the member states, an oxymoronic document, on the basis that any harmonization between member states must remain sensitive to the plurality of value structures embedded throughout the EU’s membership. He notes that:

On a practical level, the likelihood of seeing a consistent implementation of the rules across the EU is effectively zero. The timing of the implementation – due by September 15, 2007 – will certainly vary. 16 of the 27 EU Member States have already declared that they will delay the implementation of data retention of Internet traffic data for an additional period of 18 months, as permitted by the directive. The compulsory retention period for each type of data will also vary from country to country (e.g. Germany has proposed 6 months, the UK 12 months, and the Netherlands 18 months). The interpretation of other key elements, such as “serious crime,” “competent national authorities,” or “electronic communications services” will be different across jurisdictions too. (Source)

The EU is a novel political body: nothing of its scale and competencies has been successfully established since Rome, and, with the dramatic intensification of global processes and demands for respecting cultural, legal, linguistic, and religious dignities (to name a few), the EU must meet challenges that never faced Rome. Each of the member nations has long-standing legal traditions born of its unique constitution. Constitutions establish the basic law of nations, and these basic laws were created in light of a particular citizenry’s values and needs. As such, when reflecting on the actions of a supranational body it is important to realize that it must operate in a fashion that does not intentionally or unintentionally discriminate against its members and their constitutional traditions. The EU itself lacks a fully legitimized constitution, leaving Europe without a legal citizenship; until these factors change, all legal proceedings and directives must proceed cautiously to respect the dignity of the various members.

Fleischer goes on to raise a series of rhetorical and hypothetical questions intended to make the reader snort in amusement at the ineptitude and disarray of the EU’s position. He writes,

Is a country more democratic than its neighbour because of its shorter retention period? Or do the citizens of that country face a greater security risk for the same reason? If there is something about the data retention directive that can be called into question it is its proportionality – not necessarily in terms of financial cost to service providers, but in terms of privacy and anonymity loss. And what will Internet companies do in practice, especially if they operate one data architecture that cannot vary from one country to another: apply the longest retention period, or the shortest, or some “average”? (Source)

Rather than breezing past these statements and questions, let’s examine them. Is a country more or less democratic based on its retention periods? This presupposes that democracy is wholly grounded in anonymity, which is not the case. Democratic environments persist because individuals enjoy a reasonable expectation of privacy, which allows them a space to informally communicate and associate with other members of society without fearing illegitimate governmental oversight and supervision. Retention periods alone do not indicate whether individuals’ privacy is being invaded; it is retaining and analyzing archived data without clear purposes or citizen consent that threatens free speech in democratic states.

Fleischer follows this threat to democracy by asking whether democracies might be undermined if different assessments of the terrorist threat in different nations lead to divergent retention policies. While I can appreciate that this is a possible side-effect, this particular ‘threat’ can be avoided if member states clearly assert national laws that accord with the EU data retention directive. Moreover, where citizens feel that their national retention policy violates their basic rights, they can appeal to their national court system to try to overturn the law and, if that fails, turn to the European Court of Justice. While it’s ‘nice’ of Fleischer to look out for EU member nations, they do have a reasonably sophisticated system for dealing with inconsistencies between EU and member-nation law.

His last comment is perhaps the most confusing. Google, Yahoo!, and other major search companies routinely provide data to some localities and not to others using geographic filtering technologies. I fail to see why an ISP cannot similarly filter its retention policies, making retained French information available only to French authorities, retained German information only to German authorities, et cetera. Perhaps I’m missing a key element of this argument, but it certainly isn’t a simple problem that can be tossed off in a single sentence; for it to hold weight, it deserves more time and consideration.
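To illustrate the kind of per-jurisdiction filtering I have in mind, here is a minimal sketch. The country codes and retention periods are hypothetical examples echoing the proposed figures cited above (Germany 6 months, the UK 12, the Netherlands 18); they are not any company’s actual policy:

```python
from datetime import datetime, timedelta

# Hypothetical per-country retention windows, in months.
RETENTION_MONTHS = {"DE": 6, "UK": 12, "NL": 18}

def is_disclosable(record_country: str, requester_country: str,
                   recorded_at: datetime, now: datetime) -> bool:
    """A retained record is available only to the authorities of the
    country it was collected in, and only within that country's
    retention window (a month is approximated as 30 days here)."""
    if record_country != requester_country:
        return False
    months = RETENTION_MONTHS.get(record_country)
    if months is None:
        return False
    return now - recorded_at <= timedelta(days=30 * months)
```

Nothing in this sketch requires a data architecture that varies from country to country: one system can store records tagged by jurisdiction and apply each jurisdiction’s rules at disclosure and deletion time.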

I can certainly appreciate the risks involved in retaining and distributing digitized data; the potential harms to democratic participation motivate my thesis project, entitled Technology, Communication, and Western Pluralistic Democracies: Realigning Digital Privacy to Facilitate Citizen-Solidarity. Unfortunately, it appears as though Google’s defense of privacy is a thinly concealed attempt to justify its own retention policies rather than a genuine defense of constitutional rights or of the value of privacy to democratic nations.

Christopher Parsons

I’m a Postdoctoral Fellow at the Citizen Lab in the Munk School of Global Affairs at the University of Toronto and a Principal at Block G Privacy and Security Consulting. My research interests focus on how privacy (particularly informational privacy, expressive privacy and accessibility privacy) is affected by digitally mediated surveillance and the normative implications that such surveillance has in (and on) contemporary Western political systems. I’m currently attending to a particular set of technologies that facilitate digitally mediated surveillance, including Deep Packet Inspection (DPI), behavioral advertising, and mobile device security. I try to think through how these technologies influence citizens in their decisions to openly express themselves or to engage in self-censoring behavior on a regular basis.