Academic round-up on privacy and new technologies, including search engines

The End of the Net as We Know it? Deep Packet Inspection and Internet Governance
by Ralf Bendrath and Milton Mueller

Advances in network equipment now allow internet service providers to monitor the content of data packets in real time and make decisions about how to handle them. If deployed widely, this technology, known as deep packet inspection (DPI), has the potential to alter basic assumptions that have underpinned Internet governance to date. The paper explores the way Internet governance is responding to deep packet inspection and the political struggles around it. Avoiding the extremes of technological determinism and social constructivism, it integrates theoretical approaches from the sociology of technology and actor-centered institutionalism into a new framework for technology-aware policy analysis.

Keywords: Internet governance, Internet regulation, Deep Packet Inspection, Privacy, Surveillance, Censorship, Internet service providers, Actor-Centered Institutionalism, Disruptive technology, Socio-technical systems, Network Neutrality, Social Construction of Technology, Technological Determinism

The Legality of Deep Packet Inspection – Angela Daly
European University Institute – Department of Law (LAW)

Abstract:    

Deep packet inspection is a technology which enables the examination of the content of information packets being sent over the Internet. The Internet was originally set up using “end-to-end connectivity” as part of its design, allowing nodes of the network to send packets to all other nodes of the network, without requiring intermediate network elements to maintain status information about the transmission. In this way, the Internet was created as a “dumb” network, with “intelligent” devices (such as personal computers) at the end or “last mile” of the network. The dumb network does not interfere with an application’s operation, nor is it sensitive to the needs of an application, and as such it treats all information sent over it as (more or less) equal. Yet, deep packet inspection allows the examination of packets at places on the network which are not endpoints. In practice, this permits entities such as Internet service providers (ISPs) or governments to observe the content of the information being sent, and perhaps even manipulate it. Indeed, the existence and implementation of deep packet inspection may profoundly challenge the egalitarian and open character of the Internet.

This paper will first elaborate on what deep packet inspection is and how it works from a technological perspective, before examining how it is being used in practice by governments and corporations. The use of deep packet inspection has already created legal problems involving fundamental rights (especially those of Internet users), such as freedom of expression and privacy, as well as more economic concerns, such as competition and copyright. These issues will be considered, and an assessment will be made of the conformity of the use of deep packet inspection with law. There will be a concentration on the use of deep packet inspection in European and North American jurisdictions, where it has already provoked debate, particularly in the context of discussions on net neutrality. This paper will also incorporate a more fundamental assessment of the values that it is desirable for the Internet to respect and exhibit (such as openness, equality and neutrality), before concluding with the formulation of a legal and regulatory response to the use of this technology, in accordance with these values.

Keywords: deep packet inspection, net neutrality, US, EU, privacy, competition, free expression, copyright
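
To make concrete the distinction both DPI papers draw between a “dumb” network that forwards on headers alone and an inspecting intermediary that reads application payloads, here is a minimal Python sketch. It is not drawn from either paper: the packet layout is standard IPv4/TCP, but the signature strings and the throttle/block actions are purely illustrative.

```python
# Minimal sketch of header-only forwarding vs. deep packet inspection.
# Assumes a raw IPv4 packet carrying TCP; the signatures are hypothetical.

SIGNATURES = [b"BitTorrent", b"GET /blocked-site"]  # illustrative patterns

def tcp_payload(packet: bytes) -> bytes:
    """Skip the IPv4 and TCP headers to reach the application data."""
    ihl = (packet[0] & 0x0F) * 4        # IPv4 header length, in bytes
    tcp = packet[ihl:]
    data_offset = (tcp[12] >> 4) * 4    # TCP header length, in bytes
    return tcp[data_offset:]

def shallow_decision(packet: bytes) -> str:
    """A 'dumb' router consults only header fields, e.g. the destination."""
    dst = ".".join(str(b) for b in packet[16:20])  # IPv4 destination address
    return f"forward to {dst}"

def deep_decision(packet: bytes) -> str:
    """DPI additionally reads the payload and may discriminate on content."""
    payload = tcp_payload(packet)
    if any(sig in payload for sig in SIGNATURES):
        return "throttle or block"      # content-based treatment
    return "forward"
```

The contrast is architectural: shallow_decision needs no knowledge of what a packet says, while deep_decision is precisely the intermediate-node “intelligence” that the end-to-end design left out.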

The Boundaries of Privacy Harm – M. Ryan Calo
Stanford Law School

Abstract:    

Just as a burn is an injury caused by heat, so is privacy harm a unique injury with specific boundaries and characteristics. This Essay describes privacy harm as falling into two related categories. The subjective category of privacy harm is the unwanted perception of observation. This category describes unwelcome mental states – anxiety, embarrassment, fear – that stem from the belief that one is being watched or monitored. Examples include everything from a landlord listening in on his tenants to generalized government surveillance.

The objective category of privacy harm is the unanticipated or coerced use of information concerning a person against that person. These are negative, external actions justified by reference to personal information. Examples include identity theft, the leaking of classified information that reveals an undercover agent, and the use of a drunk-driving suspect’s blood as evidence against him.

The subjective and objective categories of privacy harm are distinct but related. Just as assault is the apprehension of battery, so is the unwanted perception of observation largely an apprehension of information-driven injury. The categories represent, respectively, the anticipation and consequence of a loss of control over personal information.

The approach offers several advantages. It uncouples privacy harm from privacy violations, demonstrating that no person need commit a privacy violation for privacy harm to occur (and vice versa). It creates a “limiting principle” capable of revealing when another value – autonomy or equality, for instance – is more directly at stake. It also creates a “rule of recognition” that permits the identification of a privacy harm when no other harm is apparent. Finally, the approach permits the sizing and redress of privacy harm in novel ways.

Privacy and Regulatory Innovation: Moving Beyond Voluntary Codes – Ira Rubinstein
Information Law Institute, NYU School of Law

NYU School of Law, Public Law Research Paper No. 10-16
I/S: A Journal of Law and Policy for the Information Society, Forthcoming Winter 2011

Abstract:    

According to its many critics, privacy self-regulation is a failure. It suffers from weak or incomplete realization of Fair Information Practice Principles, inadequate incentives to ensure wide-scale industry participation, ineffective compliance and enforcement mechanisms, and an overall lack of transparency. Rather than attacking or defending self-regulation, this Article explores co-regulatory approaches in which government plays a role in setting requirements for industry guidelines and imposing sanctions for non-compliance. Based on three case studies – of a weakly mandated industry code aimed at online behavioral advertising practices, a more strongly mandated program enabling data flows between Europe and the US, and a safe harbor program designed to protect children’s privacy – this Article argues that statutory safe harbors have many strengths but would benefit from being redesigned. Next, it conceptualizes new models for privacy co-regulation based on insights derived from “second generation” environmental policy instruments such as environmental covenants. Finally, it offers specific recommendations – to the FTC, on how it might begin to use the covenanting approach to experiment with innovative technologies and address hard problems such as online behavioral advertising, and to Congress, on how best to structure new safe harbor programs as an essential component of omnibus consumer privacy legislation. All of these approaches to regulatory innovation move beyond purely voluntary codes in favor of co-regulatory solutions.

The Mandatory Registration of SIM Cards – Ewan Sutherland
LINK Centre, University of the Witwatersrand; CRID, University of Namur

March 8, 2010

Computer and Telecommunications Law Review, pp. 61-63, 2010

Abstract:    

To support investigations into a range of crimes, governments and regulators are requiring mobile operators to register all pre-paid SIM-cards by collecting personal details to be made available in a central database to police and security services. Unregistered SIM-cards are being barred. Brief descriptions are provided of the schemes in Botswana, Jordan, Kenya, Nigeria, Pakistan, Singapore, Tanzania and Vietnam.

Trawling DNA Databases for Partial Matches: What is the FBI Afraid of? – David H. Kaye
The Pennsylvania State University Dickinson School of Law

Cornell Journal of Law and Public Policy, Vol. 19, No. 1, 2009
Penn State Legal Studies Research Paper No. 8-2010

Abstract:    

DNA evidence is often presented as the “gold standard” for forensic science. But this was not always the case. For years, eminent scientists complained that the estimates of the tiny frequencies of DNA types were unfounded. It took scores of research papers, dozens of judicial opinions, and two committees of the National Academy of Sciences to resolve the dispute by the mid-1990s. Since 2000, however, reports have surfaced of shocking numbers of “partial matches” among samples within large DNA databases, and some scientists have complained that the infinitesimal figures used in court to estimate the probability of a random match are no better than alchemy. To study the partial-match phenomenon further, defendants have sought to discover all the DNA records (with personal identifiers removed) kept in offender databases. The FBI has responded by branding the proposed research as useless and the release of the data as an illegal invasion of privacy. The media have reacted by calling for congressional hearings and, possibly, criminal charges against FBI officials.

This Article reviews the existing research findings and considers the scientific, legal, and ethical objections to disclosure of the DNA data. It concludes that the arguments against further research are unpersuasive. At the same time, it finds that the claims of dramatic departures from the expected numbers of partial matches are exaggerated and predicts that new research will not reveal unknown flaws in the procedure for estimating the chance of a match to an unrelated individual. In view of the importance of DNA evidence to the criminal justice system, this Article recommends using the databases for more statistical research than has been undertaken so far. It also calls for dissemination of the anonymized records for this purpose.

Keywords: DNA evidence, probability, population genetics, DNA databases, birthday paradox, partial match
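
The “birthday paradox” in Kaye’s keywords is doing real work in the argument: the number of pairwise comparisons in a database grows quadratically with its size, so even a tiny per-pair match probability produces many expected partial matches. A back-of-the-envelope sketch (the database size and per-pair probability below are invented for illustration, not the Article’s figures):

```python
# Why large DNA databases produce many partial matches: the number of
# pairwise comparisons grows as n*(n-1)/2 (the birthday-paradox effect).

def expected_partial_matches(n_profiles: int, p_pair: float) -> float:
    """Expected matching pairs across all pairwise comparisons."""
    n_pairs = n_profiles * (n_profiles - 1) // 2
    return n_pairs * p_pair

# Hypothetical inputs: 65,000 profiles and a one-in-a-million chance that
# any two unrelated profiles partially match.
print(expected_partial_matches(65_000, 1e-6))  # ~2112 expected matches
```

On these invented numbers, a couple of thousand partial matches would be entirely unremarkable, which is the sense in which reports of “shocking numbers” can overstate any departure from expectation.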

Privacy in Search Engines: Negotiating Control – Federica Casarosa
Robert Schuman Centre for Advanced Studies

Abstract:    

The Internet and, more generally, modern communication technologies have radically transformed contemporary society, bringing with them new risks for citizens’ privacy. The tangible effects of this technological progress have been, on the one hand, improved tools for the retrieval and collection of data and, on the other, an increased capacity to store and aggregate the collected information. This can be read positively, as offering far greater and better opportunities for the development of personality by making available information that was previously inaccessible (because of the high cost or effort needed to obtain it). However, the same technical tools can be used to achieve the opposite result: to prevent the expression of users’ personality through continuous, though imperceptible, monitoring that could shift the interpretation of user profiles from a pre-judgment into a prejudice.

From a legal point of view, different solutions, stemming from different approaches, have been put forward. On the one hand, there is self-regulation, where technology itself can help limit the aforementioned risks to personal data; on the other, there is the legislative harmonization implemented by the Member States of the EU, where monitoring is carried out by independent authorities, the so-called Data Protection Authorities.

A recent example illustrating both the market dynamics and the legal reactions concerning data protection is the search engine Google, which received a brief but significant letter from the Art. 29 Working Party (hereinafter Art. 29 WP) over the low level of protection assured by the Mountain View company in the delivery of its services. The intervention, though not binding, was the first step for Google toward improving its data protection policy so as to reach the level required by European legislation.

Keywords: privacy, search engines, internet, private regulation

Search Query Privacy: The Problem of Anonymization – Ron A. Dolin

Hastings Science and Technology Law Journal, Vol. 2, No. 2, p. 137, Summer 2010

Abstract:    

Search queries may reveal quite sensitive information about the querier. Even though many queries are not directly associated with a particular person, it has been argued that users’ IP addresses and cookies can often be sufficient to figure out who the querier is, especially if tied to information from ISPs regarding IP address assignments at the time of the relevant query. Given that queries have been subject to discovery both by various governments and by third parties, there has been great concern about how to keep such queries private. A typical approach in such privacy legislation, especially in Europe, has been to require either destruction of the data, so that it is no longer available for discovery, or anonymization, so that it cannot be associated with a particular person. This solution has never been proposed for personal data such as medical information used by doctors or financial information used by credit agencies. Instead, there seems to be an assumption about these types of data that their long-term storage is necessary and/or beneficial to the individual associated with them, or at least to society at large. The framework for maintaining the privacy of these data turns on safeguards where they are held, user control over their retention and accuracy, and strict legal limitations on their discovery. This article briefly reviews a few legal frameworks for data protection in both the U.S. and Europe. It presents several arguments that the deletion or anonymization of search query data is problematic, and it describes a framework, similar to the way we handle health data, that is more beneficial to all stakeholders. Such an approach would lead to a more uniform solution to data protection, in which maintaining search query privacy would not sacrifice the benefits of long-term, confidential storage of the data.
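
Dolin’s skepticism about anonymization is easy to illustrate. Here is a minimal sketch (not from the article; the salt, IP addresses, and queries are invented) of a naive scheme that replaces each IP address with a salted hash: the pseudonym is deterministic, so it still links all of a user’s queries together, and the query content itself can re-identify the person, as the well-known 2006 AOL log release demonstrated.

```python
# Naive query-log "anonymization": replace the IP with a salted hash.
# The hash is deterministic, so it still groups each user's queries,
# and the queries themselves can identify the person. Data is invented.
import hashlib

def pseudonymize(ip: str, salt: bytes = b"2010-log-release") -> str:
    return hashlib.sha256(salt + ip.encode()).hexdigest()[:12]

log = [
    ("203.0.113.7",  "pizza near 10 main st springfield"),
    ("203.0.113.7",  "john q public tax lien"),
    ("198.51.100.2", "weather boston"),
]

for ip, query in log:
    # Same user -> same token, so the profile survives "anonymization".
    print(pseudonymize(ip), query)
```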

Privacy Revisited – GPS Tracking as Search and Seizure – Bennett L. Gershman
Pace University – School of Law

Abstract:    

Part I of this Article discusses the facts in People v. Weaver, the majority and dissenting opinions in the Appellate Division, Third Department, and the majority and dissenting opinions in the Court of Appeals. Part II addresses the question that has yet to be decided by the U.S. Supreme Court – whether GPS tracking of a vehicle by law enforcement constitutes a search under the Fourth Amendment. Part III addresses the separate question that the Court of Appeals did not address – whether the surreptitious attachment of a GPS device to a vehicle constitutes a seizure under the Fourth Amendment. The Article concludes that law enforcement’s use of a GPS device to track the movements of a vehicle continuously for an extended period of time is a serious intrusion into a motorist’s reasonable expectation of privacy that constitutes a search under the Fourth Amendment. Moreover, although the issue is somewhat murkier, the attachment of the GPS to a vehicle may constitute a seizure under the Fourth Amendment.

Biometrics, Retinal Scanning, and the Right to Privacy in the 21st Century – Stephen Hoffman
University of Minnesota – Twin Cities – School of Law

Syracuse Science and Technology Law Reporter, 2010

Abstract:    

Biometric identification techniques such as retinal scanning and fingerprinting have now become commonplace, but near-future improvements on these methods present troubling issues for personal privacy. For example, retinal scanning can be used to diagnose certain medical conditions, even ones for which the patient has no symptoms or any other means of detecting the problem. If a health insurance company scans the retinas of potential clients before they purchase coverage, applicants could be charged higher premiums for conditions that have not yet caused any problems. Not only is this unfair, but the ease with which these scans can be conducted – including scanning without the subject’s consent or knowledge – presents disturbing privacy concerns and suggests an Orwellian future, controlled by Big Business rather than Big Brother.

Keywords: biometrics, biometric identification, retinal scanning

Othello Error: Facial Profiling, Privacy and the Suppression of Dissent – Lenese C. Herbert

Ohio State Journal of Criminal Law, Vol. 5, pp. 79-129, 2007

Abstract:    

In this article, Professor Herbert challenges the U.S. Transportation Security Administration’s post-September 11, 2001, use of Paul Ekman and Wallace Friesen’s Facial Action Coding System (FACS) to identify potential terrorists in American airports. Professor Herbert asserts that invasive visual examination of travelers’ faces and facial expressions for law enforcement purposes under the auspices of protective administrative searches ineffectively protects national and airport security and violates reasonable expectations of privacy. FACS improperly provides unreasonable governmental activity with a legitimizing scientific imprimatur that conceals governmental agents’ race- and ethnicity-based prejudices, which leads to targeting minorities’ faces as portents of danger. Professor Herbert assesses the concept of facial privacy in public and, in doing so, rejects the Supreme Court’s Katz v. United States test and argues in support of constitutional protection of public privacy.

Privacy by Deletion: The Need for a Global Data Deletion Principle – Benjamin J. Keele

Indiana Journal of Global Legal Studies, Vol. 16, No. 1, pp. 363-384

Abstract:

With global personal information flows increasing, efforts have been made to develop principles to standardize data protection regulations. However, no set of principles has yet achieved universal adoption. This note proposes a principle mandating that personal data be securely destroyed when it is no longer necessary for the purpose for which it was collected. Including a data deletion principle in future data protection standards will increase respect for individual autonomy and decrease the risk of abuse of personal data. Though data deletion is already practiced by many data controllers, including it in legal data protection mandates will further the goal of establishing an effective global data protection regime.

A Paradigm Shift in Electronic Surveillance Law – Mark Klamberg
Nordic Yearbook of Law and Information Technology, 2010

Abstract:    

Electronic surveillance law is subject to a paradigm shift in which traditional principles are being reconsidered and the notion of privacy has to be reconstructed. This paradigm shift is the result of four major changes in our society with regard to: 1) technology; 2) perceptions of threats; 3) the interpretation of human rights; and 4) ownership of telecommunications. These changes have created a need to reform both the tools of electronic surveillance and domestic legislation. Surveillance that was previously kept secret is now subject to public debate. The article focuses on systems of “mass surveillance”, such as data retention and signal intelligence, and asks whether these are consistent with the European Convention on Human Rights.

Keywords: Electronic Surveillance, Privacy, Signal Intelligence

New Challenges to Data Protection Study – Country Report: United States – Chris Jay Hoofnagle

European Commission Directorate-General Justice, Freedom and Security Report, May 2010

Abstract:    

This report is one of 11 country reports produced for the “New Challenges to Data Protection” study, commissioned by the European Commission, and describes the ways in which US law addresses the challenges posed by the new social-technical-political environment.

The hallmark of the US federal approach to privacy is sectoral regulation. A panoply of statutes now regulates specific types of government and business practices, with no broadly applicable privacy statute governing data collection, use, or disclosure. The Federal Trade Commission has encouraged self-regulation in a number of sectors, along with the development of privacy-enhancing technologies. The US approach to privacy is incoherent, sectorally based, and largely driven by outrage at particular, narrow practices. Still, several innovations from the US approach deserve international attention.

First, increasingly, privacy statutes create evolving standards of care, thus encouraging innovation for handling of data and avoiding the reification that can result from prescriptive, detailed regulation. For instance, the Fair Credit Reporting Act mandates an evolving “maximum possible accuracy” standard.

Second, in the direct marketing context, the US has imposed advertiser liability for violations of telemarketing, fax, and spam laws. This is a promising approach to addressing the difficult-to-identify and difficult-to-prosecute service providers that are responsible for illegal marketing campaigns.

Third, audit requirements for access to personal information have had a profound effect in encouraging industry and citizen policing of privacy violations. Audit logs have substantiated long-suspected privacy problems regarding the “browsing” of files and news media access to celebrities’ medical records.
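
The report’s third point is worth a concrete illustration. Here is a generic access-audit sketch (not a mechanism described in the report; the file name and record fields are invented): once every read of a personal record is logged with an identity and a stated reason, “browsing” becomes detectable after the fact.

```python
# Generic access-audit sketch: every read of a personal record is logged,
# so unjustified "browsing" can be detected in later review.
import datetime
import json

AUDIT_LOG = "access_audit.jsonl"  # hypothetical append-only log file

def log_access(staff_id: str, record_id: str, reason: str) -> None:
    entry = {
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "staff": staff_id,
        "record": record_id,
        "reason": reason,
    }
    with open(AUDIT_LOG, "a") as f:   # append-only by convention
        f.write(json.dumps(entry) + "\n")

# Review then becomes a query over the log: e.g., list everyone who opened
# a given record and compare the stated reasons against actual duties.
```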

Fourth, the US has briefly experimented with “data provenance,” a requirement that buyers of personal information exercise diligence to ensure against misuse of data. Data provenance responsibilities can create incentives to reduce gray and black market sales of personal information.

Finally, most federal privacy law acts as a floor of protections, allowing states to enact stronger rules. This has created a tension between state and federal governments, resulting in a leveling up of protections, because states (which tend to be more activist on privacy issues) can act where the US Congress is occupied with other issues.

Internet Jurisdiction and Data Protection Law: An International Legal Analysis – Christopher Kuner

International Journal of Law and Information Technology, 2010

Abstract:    

Data protection law has been the subject of an increasing number of jurisdictional disputes, which have largely been driven by the ubiquity of the Internet, the interconnectedness of the global economy, and the growth of data protection law around the world in recent years. There are also an increasing number of instances where data protection law conflicts with legal obligations in other areas. Moreover, the rapid development of new computing techniques (such as so-called ‘cloud computing’) is putting even greater pressure on traditional jurisdictional theories. Jurisdictional uncertainties about data protection law have important implications, since they may dissuade individuals and companies from engaging in electronic commerce, can prove unsettling for individuals whose personal data are processed, and impose burdens on regulators. These difficulties are increased by the fact that, so far, there is no binding legal instrument of global application covering either jurisdiction on the Internet or data protection. This article examines international jurisdiction as it relates to data protection law, and specifically to instances in which jurisdiction under data protection law may be considered ‘exorbitant’, with a particular focus on rules of public international law.

Developing an Adequate Legal Framework for International Data Transfers – Christopher Kuner

Reinventing Data Protection?, S. Gutwirth et al., eds., pp. 263-273, Springer Science+Business Media B.V., 2009

Abstract:    

With the EU Data Protection Directive having been in force now for nearly ten years, it is wise to examine the basic concepts and assumptions on which the Directive is based, to determine whether it is functioning properly. It is the thesis of this paper that the present EU legal framework for “adequacy” decisions for the international transfer of personal data is inadequate, in both a procedural and substantive sense, and needs reform. The framework was created for a world in which the Internet was not widely used, and in which data did not flow as easily across national borders as they do now. The present system of adequacy decisions has been grievously overloaded by the great increase in data flows in the past few years, and also drains resources that could better be used in other areas of data protection. European policymakers should take a hard look at the current adequacy system and its present failings, and reform the system in a way that more effectively protects the interests of data controllers, individuals, and data protection supervisory authorities.

Keywords: European Union, data protection, privacy, adequacy, international data transfers, global data flows, APEC, accountability
