It’s been several months now since my first article on the general subject of data and information in the law. Normally, I could attribute the gap to typical publishing lag, an overbooked 2L year, and an overly inquisitive (read: easily distractible) mind. But, in this case, I am writing about perhaps the most visible aspect of the topic: the privacy of personal information in an increasingly connected society. Given the growing focus on the topic in academia, government, and the media, it has been difficult to keep up with all of the recent developments. Even the turn of the New Year – often a good time for a retrospective look – hasn’t slowed the pace. There have been some significant developments in the legal and regulatory world; a small selection of the most notable news includes:
• At the start of December, the Federal Trade Commission released a proposed framework of “Fair Information Practice Principles” for commercial entities that focuses on integrating privacy into every stage of design, simplifying consumer choice, and increasing the transparency of data practices. The FTC also announced the conclusion of an enforcement action against the advertiser EchoMetrix for failing to disclose clearly to parents what data it gathered about their children.
• United States v. Warshak, et al., --- F.3d ----, 2010 WL 5071766 (6th Cir. Dec. 14, 2010) found a reasonable expectation of privacy in email records, requiring a warrant under the Fourth Amendment to compel disclosure of communication stored at or transmitted through an ISP, regardless of how long it had been stored. The case directly addressed a user’s communications, but the expectation of privacy, as a principle, may extend to other types of identifying information.
• President Obama signed the Restore Online Shoppers’ Confidence Act into law on December 29th. The act aims to curb potentially abusive unilateral aspects of online transactions: specifically, it prohibits vendors from silently passing a customer’s transaction and personal information to a third party for fulfillment, and it restricts negative-option marketing.
• The Equal Employment Opportunity Commission’s final rule implementing Title II of the Genetic Information Nondiscrimination Act went into effect January 10. The rule broadens and clarifies what types of information an employer must avoid seeking or storing about its employees.
These actions generally represent the culmination of lengthy research and discussion efforts, which include a variety of stakeholders and address many aspects of the online information ecosystem. For instance, the FTC focuses more on enforcement than the Commerce Department does, and so seeks to provide businesses with a concrete framework – most notably a proposed “Do Not Track” mechanism allowing users to opt out of behavioral tracking.
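In practice, a browser-signaled opt-out of the kind the FTC contemplated can be as simple as an HTTP request header that servers are expected to check before enabling tracking. A minimal sketch in Python, assuming the “DNT” header convention then under discussion (the function name and handling logic are illustrative, not any regulator’s or vendor’s actual specification):

```python
def tracking_allowed(headers: dict) -> bool:
    """Return False if the request carries a Do Not Track opt-out signal.

    HTTP header names are case-insensitive, so normalize before lookup.
    By convention, a value of "1" means the user opted out of
    behavioral tracking; any other value, or no header, means no signal.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") != "1"

print(tracking_allowed({"DNT": "1"}))         # opted out -> False
print(tracking_allowed({"User-Agent": "x"}))  # no signal -> True
```

The policy weight, of course, falls entirely on the server operator’s willingness to honor the signal – the header itself enforces nothing.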
There is a lot to highlight about these developments. First and foremost, a number of groups are at least attempting to focus on some of the central issues that we face going forward. For instance, it is easy to point out that user agreements and privacy policies tend to be overwhelmingly long and dense. But our ability to remedy this will be substantially strengthened by the acknowledgment that these statements are not merely obscure legalese: they tend to serve primarily to protect service providers from liability, not to protect or even notify users.
Along with statutory safe harbors to remove some of the technicalities, a national or international standard requiring notification of security breaches can spur proper precaution through market forces and solid business practice. Further, “privacy by design” encourages agencies and companies alike to consider processes from the ground up, and signifies some recognition that, in addressing these problems, we may sometimes need to start over rather than work incrementally.
The Wall Street Journal, for instance, has added to its “What They Know” series of articles since I wrote last. The series as a whole is a relatively thorough, if somewhat alarmist (and ironic, considering the number of trackers on the WSJ site), attempt to document and explain the amount and types of information that advertisers, websites, mobile phone companies, and service providers have about the average web user. In addition to technology, the narrative includes the motivation and reasoning behind such extensive data collection – including business plans and profiles of some major and minor players in the industry. In its December installment the series turned its focus to some of the more complex “device fingerprinting” techniques, which do not require storing anything on the user’s device. A related blog post elaborates on how these technologies are beginning to be used outside the fraud investigation context in which they originated.
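The basic mechanism is straightforward to illustrate: rather than storing a cookie, the tracker hashes a handful of attributes the browser already reports into a stable identifier. A minimal sketch in Python – the attribute names and values below are hypothetical examples, not any vendor’s actual signal set:

```python
import hashlib

def device_fingerprint(attrs: dict) -> str:
    """Derive a stable identifier from passively observed attributes.

    Nothing is stored on the user's device: a browser reporting the
    same attributes yields the same hash on every visit. Sorting the
    keys makes the result independent of attribute order.
    """
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Illustrative attributes a tracking script might observe (hypothetical values).
visit = {
    "user_agent": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.16",
    "screen": "1280x800x24",
    "timezone_offset": "-300",
    "fonts": "Arial,Courier New,Times New Roman",
    "plugins": "Flash 10.1,Java 1.6",
}

print(device_fingerprint(visit))
```

Because the identifier is recomputed on each visit rather than stored, clearing cookies does nothing to it – which is precisely why it complicates the cookie-centric regulatory debate discussed below.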
This illustrates two major public policy difficulties that we face in any attempt to regulate the collection of information. First, the fact that several of the leading companies in the field have spun off from anti-fraud and anti-piracy firms highlights the difficulty of distinguishing between desired and unwanted tracking. To the extent that privacy is treated as absolutely protected, companies may be unable to trace, or even discover, criminal and fraudulent activity. In fact, many will be unable to sustain their business models, whether due to this liability or simply due to potential decreases in advertising revenue. Second, device fingerprinting is a prime example of technology developing much more quickly than regulatory schemes. In the case of tracking “cookies” specifically, technologies with no direct relation to marketing or privacy have been applied so rapidly that the entire “cookies” debate may be moot by the time the regulatory system even begins to address it.
Distinguishing desired from unwanted information gathering causes problems for all parties, not just regulators and lawmakers. Users, even those who understand how to implement a given preference, face difficult decisions regarding how to use the internet, their phones, and even their ordinary appliances. While most people likely prefer to be tracked as little as possible, paying to use currently free services is a substantial burden to bear, and having personalized, relevant ads and content provided with no conscious effort really is a benefit. Younger generations, typically considered especially vulnerable, appear increasingly willing to share personal details with their most distant acquaintances as well as with advertisers. People often have little concept of exactly what trade-offs they may be making, and the new government proposals largely address this by demanding clear and transparent notice of privacy practices. However, treating these proposals as major progress side-steps the larger issue – I would argue that perhaps the most substantial policy problem lies in finding a way to convey, even partially, the possible benefits and risks a consumer faces. It may be best summarized in an email Call for Papers I recently received for a special “Pervasive Intelligibility” workshop at the upcoming Ninth International Conference on Pervasive Computing, which will focus entirely on the need for technological systems to be “scrutable” to users – “to improve the usability of these novel, and possibly unintuitive, systems and to help users understand, appreciate, trust, and ultimately adopt them.” As long as it remains so difficult to make counter-intuitive technology itself both secure and intelligible, “clear” legal judgment about its use will not be practical.
The service provider, too, faces a wide range of choices and uncertainties. Corporate directors may face a complicated due diligence burden when performing otherwise routine business activities, to assure that private data is not only kept secure, but is limited to precisely specified purposes. On the other hand, major regulations such as the Gramm-Leach-Bliley Act currently contain exceptions (at 15 U.S.C. 6821(d) or (e), for example) for financial or insurance institutions to bend some privacy rules in order to root out fraud and other crimes. How do such exceptions apply in cases like those discussed in the WSJ series, where the same party investigating fraud is also collecting or providing marketing information? Furthermore, many of the proposed frameworks include an exception for information collected for “legitimate business reasons.” One of the characteristics that distinguish the Information Age from earlier times is the ability to “mine” data that is already collected for unrelated purposes. At any time, though, political bodies might decide that a particular use of data is abusive, rather than astute business practice, thus destroying a potentially significant investment.
A regulatory system could easily err in either direction on several different levels. Regulations that are too rigid might fail to proscribe unwanted but sufficiently creative conduct, or might punish innocent, harmless business practices that run afoul of a technicality – even one that did not exist when the data was first collected. On the other hand, basing liability too heavily on intent leads to problems in situations of rapid growth: given the exponential pace of technological progress, few major innovations are perfectly foreseeable, and thus few are intentional in a legal sense. With so much information already collected in so many different forms, regulations cannot be restricted to data collection originally intended to create a profile.
Overall, we face a difficult task in an unknown environment. We must simplify choices for users, but we must also be more detailed and transparent regarding business practices; we must take privacy into account throughout the innovation life cycle, but we must allow for ever-changing definitions of privacy as a result of that innovation. And we must continue to balance, as a society, our need for crime prevention with concerns for freedom and privacy in a faster, more connected, global society. Comments on the FTC and Commerce Department papers are now available, but the true complexity of the regulation of personal information remains to be seen.