
 

 

 

Privacy Protection and

the Interdependence of Law, Technology and Self-Regulation

 

Joel R. Reidenberg[1]

 

 

 

 

Introduction

I.          Distinct Regulatory Models of Data Protection

II.          The Inadequacy of Distinct Regulation in the Information Society

III.         The Privacy Interdependence Model

Conclusion

 

 

Introduction

 

            A new paradigm has emerged for the effective protection of personal information in the online environment of the Internet and the Information Society.  While data protection laws have spread to a significant number of countries around the world during the last twenty years, the divergence in national laws and the proliferation of transborder data processing challenge the enforcement of existing legal standards.  At the same time, technical capabilities have developed that both enable and constrain the ability of law to assure the fair treatment of personal information.  In effect, legal regulation shares rule-making authority with technological standards and protocols.  For the treatment of personal information, the most direct regulation of information processing comes from the technological rules built into network infrastructures by industry rather than from law itself.  Indeed, the architecture of information networks establishes default rules for information processing. 

 

This paper, thus, explores the complex interdependence among law, technology and industry practice.   Drawing on the American and European experiences in data protection, the paper proposes that, for the Internet, law must provide an incentive for technological developments that advance privacy-protective technologies.  The paper argues that law must further create the conditions that promote the deployment of privacy-protective technologies and system designs by industry. In a democratic society, rule-making through technology must be shaped by public policy goals and public debate.  Law is, thus, necessary to establish the public policy objectives, but insufficient to assure the implementation of fair information practices.

 

I.          Distinct Regulatory Models of Data Protection

 

The rules for data protection come from three distinct perspectives: political, economic, and technological.  In Europe, data protection is an inherently political right and focuses on legal mechanisms to guarantee respect for a fundamental human right to privacy.[2]  By contrast, in the United States, information privacy is left to the marketplace, reflecting a preference for market-based protections for consumers.[3]  Across these two policy models of data protection, technological rules and defaults define information practices for network interactions.[4]

            In Europe, the political perspective on data protection insists that citizens have a fundamental human right to the fair treatment of their personal information.  This right of ‘informational self-determination’ is an integral component of democratic society.  Informational self-determination emphasizes the associational rights of citizens and defines a basic right of the citizen to control the collection and use of personal information.  The political rights model seeks comprehensive legal rules through data protection legislation.  As a result, modern European data protection laws impose a complete set of standards for the fair treatment of personal information ranging from finality to access and enforcement.  Although the specific terms and interpretations of these laws may vary, the underlying principles share the common view that data protection is a basic human right that must be guaranteed by the state.

The United States takes the opposite approach, adopting an economic balancing rather than a political basis for information privacy.  The American approach views the state more skeptically and prefers to let citizens fend for themselves.  Under the economic approach, self-regulation largely determines information privacy.  Industry codes of conduct and corporate practices are favored over law.  Data protection becomes a question of economic power rather than political right.  Indeed, the debate is typically characterized in terms of “consumers” rather than “citizens.”  In this approach, law only intervenes on a narrowly targeted basis to solve specific issues where the marketplace is perceived to have failed.  Ad hoc sectoral statutes, thus, address only an eclectic set of problems.  Drug abusers, for example, have stronger protection than web users, and video rental titles must be held confidential, though medical records can be disclosed.[5]

            Independent of these two models of data privacy, the Lex Informatica or “code” approach regulates through the technical rules embedded in network architecture.[6]  The technical standards and protocols as well as the default settings chosen by system developers set threshold information privacy rules.  These technical rules define the capabilities of networks such as the Internet to invade or protect privacy.  For example, anonymous Internet use may be built into the network structure, just as surveillance tracking may be.

            Historically, the three models (political, economic and technical) have each sought to segment the regulation of fair information practices.  The political perspective insisted on law as the principal mechanism to assure data protection, while the economic perspective insisted on the marketplace as the arbiter of privacy protections.  At the same time, the technical approach has built rules directly into the transmission of data.  These different approaches are typically viewed as either self-sufficient or as substitutes for one another.  For example, the transatlantic dialogue for many years has described comprehensive law and political rights as the alternative to industry code and market decisions.[7]  At the same time, the technical community has pursued its own standardization processes and purported to embrace a certain degree of policy-neutrality.[8]  Yet, these different models are neither self-sufficient nor complete alternatives to one another.

 

II.        The Inadequacy of Distinct Regulation in the Information Society

 

Each of the distinct forms of regulation embodies inherent limitations that preclude it from adequately protecting privacy.  Lex Informatica can build the capability for either privacy-protective or privacy-invasive infrastructures.  However, standing alone, the technical approach does not assure that deployment will respect fair information practices.  The U.S. economic marketplace model minimizes or leaves aside important aspects of information privacy such as non-market democratic values, while the European comprehensive legislative model faces significant context-specific problems.  At the same time, information privacy has critical international dimensions that the political, market and technical models cannot singly resolve.

The Lex Informatica model suffers from the absence of a representative public policy debate and from the commercial pressure toward technical structures that maximize data collection and dataveillance.[9]  Several key examples reflect this current weakness.  The US government’s privatization of the domain name system through the Internet Corporation for Assigned Names and Numbers (“ICANN”) largely ignored the privacy considerations inherent in the design of the new domain name registration protocol.[10]  Indeed, the registration protocol and process require the online publication of information about registrants that implicates basic data protection principles.  The system design precluded the option of anonymous domain name registration.  Similarly, the Internet Engineering Task Force (“IETF”) is hard at work designing a new Internet transmission protocol, IPv6.[11]  This protocol contemplates that every device connected to the Internet will have a unique identifier—a type of digital fingerprint for Internet users.  From an engineering standpoint, there may be important advantages to digital fingerprints, but from the privacy perspective, such an architecture is deeply troubling.  Significantly, these decisions are being made by the community of interested engineers at the IETF[12] rather than by a combination of engineers and policy makers.
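The privacy stakes of such a design can be made concrete.  The following sketch, in Python and using a hypothetical hardware address, illustrates how the EUI-64 scheme associated with IPv6 stateless address autoconfiguration derives an interface identifier from a device’s built-in MAC address; because the identifier follows the hardware, the same user can be recognized across every network the device joins.

    def eui64_interface_id(mac: str) -> str:
        """Derive the EUI-64 interface identifier that IPv6 stateless
        autoconfiguration can embed in a device's address (RFC 2373)."""
        octets = [int(part, 16) for part in mac.split(":")]
        octets[0] ^= 0x02                    # flip the universal/local bit
        eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]   # splice in FF:FE
        return ":".join(f"{eui64[i] << 8 | eui64[i + 1]:04x}"
                        for i in range(0, len(eui64), 2))

    # A hypothetical network card address; the derived identifier is
    # globally unique and persists wherever the device connects.
    print(eui64_interface_id("00:a0:c9:12:34:56"))   # 02a0:c9ff:fe12:3456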

While technical architecture decisions are often made in esoteric fora, major products are also frequently developed in a policy-myopic fashion.  Commercial pressures push developers and implementers toward products that collect as much information about users as possible.  One-to-one market customization and data security imperatives each seek detailed information about individuals and their network interactions.  Typically, these “data creep” functions are either non-transparent to the user or incomprehensible.[13]  In effect, these technical decisions hide important policy issues for privacy.  For example, system servers routinely maintain log files containing traffic data on user behavior.  These files are valuable for system maintenance, but also enable massive tracking of individuals.  Yet, the important policy decisions about whether log files will be maintained anonymously or whether they will be deleted promptly are usually hidden from public scrutiny.[14]  Likewise, search engines are powerful tools for users to find information on the Internet.  However, they also provide striking surveillance capabilities.  DejaNews and HotBot apparently configured their search engines to relay search-string information to third parties.[15]  Other popular software contained hidden features that enabled user tracking to a surprising degree.  RealNetworks even built a ‘phone home’ feature into its streaming audio player.[16]  Each of these examples illustrates the power that private organizations have to establish information privacy rules for individuals and an inevitable weighting of commercial interests over general public concerns.
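The log-file decision just described reduces to a few lines of code.  The following sketch, in Python and built around a fabricated log entry, shows one common anonymization choice: masking the low-order part of the client address so that the entry still supports maintenance and aggregate traffic analysis but no longer points to a single machine.  Whether anything of this kind runs before a log is stored is precisely the policy decision that remains hidden from users.

    import re

    # A fabricated entry in the Common Log Format that web servers write
    # by default; the client address plus the requested page enable
    # long-term tracking of an individual's reading habits.
    entry = ('192.0.2.41 - - [28/Dec/2000:10:32:07 +0100] '
             '"GET /health/hiv-faq.html HTTP/1.0" 200 5120')

    def anonymize(line: str) -> str:
        """Zero the final octet of the client address before storage."""
        return re.sub(r'^(\d+\.\d+\.\d+)\.\d+', r'\1.0', line)

    print(anonymize(entry))
    # 192.0.2.0 - - [28/Dec/2000:10:32:07 +0100] "GET /health/hiv-faq.html HTTP/1.0" 200 5120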

            The U.S. model has a parallel set of limitations.  The reliance on self-regulation to let the market determine the protection of privacy minimizes the non-economic implications of data protection.[17]  Specifically, privacy is a central element of democratic governance and a deeply humanistic value.[18]  Basic elements of democracy and human dignity lend themselves poorly to an economic marketplace.  Even beyond this inherent limitation, a citizen’s ability to act in a privacy market will be limited by an important network effect.  Any citizen may lose the ability to make decisions about his or her personal information as a result of third party disclosures.  For example, an individual who discloses his or her genetic information also discloses the genetic information of relatives.  As more information circulates and inferential profiles become more robust, any particular individual will lose the ability to make participation choices.

A market for privacy can only function effectively if there is transparency.  Yet, the privacy marketplace illustrates a classic problem of market failure.  The actual information practices of business are largely hidden from public view.  In effect, the relationship between data processing organizations and individuals is typically based on asymmetric information: “the organization [has] the greater power to control what information about itself is released while simultaneously obscuring the nature and scope of the information it has obtained about individuals.”[19]  The barriers for individuals to discover how businesses use their personal information are frequently insurmountable.  At the same time, businesses profit enormously from a trade in personal information hidden from public view.  Victims have no means of recourse, and no independent mechanism exists to determine whether fair information practices are followed.  Under these conditions, the market does not and cannot afford individuals an opportunity to negotiate for meaningful fair information practices in the use of their information.

The conventional response to the problems in the U.S. self-regulatory approach is the enactment of targeted statutes to fill the gaps in protection.[20]  However, the eclectic statutory response in the United States illustrates the limitations of this method.  Sectoral regulations are reactive and inconsistent.  For example, credit reporting agencies providing credit history information in connection with credit eligibility decisions are regulated,[21] but direct marketing organizations providing similar information for pure marketing purposes are not.[22]   This statutory gap-filling approach also leaves many areas of information processing untouched and runs counter to the cross-sectoral nature of modern data processing.

            Comprehensive data protection laws, however, are necessarily cross-sectoral and general.  But the European model, too, presents its own set of problems that limit the self-sufficiency of the comprehensive regulatory approach.  Privacy is contextual, and the interpretation of general rules in any specific context will often be extremely difficult and complicated.  In effect, general principles create a large margin for interpretation and implementation.  As a result, the ever-increasing complexity of information processing poses a fundamental challenge to clarity and fair treatment of both individuals and data users.

The ambiguity and application of general principles have a pronounced impact on online communications.  Often, comprehensive data protection laws diverge in significant ways.[23]  For example, privacy rights attach to information that relates to an “identifiable” person.[24]  Yet, the scope of an “identifiable” individual is interpreted quite differently under the various comprehensive statutes.  Some European countries take a broader view of the criteria for anonymous information and exclude more transaction-related data from the statutory protections than others.[25]  For data transmissions within Europe, the consequence is that some countries may treat specific data as outside the jurisdiction of the data protection laws, while others will apply the full range of standards.

Enforceability presents another limit on the effectiveness of comprehensive data protection laws.  The credibility of data protection depends upon its enforceable character.  While European laws establish substantial enforcement mechanisms through penalties and data protection commissions, serious compliance issues with notice and registration requirements are nevertheless apparent.[26]  Public prosecution of data protection offenses, however, is not a common event in Europe even in the face of blatant violations.[27]  More importantly, transnational data processing challenges territorial enforcement powers.

            The international dimensions of data protection test each of the various models.  The dramatic rise of global service industries generates powerful conflicts and pressures among the political, economic and technical forms of regulation for data privacy.  While national statutory law and even private agreements have roles to play in the new global information age, an increasing need exists for international coordination of privacy protection.  The inevitability of conflict between comprehensive legal standards, as found in Europe, and ad hoc protections, as seen in the United States, places the issue of fair treatment of personal information at the center of global information transfers.  Even within Europe, transnational information processing poses conflicts among comprehensive, rights-based regimes.  Indeed, during the early 1990s, the differing national laws made the harmonization of data protection standards an essential component of the internal market plan.  Directive 95/46/EC sought to harmonize the domestic law of the Member States at a shared, high level of protection for “the fundamental rights and freedoms of natural persons, and in particular their right to privacy.”[28]  The strategy was twofold: first, the Directive set out the mandatory, essential principles for personal data processing; second, it required the Member States of the European Union to bring their domestic law into full compliance with these standards.  However, the divergences in standards still allowed by the Directive’s permissible “marge de manoeuvre” (margin of maneuver) left significant obstacles for online services.[29]  The Directive also forced scrutiny of foreign data protection regimes through the prohibition on transfers of personal information to countries lacking “adequate” protection.[30]  Since complex information processing arrangements often involve multiple jurisdictions, this provision brought the differing political and economic approaches of the EU and the United States into conflict.  At the same time, the emergence of the Internet and its sophisticated international data processing capabilities illustrated that technical rules were being developed in their own way without regard to national data protection standards.[31]  This meant that various deployed technologies might not have the capability for users to comply with local data protection norms.

 

III.             The Privacy Interdependence Model

 

The problems with each of the distinct models show that the three approaches cannot be isolated.  Indeed, the political, economic and technical approaches influence each other and provide important insights for the development of effective data protection.  The actual achievement of fair information practices requires the recognition of a privacy interdependence model.  The diagram below illustrates this model.

 
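The Privacy Interdependence Model

                          Political model         Economic model       Lex Informatica model
Policy constraint         Law                     Market norms         Technology
Rule authority            Data protection law     Self-regulation      Technical protocol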

Privacy interdependence begins with an understanding of the policy constraints and rule authorities for data protection that derive from each of the three models.  Policy constraints are the mechanisms for establishing rules of data processing.  The political model uses law, the economic model uses market norms and the Lex Informatica model uses technologies.  The actual rules of data protection are established by rule authorities.  Under the political model, the rule authority is a data protection law, while under an economic model the rule authority is self-regulation.  Under a Lex Informatica model, the rule authority is a technical protocol.

As illustrated above in the diagram, the policy constraints do not operate distinctly on rule authorities.  Law affects technical protocols and self-regulation.  Some of the clearest examples of the interaction between law and technology arise in the context of cryptography.  Law has provided controversial limits on the availability of encryption products, whether through export-control regulation or the licensing of products.[32]  Similarly, law has motivated self-regulatory mechanisms.  Directive 95/46/EC was a major impetus to the creation of a cottage industry of Internet seal programs in the United States attesting to corporate privacy standards on web sites.[33]

At the same time, technology affects data protection law and self-regulation.  Technological developments influence both the need for and the direction of law.  For example, early data protection laws focused on “files” and file systems because the environment consisted of mainframe computers.  Today, distributed computing and wireless communications alter the processing relationship, and the emphasis of modern data protection is on “controllers,” “processing” and “structured data,” reflecting these technical developments.  In addition, the globalization of networking meant that data protection law had to decide how it would treat foreign standards.  Europe opted for restrictions on foreign data transfers if foreign standards were too weak.[34]  Similarly, technology influenced the capability of self-regulatory mechanisms.  The early “cookies” technology allowed tracking of Internet users without their participation.  As users became alarmed and browsers became more sophisticated, “cookie” management options developed to allow users greater control over such tracking.[35]  This interdependence is also illustrated by the emergence of data protection laws that respond specifically to the technology.  Germany, for example, enacted a specific ‘cookies alert’ law to require that users be informed of the use of the cookies technology.[36]
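A minimal sketch of the cookie mechanism, in Python and with hypothetical values, illustrates why the early design drew alarm: the server attaches a unique identifier on a first visit, and the browser returns it silently with every later request, letting the site stitch those requests into a single profile.

    import uuid
    from http.cookies import SimpleCookie

    def first_visit_response_header() -> str:
        """Server side: tag a new visitor with a persistent unique ID."""
        cookie = SimpleCookie()
        cookie["uid"] = uuid.uuid4().hex             # per-browser identifier
        cookie["uid"]["expires"] = "Sun, 28-Dec-2003 00:00:00 GMT"
        return cookie.output(header="Set-Cookie:")

    def identify_returning_visitor(cookie_header: str) -> str:
        """Server side: read back the ID that the browser returns
        automatically with every later request to this site."""
        return SimpleCookie(cookie_header)["uid"].value

    print(first_visit_response_header())
    # e.g. Set-Cookie: uid=9f1c2ab3...; expires=Sun, 28-Dec-2003 00:00:00 GMT
    print(identify_returning_visitor("uid=9f1c2ab34d5e"))   # 9f1c2ab34d5e

The “management options” noted above intervene at exactly this point: modern browsers let the user refuse or delete the identifier before it is returned.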

Market norms also have an important impact on technical protocols and self-regulation.  Market participants drive the development of new technical architectures.  For example, as technology developed to give users greater control over the tracking of their behavior through the use of cookies, web sites and advertisers began to discover technical ways to circumvent user controls with web bugs or clear GIF images.  The web bug takes advantage of features of HTML that enable a web site or advertiser to force the user’s browser to load a single-pixel image from a remote site.  This pixel is imperceptible to the user and cannot be blocked by the user, yet the action allows the web site or advertiser to track the user.[37]  By contrast, market norms reflecting the importance of privacy to electronic commerce have also been an important motivation for the development of a technical protocol that will allow web sites to disclose their privacy policies in a machine-readable manner.  This protocol, P3P, is being developed by the World Wide Web Consortium.  Similarly, market norms affect self-regulation in both positive and negative ways.  As privacy becomes a more salient concern for citizens, certain industry players have adopted data protection as a critical business practice.  Prominent companies now sponsor the development of privacy tools.  But, to the extent that citizens are unaware of industry practices or that industry players follow codes of conduct written by trade associations, the market norms provide privacy public relations rather than true data protection.
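To make the web-bug mechanism concrete, the sketch below, in Python and with hypothetical hostnames, shows the invisible image a page might embed and what the third-party tracker learns from the browser’s automatic request for it.

    from urllib.parse import urlparse, parse_qs

    # Hypothetical markup on site-a.example: an invisible 1x1 image
    # fetched from a third-party tracker.  The browser requests the
    # image automatically; the visitor sees nothing.
    WEB_BUG = ('<img src="http://tracker.example/pixel.gif'
               '?page=site-a.example/article7" width="1" height="1" alt="">')

    # What reaches tracker.example when the image loads: the request URL
    # (naming the page being read), the visitor's IP address, and any
    # cookie the tracker previously set on this browser.
    request_url = "http://tracker.example/pixel.gif?page=site-a.example/article7"
    page_read = parse_qs(urlparse(request_url).query)["page"][0]
    print(page_read)                     # site-a.example/article7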

The collective impact of the different policy constraints on rule authorities and the resulting rules themselves lead to the actual data practices in society.  Indeed, rule authorities are not independent[38]; each exerts an influence on fair information practices and the actual level of data protection.  Since the elements are not independent, effective data protection can only come from a combination of policy constraints and rule authorities working in concert rather than in opposition to each other.  The relationships of law, technology and market norms with data protection legislation, technical protocols and self-regulation are intertwined.  Each rule authority can undercut or support the goals of the other rule authorities.  For example, when data protection law seeks to inhibit the collection of personal information, technical protocols may be developed to require the identification of users or technical choices may be developed to create anonymity.  Similarly, the policy constraints may undercut the goals of each other and work against the goals of different rule authorities.  For example, market norms tend to favor data maximization for commercial gain while law prefers data relevance to balance citizen and social needs.  To the extent that such preferences are enshrined in technical protocols and self-regulatory measures, these rule authorities will contradict the goals of data protection law.  In short, there is an interdependence among law, technology and market self-regulation.

In this interdependent context, the protection of privacy can, thus, only be assured adequately through a channeling of policy constraints and rule authorities.  The elements must operate together in a consistent manner to promote effective data protection.  The channeling of policy constraints and rule authorities will revolve around four key conditions.  First, citizen participation in the design of law, technologies and markets is essential for effective data protection.  Citizen participation is necessary so that public values and goals are consistent across the three spheres of law, technology and market.  Second, anonymity in a digital age becomes a critical feature for technical systems and market products.  Anonymity built into information systems furthers consistency of privacy across law, technology and the marketplace.  Third, data minimization must be a cornerstone of law, technical architecture and market norms.  An insistence on the relevance of data for technical and market needs preserves coherence in the treatment of personal information across the three spheres.  Lastly, automation must play an important role in the assurance of data protection.  Mechanisms that automate the implementation of data policies will facilitate uniformity across the areas of law and the marketplace, as the sketch below suggests.
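By way of illustration of the automation condition, the following sketch, in Python and with an entirely hypothetical vocabulary loosely modeled on P3P’s purpose and retention categories, shows the kind of machine-readable policy matching a P3P-style user agent performs: the site declares its practices, the user declares preferences, and software decides automatically whether the exchange may proceed.

    # A site's declared practices and a user's preferences, both in a
    # machine-readable form (hypothetical vocabulary).
    site_policy = {"purposes": {"current", "admin"}, "retention": "stated-purpose"}
    user_prefs = {"allowed_purposes": {"current"}, "retention": "stated-purpose"}

    def acceptable(policy: dict, prefs: dict) -> bool:
        """Proceed only if every declared purpose is one the user allows
        and the site's retention promise matches the user's requirement."""
        return (policy["purposes"] <= prefs["allowed_purposes"]
                and policy["retention"] == prefs["retention"])

    print(acceptable(site_policy, user_prefs))   # False: "admin" not allowed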

Interdependence means that privacy technologies are necessary, that market norms need to adopt those technologies and that law must protect citizens.  Yet, in the context of data protection, market incentives and technological decision-makers do not regularly support fair information practices or produce consistent rules.[39]  For self-regulation and technical rules to act coherently in furtherance of effective data protection, a framework set of objectives must exist.  In democratic society, public goals and public values are traditionally set by political representatives through the legal system.  This means that law must establish the goals for data protection rule authorities.  Indeed, to channel technical rules and self-regulation to accord with legal rules, law can and must allocate liability to the market and to the network architects for their choices.[40]  In other words, legal liability rules become a key mechanism to provide the incentive for technical rules and self-regulation to develop in harmony with public goals.  This necessary incentive will promote the development and deployment of privacy-protective technologies and privacy-protective market actions.  If technologies must by law embody privacy-protecting choices and if accountability must be built into self-regulatory regimes, then these rule authorities—technical protocols and self-regulation—will act in ways that are complementary to each other rather than develop in conflict with each other.

           

Conclusion

 

The legal, technological and market models of fair information practices, though conceived as distinct rule sets, are in fact interdependent as tools for effective data protection.   This interdependence of law, technology and self-regulation demonstrates, however, that the three rule authorities need to be channeled in the same direction so that the rules support each other rather than frustrate each other.   Three guiding principles can be identified for this channeling of the rule authorities:

 

1.                  Law is necessary to establish the public policy objectives, but insufficient to assure the implementation of fair information practices.

2.                  In a democratic society, rule-making through technology must be shaped by public policy goals and debate.

3.                  Legal liability will be an essential instrumental device for the development of privacy products.

 

The complex relationship among law, technical choices and the market calls for ever-increasing vigilance by citizens over the collection and use of their personal information.  An alert and active citizenry will remain a crucial defense against the erosion of privacy in the Information Age.



[1] Professor of Law and Director of the Graduate Program, Fordham University School of Law.  This essay was prepared for the conference “On the Brink of New Evolutions in the Law of Information Technology” in celebration of  the 20th Anniversary of the C.R.I.D., Nov. 7-9, 1999 with great appreciation and admiration to Dean Yves Poullet for the inspiration he has provided to so many in the field.

[2] See Council of Europe Convention for the Protection of Human Rights and Fundamental Freedoms, Art. 8; European Directive 95/46/EC; Council of Europe Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data, Euro. T.S. No. 108 (Jan. 28, 1981).

[3] See, e.g., A Framework for Global Electronic Commerce (1997) [hereinafter “U.S. Framework”]

[4] See Joel R. Reidenberg, Lex Informatica: The Formulation of Information Policy Rules through Technology, 76 Texas L. Rev. 1315 (1998) [hereinafter “Lex Informatica”]

[5] See Paul Schwartz & Joel R. Reidenberg, Data Privacy Law (Michie: 1996)

[6] See Lawrence Lessig, Code and Other Laws of Cyberspace (Basic Books: 1999); Lex Informatica, supra.

[7] See U.S. Framework, supra, at 14 (Issue 5).

[8] See, e.g., About IETF, http://www.ietf.org

[9] Roger Clarke coined the phrase “dataveillance” to describe the practice of data surveillance through the capture of electronic trace information such as interactive traffic records.  See Roger Clarke, 'Information Technology and Dataveillance', Commun. ACM 31,5 (May 1988) <http://www.anu.edu.au/people/Roger.Clarke/DV/CACM88.html>

[10] See A. Michael Froomkin, A critique of WIPO’s RFC3 Ver. 1.0a (Mar. 14, 1999).  By the time public officials realized the implications of the WIPO work, much of the standard had been completed.  See Working Party Established under Art. 29 of Directive 95/46/EC, Third Annual Report, at 59, Doc. 5066/00/EN/final WP 35 <http://europa.eu.int/comm/internal_market/en/media/dataprot/wpdocs/wp35en.pdf>

[11] Internet Engineering Task Force, Internet Protocol, Version 6 (IPv6) Specification: Draft Standard, RFC2460 (Dec. 1998) http://www.ietf.org/rfc/rfc2460.txt?number=2460

[12] See Overview of the IETF, http://www.ietf.org/overview.html

[13] For example, the average Internet user is unlikely to understand “cookies” technology and less likely to know what to do about it.

[14] Typical web site privacy notices are so vague that even an informed user would have a difficult time ascertaining the response to these issues.

[15] Deja News Privacy Breach Raises Red Flag, Information Security 13 (June 1999)

[16] See RealNetworks Federal Class Action, http://www.internetnews.com/streaming-news/article/0,1087,8161_235141,00.html

[17] See Joel R. Reidenberg, Restoring Americans’ Privacy in Electronic Commerce, 14 Berkeley Tech. L. J. 771 (1999).

[18] See Paul Schwartz, Privacy and Participation: Personal Information and Public Sector Regulation in the United States, 80 Iowa L. Rev. 553 (1995); Spiros Simitis, Reviewing Privacy in an Information Society, 135 U. Pa. L. Rev. 707 (1987); Alan Westin, Privacy and Freedom 23-26 (1967).

[19] Philip Agre,  Introduction in Technology and Privacy: The New Landscape (Philip E. Agre & Marc Rotenberg eds., 1997), 11.

[20] See Schwartz & Reidenberg, supra.

[21] See 15 U.S.C. § 1681b

[22] See In re: Trans Union, Fed. Trade Comm’n Docket 9255, Opinion of the Commission, at 12-13 (March 1, 2000) http://www.ftc.gov/os/2000/03/transunionopinionofthecommission.pdf (noting that organizations not classified as credit reporting agencies may provide, on an unregulated basis, data that is similar to, but not as reliable as, regulated data from credit reporting agencies).

[23] See, e.g., Peter Swire & Robert Litan, None of Your Business: World Data Flows, Electronic Commerce and the European Directive 188-96 (Brookings: 1998); Joel R. Reidenberg & Paul Schwartz, Data Protection Law and Online Services: Regulatory Responses (Eur. Comm.: 1998)

[24] European Directive 95/46/EC, art. 2(a).

[25] See Reidenberg & Schwartz, supra, pp.  124-26.

[26] For example, the low number of registrations in countries such as France and an anecdotal examination of European web site privacy disclosure notices reflect compliance problems.  Indeed, a search for the required registrations of prominent online services providers in at least one European country revealed that highly visible companies failed to register and that this non-compliance was ignored.  See also Existing case-law on compliance with data protection laws and principles in the Member States of the European Union, Annex to the Annual Report 1998 of the Working Party Established under Article 29 of Directive 95/46/EC (1998)

[27] See supra note 26.

[28] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Official Journal L 281, 23/11/1995, pp. 0031-0050.

[29] Reidenberg & Schwartz, supra.

[30] Directive 95/46/EC, art. 25.

[31] See Recommendation 1/99 on Invisible and Automatic Processing of Personal Data on the Internet Performed by Software and Hardware of the Working Party Established under Article 29 of Directive 95/46/EC,  Eur. Doc. DG MARKT  5093/98 WP 17 - (23 February 1999) <http://europa.eu.int/comm/internal_market/en/media/dataprot/wpdocs/wp17en.htm>

[32] The United States, for example, regulates the export of encryption products while France has historically required the licensing of encryption products for use in France.  Similarly, the Communications Assistance for Law Enforcement Act, 47 U.S.C. §§ 1001-1010, in the United States mandates that digital networks be ‘wiretap-ready.’

[33] TRUSTe and BBBOnline, in particular, sought to become a self-regulatory answer to the requisite level of protection required under Art. 25 of Directive 95/46/EC.

[34] Directive 95/46/EC, art. 25.

[35] The latest versions of Netscape Communicator and Internet Explorer each now allow a variety of choices with respect to cookies that were not available in earlier browser versions.

[36] IuKDG, Art. 2.

[37] See Richard Smith, FAQ: Web Bugs <http://www.privacyfoundation.org/education/webbug.html>

[38] See, e.g. Lawrence Lessig, Code and other Laws of Cyberspace (1999).

[39] See Joel R. Reidenberg, Restoring Americans’ Privacy in Electronic Commerce, 14 Berkeley Tech. L. J. 771 (1999), http://www.law.berkeley.edu/journals/btlj/articles/14_2/Reidenberg/html/reader.html

[40] See, e.g., Lex Informatica supra.