Setting Standards: Looking to the Internet for Models of Governance

Charles Vincent and L. Jean Camp*

L. Jean Camp
[email protected]
Kennedy School of Government
Harvard University
79 JFK St., Cambridge, MA 02138

Charles Vincent
Charles_Vincent@ksg.harvard.edu
Kennedy School of Government
Harvard University
79 JFK St., Cambridge, MA 02138

* This work was supported by the National Science Foundation under Grant No. 9985433. Any conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.


"Technically, what the Internet achieves sounds almost oxymoronic: decentralized interoperation."1

Sharon Eisner Gillet and Mitchell Kapor, The Self-Governing Internet

Key words: design for values, governance, intellectual property, Internet, open code, privacy, security, standards, technology and society

ABSTRACT

If code is law, then standards bodies are governments. This flawed but powerful metaphor suggests the need to examine more closely those standards bodies that are defining standards for the Internet. In this paper we examine the International Telecommunications Union, the Institute of Electrical and Electronics Engineers Standards Association, the Internet Engineering Task Force, and the World Wide Web Consortium. We compare the organizations on the basis of participation, transparency, authority, openness, security, and interoperability. We conclude that the IETF and the W3C are becoming increasingly similar. We also conclude that the classical distinction between standards and implementations is decreasingly useful, as standards are embodied in code, itself a form of speech or documentation. Recent Internet standards bodies have flourished in part by discarding or modifying the implementation/standards distinction. We illustrate that no single model is superior on all dimensions. The IETF is not scaling effectively, struggling with its explosive growth through the creation of thousands of working groups. The IETF coordinating body, the Internet Society, addressed growth with a reorganization that removed democratic oversight. The W3C, initially the most closed, is becoming responsive to criticism and now includes open code participants. The IEEE-SA and ITU have institutional controls appropriate for hardware but too constraining for code. Each organization has much to learn from the others.

Abbreviations

ITU: International Telecommunications Union
IAB: Internet Architecture Board
IEEE-SA: Institute of Electrical and Electronics Engineers Standards Association
IETF: Internet Engineering Task Force
RAND: Reasonable and Nondiscriminatory
W3C: World Wide Web Consortium
INTRODUCTION

Beyond merely automating or redesigning existing internal processes, information technologies provide the opportunity to alter workflow within and between organizations. Adoption of

1 Sharon Eisner Gillet and Mitchell Kapor, "The Self-Governing Internet: Coordination by Design," in Coordinating the Internet, ed. Brian Kahin and James H. Keller (Cambridge, MA: MIT Press, 1997), 6.


information technology often implies translating organizational processes into digital processes. Well-implemented digital processes can be fairer (free of the psychological biases of an actor), far faster, able to transcend geographic barriers, and available asynchronously. However, such transitions may also take subtle questions and force them into Boolean processes, remove transparency, and introduce subtle biases. Therefore the standards that govern such transitions must be carefully examined.

Establishing, selecting, or disseminating technical standards that ensure interoperability and flexibility in a decentralized decision-making environment, while respecting democratic processes, is a challenge. Without a centralized body with the authority to dictate standards across political or functional boundaries, how can a government be sure that the system it is building will be compatible with other systems and subject to evolving standards? Without standards of oversight, how can government select democratically appropriate standards? Yet the addition of governmental oversight and processes to certify transparency can result in failure to adopt commercially available standards, or even harm the standards process itself. Each of the organizations discussed here has struggled with issues of transparency, efficiency, and legitimacy, and each has arrived at very different answers.

The bodies and processes through which Internet standards are set offer several models that public sector organizations might consider in responding to this challenge. While there are many different bodies that claim varying degrees of authority in setting standards for interoperability on the Internet, this study will focus on four organizations that serve as "models of governance" for developing standards in a decentralized decision-making environment.
• International Telecommunications Union (ITU): The Government Model
• World Wide Web Consortium (W3C): The Consortium Model
• Institute of Electrical and Electronics Engineers Standards Association (IEEE-SA): The Professional Association Model
• Internet Engineering Task Force (IETF): The Open Model

Standards combine social, organizational, economic, and technical variables to develop a technology that becomes obdurate policy as it is widely adopted. The study of standards illustrates that communications and information technology standards have unique features: the strength of the network effects that make these standards particularly difficult to change; the importance of communication in society; and the role of ICTs in commerce and government. (Cargill, 1989; Pincus, 1999; Schmidt & Werle, 1998; Weiss & Cargill, 1992; Shurmer, 1996)

If a standard is to be adopted by government, the standard setting process must be compatible with the democratic requirements of public sector decision-making. As such, in assessing each model we need to consider the following questions: (1) Who has a voice in the process? (2) How open or transparent is the standard setting process? and (3) Where does the final authority lie for approving standards? In considering these questions, however, it is important to keep in mind that they do not lend themselves to a single correct answer. The democratic requirements of


public sector decision-making vary both across political jurisdictions and across situations. Indeed, within ICT standards the policy implications of a technical decision vary from trivial to dominant. In addition to the democratic nature of the standard setting process, we consider the inherent qualities or characteristics of a standard that would be produced by each model. While there are many characteristics by which a standard can be defined, this paper selects examples that address (1) openness, (2) security, (3) privacy, and (4) interoperability.

WHAT IS THE NATURE OF THE PROCESS?

Broadly speaking, if a standard is to be adopted by government, the standard setting process needs to be compatible with the democratic requirements of public sector decision-making in that society. In other words, the rules that govern the standards process must be consistent with the degree of transparency, inclusiveness, and accountability required by other decision-making processes in the public sector. Therefore, in assessing each model we need to consider the following questions: (1) Who has a voice in the process? (2) How open or transparent is the standard setting process? and (3) Where does the final authority lie for approving standards?

Who has a voice in the process?

ITU-T

Participation in the ITU-T2 standard setting process is limited to ITU-T membership, namely national governments (members) and select telecommunications companies (sector members). Members and sector members are the only organizations with a direct voice in the standard setting process. Unless called as an expert consultant, non-members do not have an avenue for participation. Since the ITU representatives of member states are also public officials, the general public can voice their opinions and thoughts indirectly through their domestic political processes. However, given the distance between the general public and the ITU standards process, this link is tenuous.3 In an attempt to engage the Internet community directly, the ITU has developed a set of open meetings modeled, apparently, on the IETF. The ITU has also sought to engage civil society actors on the Internet at the World Summit on the Information Society. There is as yet no consensus on the outcome of the WSIS events, with some declaring them irrelevant and others believing them so successful that the ITU will easily take over the functions now provided by the Commerce Department via ICANN. Individuals cannot join the ITU.

2 The ITU-T is the Telecommunications Standardization Sector of the ITU. Factual information pertaining to the ITU-T was collected at http://www.itu.org unless otherwise noted.
3 Valerie Shuman and Richard Jay Solomon also note that the ITU standardization process is highly political. See "Global Interoperability for the NII and ITS: Standards and Policy Challenges," in Converging Infrastructures, ed. Lewis M. Branscomb and James H. Keller (Cambridge, MA: MIT Press, 1996).


IEEE

The IEEE4 limits participation in the standard setting process to the electrical and electronics engineers that form its membership. In fact, the right to participate in setting standards is considered a benefit of membership. Valuing a diversity of opinion in the standard setting process, the IEEE does invite public sector agencies to become members of the IEEE Standards Association. Nevertheless, participation is still limited to fee-paying members, whether individuals or invited organizations. Interestingly, the fastest-growing sector of the IEEE is Technology and Society. Any IEEE member can sponsor up to two non-engineers for membership by writing a letter describing their contributions, and IEEE Technology and Society includes a significant number of such members.

W3C

The W3C is a consortium and is thus distinct from internal standards setting by proprietary companies. (Updegrove, 1995) Similarly, membership is a requirement for participation in the W3C5 standard setting process. Membership is open to any organization willing to pay the US$50,000 annual membership fee (US$5,000 for government agencies and non-profits), but is dominated by private companies. Individuals cannot join the W3C. The Free Software Foundation (until its recent change in buildings) was physically adjacent to the W3C in the building on the MIT campus housing the Laboratory for Computer Science. Yet there was no FSF participation in the W3C until after a particularly ugly patent controversy. (The Laboratory for Computer Science and the Artificial Intelligence Laboratory have now merged; LCS, the AI Lab, the W3C, and the FSF have all moved to the new Stata Center.) Individuals who are not members cannot participate in some mailing lists, but are allowed to present criticisms at specific forums designed for feedback. The proceedings of such forums may or may not be made public, based on the decision of the team.
While the W3C agenda is largely member driven, even members do not tend to participate in the development process itself. The W3C Teams that are responsible for developing standards are composed of the Consortium's technical staff (full- and part-time employees around the globe, though primarily at MIT, INRIA and Keio) along with visiting engineers from member organizations, consultants, and students. Member input is sought primarily through periodic working drafts and through the approval process. The W3C will also publish drafts to seek comments from the public, but this practice is not a required step in the standards process.

IETF

In sharp contrast, the IETF6 standard setting process is open to any interested individual. The IETF does not have a formal membership. Anyone who wishes to participate in the standards

4 Factual information pertaining to the IEEE was collected at http://www.ieee.org unless otherwise noted.
5 Factual information pertaining to the W3C was collected at http://www.w3c.org unless otherwise noted.
6 Factual information pertaining to the IETF was collected at http://www.ietf.org unless otherwise noted.


process through working group mailing lists and thrice-yearly meetings is free to do so. Furthermore, since working groups are established from the bottom up by groups of interested individuals, the direction of the IETF is ideally dictated entirely by the participants.7 However, approval is a process that is far from transparent. An IETF working group may struggle for many years without reaching consensus. While any person can join, advancing an agenda is a tenuous and remarkably uncertain undertaking. The dominant currency in IETF standards-setting is social capital.

An illustration of this can be found in the passionate discourse among participants over a proposed standard to enable the tracking of specific connections or devices for law enforcement. The membership could not reach consensus on such a standard. Many were deeply opposed on principle to any simplification of surveillance. Others desired a standard to encourage narrow, or at least consistent, surveillance. Still others considered the politics orthogonal, and believed that the IETF should pursue efficiency and interoperability in all things. After an extended period of struggle the issue apparently died with the determination that the IETF neither supported nor opposed implementing a surveillance standard. However, Cisco Systems, the dominant router provider, had to implement this functionality both to compete for contracts from national PTTs and to serve American ISPs required to provide access to law enforcement. Fred Baker, past IETF Chair and long-time senior Cisco employee, is rumored to have effectively approved the proposed wiretapping standard. Cisco could have implemented the functionality in a proprietary manner, without a standard, so the IETF issuance added valuable transparency. The IETF standard also prevented Cisco from leveraging this capability to attempt to establish a monopoly via limited interoperability. However, the issuance encouraged others to adopt the capability, and may have provided political cover to Cisco.
Thus the standard had both positive and negative implications, while its adoption arguably reflects the reality of industry, social capital, standards, and transparency at the IETF.

How open or transparent is the standard setting process?

ITU

Like the W3C, the ITU-T publishes the research agendas of its study groups for public consideration.8 Similarly, official decisions and recommendations of the ITU-T can be found at the ITU web site. Draft recommendations and discussion papers, however, are available only to members and are password protected on the web site. Moreover, there is no public record of the discussions and debates that occur via email or in the meetings that form the core of the standards development process.

IEEE

The majority of the IEEE standards process also happens behind closed doors. Whereas members have access to all working documents produced by working groups, these documents are not available to the public at large.

7 Scott Bradner, "The Internet Engineering Task Force," in Opensources: Voices from the Open Source Revolution, ed. Chris DiBona, Sam Ockman and Mark Stone (Cambridge, MA: O'Reilly and Associates, 1999), 51.
8 ITU-T Study Group information can be found at http://www.itu.int/ITU-T/index.html


The IEEE Standards Association is led by a Board of Governors. The Board is elected by IEEE members. Corporations and other legal but fictitious persons cannot vote for the Board; however, no one expects the engineers to vote against their own personal interests. IEEE Standards Association meetings are publicly announced on the web site. The IEEE-SA uses the working group structure to develop standards. A list of working groups is available at http://grouper.ieee.org/groups/index.html. The most recent working group, Voting System Engineering, is by far the most political. IEEE standards are more likely to be found for physical interconnections than for higher level applications. As with the ITU standards, those who want to use the standards see nothing of the process that developed them, and are privy only to the final product. However, it is unlikely that any organization capable of implementing an IEEE standard is without IEEE members.

W3C

In practice, the W3C's standards process incorporates some elements that foster openness. For example, the periodic publishing of drafts for public consumption and comment gives nonmembers a window into what the W3C is working on. Beyond publishing its research agenda and these periodic drafts, however, the W3C process is closed. The W3C Teams responsible for developing standards do not make the substance of their meetings or debates public, leaving nonmembers (and many members) without any indication of the path that was taken to arrive at a standard or the reasons behind design and development decisions. However, internal date-based records are kept, according to the Semantic Web presentation by Tim Berners-Lee, which used these records as an example application. (Lee, 2001) The W3C expanded its membership to include participants from the open source movement after a heated public debate on W3C patent policy.
After thousands of objections to the RAND9 patent proposal for W3C standards, Eben Moglen, general counsel of the Free Software Foundation, and Bruce Perens were invited to join. Eventually the W3C determined that a zero-fee requirement would be necessary for members who have intellectual property included in standards. The patent policy was necessary to allow corporations to continue to participate without concern for loss of control, and to avoid submarine patents.10 The zero-fee requirement was necessary to permit free and open implementations of W3C standards. Thus, while there is no direct public participation, public outcry did result in a change in the patent policy of the W3C.

9 RAND is an acronym for reasonable and non-discriminatory. The original proposal was that companies provide RAND licenses for patented material in any standards. However, open source and free software developers would be prevented from using the standards in their products if forced to pay as their user base expanded.
10 Submarine patents are patents on ideas or concepts that are held without publicity until the research necessary to implement the idea is completed by a party other than the patent holder. Then the patent emerges and the patent holder demands royalties from the researchers (the true inventors). Submarine patents are a particular problem in the United States, as the US PTO is structurally designed to issue rather than refuse patents, US patent examiners are far from skilled in the necessary arts, and US patents are notoriously hard to overturn. Patents have been identified as an impediment to innovation in software. (League for Programming Freedom, 1992; Besen, 2003) Patents are so trivial to obtain in the United States that the child's swing was patented in 2003.


IETF

Just as participation in the IETF standard setting process is open to all interested individuals, so too the documents (RFCs), mailing lists, and meetings of the IETF and its working groups are open and accessible to the public.11 Every step and document in the IETF standards development process is open for consideration by participants and observers alike. Since the majority of standards work is done through the mailing lists of working groups, the core of the IETF standards process is completely transparent. Moreover, mailing list archives ensure that interested individuals can review the process and thoughts that led to a particular standard or decision. The standards approval process is distinct from the development process. Currently the number of standards in progress numbers in the thousands, and IETF meetings fill entire hotels and conference centers with participants numbering in the thousands. The goal of the IETF is as follows:

"The procedures that are described here provide a great deal of flexibility to adapt to the wide variety of circumstances that occur in the Internet standardization process. Experience has shown this flexibility to be vital in achieving the following goals for Internet standardization:
• high quality,
• prior implementation and testing,
• openness and fairness, and
• timeliness." (Internet Architecture Board, 1994)

At the lowest level the IETF is organized into Working Groups. Working Groups are topical and work on specific standards. An example is the working group on multilingual domain names. This group is developing standards to govern the translation of domain names between alphabets and ideographs; for example, Hebrew, Latin, and Chinese speakers ideally should all be able to use domain names in their native tongues. There are four RFCs covering international domain names: stringprep, IDNA, nameprep, and Punycode. In the multi-year process of developing this set of RFCs, standardization has moved forward, with one effort led by Verisign and an alternative effort pushed by the mainland Chinese government. An IETF standard would increase competition and expand the market.

The IETF motto is "rough consensus and running code." Standards are set through rough consensus, and there is no formal voting procedure in working groups. Standards are officially sanctioned by the IESG, whose members are appointed based on recommendations from the broader membership, but the "rough consensus" is achieved within the membership of the working group (all interested individuals) when there is running code.
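The division of labor among these RFCs can be seen in running code: Punycode defines the raw ASCII encoding of a single Unicode label, while IDNA wraps that encoding with normalization and the "xn--" prefix that marks an encoded label in the DNS. A minimal sketch, using Python's built-in "idna" and "punycode" codecs (which implement the IDNA and Punycode RFCs); the domain name itself is a hypothetical example:

```python
# Sketch: how an internationalized domain name (IDN) reaches the DNS.
# Python's built-in "idna" codec implements IDNA, which in turn applies
# the Punycode encoding to each non-ASCII label.

unicode_domain = "bücher.example"  # hypothetical German-language domain

# Full IDNA conversion: each label is normalized, Punycode-encoded, and
# prefixed with "xn--" so resolvers can tell it apart from plain ASCII.
ascii_domain = unicode_domain.encode("idna")
print(ascii_domain)  # b'xn--bcher-kva.example'

# The raw Punycode step on a single label, without the "xn--" prefix:
print("bücher".encode("punycode"))  # b'bcher-kva'

# Decoding reverses the mapping, so native-script names round-trip.
print(ascii_domain.decode("idna"))  # bücher.example
```

The ASCII-only result is what actually travels in DNS queries; the native-script form exists purely at the user interface, which is why the encoding could be standardized without changing the DNS itself.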

11 All RFCs and IETF documents can be found at http://www.ietf.org/rfc.html and are available in ASCII format.


Where does the final authority lie for approving standards?

On the surface, the models do not appear to differ significantly in terms of where final authority lies for approving standards. Each uses a form of working group to develop standards, which are then approved by a review committee. When we consider where these review committees, and in turn the standards, derive their authority, however, the models vary dramatically from a democratic perspective. The creation of standards is an inherently technical and detailed process, not easily subject to popular debate. Indeed, the role of technical expertise in a democracy is a perennial question.

IEEE

The review committee that approves IEEE standards is an appointed body that derives its authority from the IEEE Standards Association (IEEE-SA) Standards Board. While the Standards Board is also a non-elected body, its members are appointed by the Board of Governors, who are elected by the IEEE membership. The Board of Governors is elected by fee-paying members, a group that is not likely to be representative of the general public but rather only of those with technical expertise. Currently the IEEE has roughly a quarter of a million members, but only a fraction of those participate in the standards process.

ITU-T

The ITU-T has an approval process similar to that of the IEEE-SA, where an appointed review committee derives its authority from a governing board that is elected by the membership. Since the ITU is an international geopolitical body whose members include public officials from 189 nations, the approval process would seem to have some semblance of a democratic foundation. This link, however, is weak when we consider the distance between the review committee and the public. Moreover, with only 189 of the world's 266 countries as members, the ITU cannot represent all nations.

W3C

Final authority for W3C standards lies with the W3C membership rather than a review board.
Specifications must be accepted by the membership through a formal approval process that focuses on consensus. While the need for approval from the membership serves to broaden the base of authority, the W3C's membership cannot be considered representative. The W3C has 390 members.

IETF

As RFC 2026 (The Internet Standards Process) describes the ideal, "an Internet Standard is a specification that is stable and well-understood, is technically competent, has multiple,


independent, and interoperable implementations with substantial operational experience, enjoys significant public support, and is recognizably useful in some or all parts of the Internet."12 Essentially, for the IETF an implementation is a standard. The process of diffusion is the process of standardization. Particularly in the arena of defaults and detailed specifications, the IETF process is explicit in its desire to let the implementation and diffusion processes drive the specification. In contrast, details are embedded in IEEE-SA and ITU-T standards. The organization of the standards bodies was a significant element in the OSI seven-layer stack, which was doomed by its lack of simplicity in the contest for diffusion and dominance with the far more flexible TCP/IP and UDP/IP.

This flexibility enables rapid responses and encourages rapid prototyping and proofs of concept. Technical documentation of implemented devices and evangelical declarations of working services are an element of IETF discourse. By requiring implementation the IETF addresses issues of flexibility and efficiency in the most applied manner possible. Theoretical disputes about functionality are resolved in implementation details. A secondary effect of the implementation focus of the IETF standards process is that it creates an inherent hierarchy based on willingness to invest in a particular question. This has both positive and negative implications, as can be seen in the standards that have languished for many years in the IETF. On the positive side, individuals with a vision can embark on any potential standard if there is enough interest; the standards group need not die.

Yet in fact the decision-making process in the IETF is far from transparent. Working groups are created by explicit task-specific charters. Charters are granted by IETF Area Directors. Area Directors can create (or refuse to create) groups.
Area Directors have the option of adding a consultant who provides guidance to the group in addition to a Chair. Any person can propose a group. However, the proposal must be approved by the Area Director and the Internet Engineering Steering Group. The IETF Secretariat performs all administrative functions. This raises the question of the membership of the Steering Group. The Steering Group consists of members nominated by a committee and approved by the Internet Architecture Board. Anyone can be nominated, and self-nominations are permitted. Once nominations are accepted, the Nominations Committee votes to make recommendations to the Internet Architecture Board, which then confirms the selection. If the membership strongly objects to a selected person, then it is possible for that person to be recalled. A recall requires approval of 75% of the members who vote on the question, with the vote taken at an IETF meeting.
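The chartering and approval machinery described above ultimately feeds the standards-track maturity ladder of RFC 2026 (Proposed Standard, Draft Standard, Internet Standard), in which running code gates advancement. The level names below come from RFC 2026; the helper function is our own illustrative paraphrase of its advancement criteria, not IETF code:

```python
# Simplified model of the RFC 2026 standards-track maturity ladder.
# The level names are from RFC 2026; this helper is an illustrative
# paraphrase of the advancement criteria, not an official IETF artifact.

LEVELS = ["Proposed Standard", "Draft Standard", "Internet Standard"]

def can_advance(level: str, independent_implementations: int,
                operational_experience: bool) -> bool:
    """Return True if a specification at `level` meets the (paraphrased)
    RFC 2026 criteria to advance to the next maturity level."""
    if level == "Proposed Standard":
        # Draft Standard requires at least two independent,
        # interoperable implementations ("running code").
        return independent_implementations >= 2
    if level == "Draft Standard":
        # Internet Standard additionally requires substantial
        # operational experience and demonstrated utility.
        return independent_implementations >= 2 and operational_experience
    return False  # Internet Standard is the final level

# A spec with a single implementation cannot leave Proposed Standard:
print(can_advance("Proposed Standard", 1, False))  # False
# Two interoperable implementations plus field experience suffice:
print(can_advance("Draft Standard", 2, True))      # True
```

The point the sketch makes concrete is the one argued in this section: advancement is decided by the existence of implementations, so diffusion and standardization are the same process.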

12 Scott Bradner, "The Internet Standards Process: Revision 3," updated October 1996, <ftp://ftp.isi.edu/in-notes/rfc2026.txt> (cited 10 January 2000).


At the IETF, the initiation of a standard is presented to the community by the RFC editor. Originally RFC stood for, and meant, request for comments. Currently a standard is effectively accepted when it reaches RFC stage. The editor of the RFCs is a powerful person who can block or release RFCs. The initial editor was Jon Postel, also the manager of the Domain Name System. Now RFCs are edited by the Internet Society. There is no documentation on the selection of RFC editors, nor could the authors determine their identities from publicly available information.

The system depends upon the integrity of the Internet Architecture Board, an organization with many founding members approaching retirement in the next decade. The next generation is grounded not in the tradition of engineering in the academy, but in the less firm ground of the dot com mania. Indeed, some of the founding members have left public service, with, for example, Vint Cerf moving from the Department of Defense's Advanced Research Projects Agency to the Corporation for National Research Initiatives and then to MCI WorldCom. Regardless of the individual, there are structural democratic issues with an organization headed by an employee of a major corporation as opposed to a career public servant. The personal integrity of Vint Cerf provides assurance to all IETF members; however, the precedent will remain after the individual has departed. There is no reason to believe that the next generation IAB will be of the quality of the current IAB, as the contenders for the post have excelled in a highly political and economically charged environment, rather than in an open environment of competitive expertise. The Internet Society holds copyright on RFCs, and the Internet Society President appoints the non-voting Chair of the Nominating Committee. Internet Society Board Members and Internet Architecture Board members cannot sit on the nominating committee.
The Internet Society was at one time a member organization, in that members chose the Board and were provided with regular policy updates. That is no longer the case; the Internet Society is currently an industry consortium that defines its members as companies. Given the importance of the Internet Society and the IAB, the W3C and the IETF are most similar: neither has significant public participation in the approval process, but both organizations may respond to public opinion.

Summary

These models do not offer a single correct answer to the question of developing standards for the Internet and the information society. The IEEE is governed by a Board elected by individuals and most closely represents the democratic ideal. The W3C and the IETF function as industry consortia, which are effective in creating standards. However, unlike most consortia, these provide some degree of open records and thus the ability of the public to respond. The IETF has many participants, yet no individual member has full voting power over those who approve the final standards, since the IETF is an organized activity of the Internet Society. The ITU is the most formally democratic, yet the scale of the population to be represented results in so many layers as to create an effectively isolated organization. Thus, ironically, the most elite body (the IEEE) is the most democratic, and the bodies least transparent in approval (the W3C and IETF) are the most transparent in process.


WHAT IS THE NATURE OF THE ORGANIZATION AND THE STANDARD?

In addition to the democratic nature of the standard setting process, we must also consider the inherent qualities or characteristics of a hypothetical standard that would be produced by each model. Depending on the model used, a standard would differ in terms of (1) openness, (2) security, (3) privacy, and (4) interoperability.

Openness

In the context of this paper, openness refers both to whether the standard is distributed without cost and to the standard's technical openness. Once obtained, do users have complete access to the standard's specifications, so that they can use it in any way they want and develop or build something compatible with the standard? In comparing the four models in question, it is important to reiterate that openness does not mean free of charge. Both the ITU and the IEEE-SA charge a fee for the documentation that details the standards they develop. Once the documentation is purchased, however, the user is free to use the standard in any way they wish. The user has access to the standard and its specifications, and can use this information to construct systems that are compatible with these standards.

This is not to say that restricting access to standards by charging a fee and maintaining intellectual property rights has no implications for the standard in question. The fees that the ITU charges for documentation, for example, can effectively serve as a barrier to use and adoption by innovators in developing nations and not-for-profit institutions. The ITU accepts patented content in its standards on the basis of RAND. The IEEE-SA has a patent policy that allows RAND licensing for patents contained in a standard. However, the alternative and preferred arrangement is that the owner of the patent will not assert ownership (including requiring payment) against anyone using the patented technology solely to implement the IEEE standard.
Different working groups may choose to accept or decline technologies that will be subject to ownership and licensing fees. This flexibility enables software standards groups to follow policies distinct from those of hardware standards groups. By maintaining intellectual property rights over their standards, the ITU and the IEEE SA can, through their licensing terms, limit the dissemination, and arguably the evolution, of their standards within the Internet community. Because there is a free software community (there is no comparable 'free coaxial cable' or 'free accelerometer' community), patent policies that worked well for physical standards can harm the open code community. The economics of software create an open source and free software possibility that does not exist for physical standards, and the nature of software creates transparency issues that do not exist for physical standards. Both the ITU and the IEEE SA were designed, as organizations, to create hardware standards. The IEEE SA's work on voting standards, however, illustrates how a process that is open in theory can be closed in practice. IEEE members must apply to be members of IEEE Voting Equipment Standards Project 1583 by downloading and completing an application, and participation is not
certain. This is a case of regulatory capture, where the putatively regulated institutions define the standards. Similarly, American members of the IEEE are free to join any USA policy committee except the Intellectual Property Committee (IPC). In the case of the IPC, members must apply, become non-voting consulting members, and are then allowed to become full members only after having been vetted. In fact, one member (Mike Godwin) was summarily removed during a debate about possible IEEE support for a bill that would have federally mandated digital rights management software. This control explains why the IEEE USA IPC has consistently supported expansion of intellectual property rights, including those that favor content companies (e.g., movie studios, recording companies) over communications and computer companies. Thus, despite very strong IEEE policy on the rights of member participation, these policies can be subverted when the political stakes are perceived as very high. IEEE has a policy of openness, but inadequate policing mechanisms for that policy. The IETF allows RAND or more stringent license requirements in its technology. A list of patents in IETF standards can be found at http://www.ietf.org/ipr. The method of pricing can prohibit adoption and future development. In fact, some have argued that the driving force behind the adoption of Internet protocols such as TCP/IP was their open and free availability.13 The point here, however, is that a standard can be open and not free of cost. The IEEE SA and ITU traditions of hardware standards have produced policies that fail to take full advantage of the possibility of openness in software standards. There are no physical products that are closed in the same sense as closed code. The IEEE SA and ITU standards bodies are built on the assumption that wires can be sliced and spliced, and electronics disassembled.
IEEE USA policy explicitly supports reverse engineering; the IEEE SA simply assumes it in practice, without a written policy. The W3C makes the specifications of its standards freely and openly available to users; however, the number of voices in standards development is smallest in the W3C. The W3C attempts to mitigate its closed, industrial nature by providing more detailed standards. Like the IETF, the W3C includes running code in its standards before they are approved. Therefore, in addition to the raw specifications, users have access to implementations illustrating how the standard can be used in the real world. This increases interoperability, as there is an exact specification. Yet the W3C can under-specify information in its standards, and remain vague about those specifications even in implementations. For example, identical statements about privacy as expressed in the Platform for Privacy Preferences can result in different outcomes. This results from the possibility that one data element might fall into two categories (buying a book is both shopping and reading habits, for example), with no specification of which takes priority. The IETF standards process is similarly focused on the need for running code – "to fly before you buy."14 The standard follows the implementation of code, and the standard and code are

13. Eisner Gillet and Kapor, 8.
14. Bradner (1999), 51.
debated and altered as one. A discussion about a standard may be embodied in a code snippet or in prose. In terms of technical openness, the IETF provides the broader community access to the documents and debates that lead to the final specifications. In the other organizations, the discussions that lead to rejections, alterations, or approvals are not available: the W3C and the IEEE SA have internal documents that record these debates, but they remain private. IETF documents, including not only Technical Specifications and Applicability Statements but also mailing list archives and draft papers, are available to all over the Internet. These documents give users a more complete understanding of the standard and what it is designed to do. As Bradner stresses, "restricting access to work-in-progress documents makes it harder for implementors to understand what the genesis and rationale is for specific features in the standard, and this can lead to flawed implementations."15 The IEEE SA has had to struggle with flawed implementations; however, no organization has been plagued by this problem more than the ITU. The W3C, despite being a consortium, has the most open patent policy and provides model code. The IEEE SA is second, with the ITU supporting RAND. The IETF is unpredictable, with some quite open standards and others requiring significant licenses that may or may not be forthcoming from the owner of the patent.

Security

The security of a standard derives from three separate but related sources: the number and quality of the people who developed it; the quality of the review; and the maturity of the standard. On the surface it appears that the open standards developed through each of these four models are extremely vulnerable to security breaches: with full access to the specifications, malevolent users can examine the documentation to find holes and then exploit them.
While not working with criminal intentions, researchers at Purdue University highlighted the potential danger when they revealed a significant hole in the Kerberos network authentication system, a highly regarded open standard, more than ten years after it was first released. Although a patch was quickly distributed, for more than ten years hackers could have used this hole to penetrate Kerberos-protected systems. Similar security problems have also been associated with other high-profile open projects such as sendmail and fingerd.16 Without dismissing these examples, however, it is clear that open standards are in practice more secure than proprietary standards (e.g., MITRE, 2003). By allowing all interested individuals to view the specifications, vulnerabilities can be discovered and addressed before they are exploited. Unlike the "security through obscurity" model of proprietary standards, which assumes a hole is a problem only when it is revealed, the open standards model proactively seeks to develop standards that are free of vulnerabilities, including logical errors and coding errors. The inherent security problem with proprietary standards has been well illustrated by the continuing plague of worms attacking proprietary systems. Blaster, Slammer and more recently

15. Ibid., 52.
16. Simson Garfinkel, "Fiascos," Wide Open News, 14 November 1999, http://www.wideopen.com/story/102.html (cited 5 January 2000).
Sasser have illustrated the chronic failures in security. The difficulty of finding vulnerabilities is best illustrated by the fact that the average time between vulnerability exposure and worm attack is three months. Only in one case has malicious code been released that exploited a previously unknown vulnerability, and that was the Morris worm, written by an extremely talented graduate student who went on to become faculty at MIT. While few cases are publicized, a 1999 break-in at the CD Universe web site, at the height of the boom, retrieved more than 300,000 credit card numbers. After Cybercash (the proprietary vendor of the secure transaction software) and CD Universe hired security consultants to try to close the hole, they still did not know how the hacker got into the system. Unlike the Kerberos example, where a hole was discovered and patched by an open community of developers, CD Universe was at the mercy of its proprietary vendor and a handful of consultants scouring millions of lines of code. In short, security is partially a function of the number and diversity of people developing, testing, and using the standard; "given enough eyeballs, all bugs are shallow." Thus the implementation is most critical to a secure standard because of the potential for review. From this perspective the W3C and the IETF provide the most secure standards, as the implementations can themselves be examined and adopted. Shah argues that security and privacy are also a function of the organization of an institution: in a study of cookies, the security and privacy features were added by not-for-profit open institutions, while the features that promised gain but created risk were instituted by for-profit institutions (Shah & Kesan, 2004). The IETF and W3C standards processes also enhance security by requiring independent and interoperable implementations before a standard is approved.
This step in the process ensures that a standard has moved beyond the idea stage and has been demonstrated to work in practice.17 Used and tested by more people in more situations, such standards are more likely to have been purged of security flaws. The IEEE tries to increase the number and diversity of participants by enabling government agencies and other organizations to join the IEEE-SA. Similarly, by posting drafts for public comment, the W3C tries to broaden its development base. All four organizations can find their intentions undermined by software producers that create flawed products; only open code built into an open process can mitigate this risk. The degree of openness is hotly contested, with some supporting limited code review and others advocating that only completely open (known as free as in libre, not gratis) software is acceptable.

Privacy

Privacy concerns related to technical standards have traditionally focused on the potential for malevolent hackers to intercept personal information while it is being transmitted, or to retrieve it from vast databases (e.g., CD Universe). Essentially, protecting privacy has been associated with improving security. In addition, privacy requires transparency, so that individuals can know what

17. Bradner (1999), 51.
information about them is collected and compiled (Camp, 1999). In technical terms, privacy can be implemented with anonymity (e.g., Chaum, 1992). With the rise of the World Wide Web, however, the surreptitious aggregation of data is a growing risk to personal privacy. As online commercial sites become more popular, databases of personal information are growing. Moreover, most visitors do not even realize what information is in their personal profiles. Beyond the books that they buy and the tickets that they reserve, these companies are also collecting information based on audits of where people click, how long they stay there, and from what IP address they connect. Combined with data from other sources (authorized or unauthorized), the collection and manipulation of this information can be viewed as a significant and increasing risk to personal privacy. In this context, the importance of transparency for individual privacy was highlighted in April 1999 by the way federal law enforcement officials were able to track down the person who originated the Melissa virus. Federal officials, with the help of a computer hacker from Cambridge, MA, were able to identify and locate the computer on which Melissa was originally programmed using the computer's serial number (or globally unique identifier) that was surreptitiously embedded in all Microsoft Word documents.18 At the core of this debate is the principle of transparency: individuals must be aware of all the data that is collected about them, and must have the ability to check the information's validity. By creating and collecting information (the unique identifier) without making people aware, Microsoft was violating the principle of transparency entrenched in both the U.S. Code of Fair Information Practice and the European Union's Directive on the Protection of Personal Data. The open standards produced by each model enhance transparency, and therefore privacy, relative to proprietary standards.
Since the specifications for the standards are open to all users, it is not possible to collect personal information surreptitiously. The IETF's practice of making all working documents openly available further enhances privacy: not only can people look at the specifications, but they can review the debates and decisions that led to the standard, thereby better understanding what each feature of the standard is designed to accomplish. The IETF and W3C practices of providing open code implementations ensure that there can be no misunderstanding of the intention of the standard. Despite this, the W3C and IETF have standards that threaten privacy. The W3C's ongoing experience with P3P (the Platform for Privacy Preferences) is an example to watch carefully. While industry leaders such as Microsoft (a W3C member) and Netscape appear likely to integrate P3P into their browsers, there is significant criticism of the standard from the Internet community and from groups such as the Computer Professionals for Social Responsibility (CPSR).19 Commentators have noted that the Platform for Privacy Preferences, in its early and current implementations, functions more effectively to increase the efficiency of the harvesting of user data. Regardless, the P3P experience highlights the risk to privacy created by the W3C's decision to maintain a closed process dominated by a small number of corporate members. If a standard does not enjoy widespread support it may not be

18. John Markoff, "When Privacy is more perilous than the lack of it," New York Times, 4 April 1999, Section 4, Page 3.
19. Computer Professionals for Social Responsibility, "Some Frequently Asked Questions About Data Privacy and P3P," updated 7 November 1999, http://www.cpsr.org/program/privacy/p3p-faq.html (cited 21 December 1999).
accepted or adopted by other standards-setting organizations. In the case of P3P, users did not choose to adopt the technology as a stand-alone product. However, major browser vendors have installed P3P in their browsers, so W3C members have used their products to diffuse this standard into the marketplace. There is debate over this choice: some argue that it enhances privacy, that half a loaf is better than none; others argue that automating information flow decreases privacy. AT&T has used P3P to develop a simple interface that identifies privacy policies as good, acceptable, very bad, or non-existent. This "privacy bird" uses P3P to offer information to users without automating or simplifying information exchange. The IETF is working on a directory service for ENUM. ENUM is designed to create a database that will provide identifying information for all users: it would correlate telephone numbers and email addresses, providing a single universal look-up point. ENUM would function only if providers updated the information; requiring users to make entries themselves would prevent widespread efficacy. The IETF has supported the functioning of the whois database, requiring that records be made readily available to the public, over the outcry of civil society advocates and civil libertarians, despite the continuing misuse of those records by parties ranging from spammers to overly aggressive intellectual property lawyers. The IETF arguably requires more than the necessary technical contact information: billing information is also required although there is no technical reason for it. In terms of privacy, the IEEE SA has not implemented privacy-threatening standards. However, this may be more a result of the low-level, physical nature of the standards created by the IEEE SA than of its commitment to privacy. Among all the organizations, only IEEE has a stated commitment to privacy and open code. Yet these commitments were made by IEEE USA, the American policy arm of the IEEE, and not by the IEEE SA.
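The ENUM mapping described above is mechanical enough to sketch. The function name below is our own, and the example number comes from the UK range reserved for fiction; the algorithm itself (strip non-digits, reverse the digits, dot-separate them, and append e164.arpa) is the one RFC 3761 specifies. A resolver would then query DNS NAPTR records at the resulting name to obtain the user's contact URIs, which is why a single telephone number can become a universal look-up key:

```python
def enum_domain(e164_number: str) -> str:
    """Map an E.164 telephone number to its ENUM domain name (RFC 3761).

    The number's digits are extracted, reversed, separated by dots,
    and suffixed with the e164.arpa zone.
    """
    digits = [c for c in e164_number if c.isdigit()]
    return ".".join(reversed(digits)) + ".e164.arpa"

# +44 20 7946 0123 is a UK number reserved for fiction/examples.
print(enum_domain("+44 20 7946 0123"))
# -> 3.2.1.0.6.4.9.7.0.2.4.4.e164.arpa
```

The reversed-digit form mirrors DNS delegation: each digit is a label, so authority over a block of numbers can be delegated exactly as authority over a subdomain is.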
However, other elements of the IEEE (including IEEE publications) have altered policies to conform to IEEE USA policy recommendations. The outcome of the IEEE Voting Equipment Standards Project 1583 will be a test of the IEEE SA process. The voting standard can enhance privacy and security, or threaten either. Currently the voting standard is hotly contested, with an early vendor-friendly and unreliable draft having been defeated. As of this writing the process is not complete.

Interoperability

A final characteristic to consider is the degree to which a hypothetical standard produced by each of these models would be interoperable with other standards. Without a centralized body with the authority to dictate standards across political or functional boundaries, how can a government be sure that the system it is building will be compatible with other systems and with evolving standards? The interaction between the IEEE SA and the IETF offers an interesting case study in the interoperability of standards. The IEEE publishes the 802 family of specifications that define standards for local area networks (LANs) at the physical and data link layers. Since the specifications of these standards are open to those who purchase them, the IETF was able to study them and define new standards that are compatible with the old. For example,
RFC 1042 defines how IP datagrams and ARP requests and replies can be transmitted over 802 networks.20 By making standards open, each of these models makes it more likely that its standards will be considered by other bodies looking to set standards. As the GSM/CDMA debate highlights, however, openness does not ensure future interoperability. While the GSM (Global System for Mobile communications) standard for digital mobile systems is used throughout Europe, the dominant standard in the Americas is CDMA (Code Division Multiple Access). In fact, including PDC and D-AMPS, there are four dominant standards for digital phone systems, none of which is compatible with any of the others.21 It will be interesting to see whether the ITU is more successful with its IMT-2000 standard for third generation (3G) systems.

Table 1: Digital Mobile Standards

                Americas          Europe, Africa,    Asia-Pacific
                                  Middle East        Japan        Others
                CDMA              GSM                PDC          GSM
                D-AMPS            DCS1800            CDMA         CDMA
                PCS1900           DECT               PHS          CT2
                CT2, PWT, PACS    CT2
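The RFC 1042 case mentioned above can be made concrete. The RFC fixes an eight-byte 802.2 LLC/SNAP header under which IP and ARP ride on any 802 MAC; because the 802 specifications were available for study, the IETF could layer on top of them without coordination. A minimal sketch (the function name is ours; the header bytes are those the RFC prescribes):

```python
def snap_header(ethertype: int) -> bytes:
    """Build the 8-byte 802.2 LLC/SNAP header that RFC 1042 prescribes
    for carrying IP datagrams and ARP packets over IEEE 802 networks.
    """
    llc = bytes([0xAA, 0xAA, 0x03])    # DSAP, SSAP, Control (UI frame)
    snap = bytes([0x00, 0x00, 0x00])   # OUI of zero: an EtherType follows
    return llc + snap + ethertype.to_bytes(2, "big")

IP_ETHERTYPE = 0x0800
ARP_ETHERTYPE = 0x0806
print(snap_header(IP_ETHERTYPE).hex())  # -> aaaa030000000800
```

The zero OUI signals that the two trailing bytes are an ordinary EtherType, so the same IP and ARP type codes used on Ethernet work unchanged on token ring or any other 802 medium.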

As Scott Bradner points out, "it is only the standards that meet specific real-world requirements and do well that become true standards in fact as well as in name."22 Again, the W3C and IETF practice of creating "standards in fact" is likely to make future standards interoperable. However, that practice has also prevented those organizations from being effective standards bodies in some domains. The W3C and the IETF would be unable to set standards in more mature markets, where example code is more of a market threat than an indicator. For example, neither the W3C nor the IETF has been able to produce a widely adopted personal identifier protocol. An industry-led process notoriously failed when FCC policy allowed the market to create a standard for AM stereo, as the standard had to precede the market.

CONCLUSION

The goal of this paper is not to promote one standards process as the "right" standards process, but to present the strengths and weaknesses of the various standards bodies. As Table 2 summarizes, while there are similarities between them, the four standards processes differ on several fronts. If we interpret these standards processes as models of governance, the differences are especially significant. A government must consider the varying degrees of participation, transparency, and accountability embodied in each model when determining which are acceptable in the context of a democratic society.

20. J. Postel and J. Reynolds, "A Standard for the Transmission of IP Datagrams over IEEE 802 Networks," updated February 1998, ftp://ftp.isi.edu/in-notes/rfc1042.txt (cited 17 December 1999).
21. ITU Telecom Conference, "Backgrounder: Third Generation Mobile," www.itu.int/telecom-wt-99/homepage.html (cited 24 November 1999).
22. Bradner (1999), 47.
In the case of computer code or software standards, the traditional bifurcation between the standard as documentation and the product as physical instantiation does not hold. Standards bodies can utilize the dual nature of code and standard to create a more open and dynamic process. The ability of any individual to offer a coded implementation of a standard offers an unofficial opportunity to influence the dynamics of the market (Shapiro and Varian, 1999). Of course, the continuum between categorization (or implementation) and traditional standards has long been recognized (Bowker and Star, 1999). Yet ICT standards bring this into particular relief, as code is speech (Lessig, 1999). Openness provides checks and feedback on security and privacy; security and privacy are correlated with openness in Table 2 below.

Table 2: Comparison of the standards bodies

Participation
  ITU:  National governments and corporations
  W3C:  Paying corporate and selected academic members
  IEEE: Fee-paying engineers
  IETF: All interested individuals

Transparency
  ITU:  Agenda and Recommendations
  W3C:  Agenda and periodic working drafts
  IEEE: Agenda and standard
  IETF: All working documents, meetings, and email lists

Authority
  ITU:  Review committee (membership)
  W3C:  Fee-paying members
  IEEE: Review committee (membership)
  IETF: IAB and Internet Society

Openness
  ITU:  Open specs; RAND patents
  W3C:  Open specs; open implementations; no licensing fees
  IEEE: Open specs; from no licensing fees to RAND
  IETF: Open specs; open implementations; broad range of patent policies

Privacy
  ITU:  Industry driven, balanced by EU and government participation
  W3C:  Industry driven, little concern for privacy
  IEEE: Commitment to privacy in principle
  IETF: Industry driven, little concern for privacy

Interoperability
  ITU:  Standards often vague, subject to varied interpretations
  W3C:  Implementation promotes interoperability
  IEEE: Openness promotes interoperability; standards subject to interpretation
  IETF: Implementation promotes interoperability

In addition to democratic adoptability, the choice of standards process has consequences with respect to technical openness, security, privacy, and interoperability. On each front, the four models presented are superior to proprietary standards, because of the dynamic interaction between security, openness and privacy (Camp, 1999). Open standards are more secure, have the potential to enhance privacy, and are more likely to be interoperable with future standards. As such, provided one of the models is consistent with the democratic requirements of public sector decision making in a specific context, a government does best to choose a model that produces open standards. As this paper illustrates, however, openness is not a binary condition. In addition to open specifications, the W3C and IETF models offer the user open implementations, while the IETF adds open access to all working documents and mailing list archives. Yet the IETF is the least open with respect to the patents addressed by each standard. Clearly the IETF is no panacea. The IEEE SA and the ITU must
adapt to the new realities of code, particularly in allowing open code to thrive. The W3C and the IETF are converging into similar industry consortia. A useful generalization is that the W3C and the IETF respond quickly to the needs of an evolving marketplace, while the ITU and the IEEE SA enable the creation of new markets but have been unable to respond to extant and quickly changing markets. The table above illustrates that each body has specific strengths unmatched by the others. Depending on the state of the industry in question, the maturity of the market, and the nature of the standard, different bodies are appropriate. The IETF will not eventually be taken over by the ITU; similarly, the IEEE-SA and the W3C will continue their complementary practices.

Bibliography

Bessen, James. "What Good is Free Software?" In Robert W. Hahn (ed.), Government Policy toward Open Source Software. Washington, D.C.: Brookings Institution, 2003, pp. 12-33.
Bowker, Geoffrey C., and Susan Leigh Star. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press, 1999.
Bradner, S. "The Internet Standards Process: Revision 3." Updated October 1996, ftp://ftp.isi.edu/in-notes/rfc2026.txt (cited 10 January 2000).
Branscomb, Lewis, and James H. Keller. Converging Infrastructures. Cambridge, MA: MIT Press, 1996.
Camp, L. Jean. Trust and Risk in Internet Commerce. Cambridge, MA: MIT Press, 1999. Draft available at http://www.ljean.org/trustRisk.
Chaum, D. "Achieving electronic privacy." Scientific American, Vol. 267 (1992), pp. 76-81.
Compaine, Benjamin M. Issues in New Information Technology. Norwood, NJ: Ablex Publishing, 1988.
Computer Professionals for Social Responsibility. "Some Frequently Asked Questions About Data Privacy and P3P." Updated 7 November 1999, http://www.cpsr.org/program/privacy/p3p-faq.html (cited 21 December 1999).
DiBona, Chris, Sam Ockman, and Mark Stone. Open Sources: Voices from the Open Source Revolution. Cambridge, MA: O'Reilly & Associates, 1999.
Etzioni, Amitai. The Limits of Privacy. New York: Basic Books, 1999.
Garfinkel, Simson. "Fiascos." Wide Open News, 14 November 1999, http://www.wideopen.com/story/102.html (cited 5 January 2000).
Institute of Medicine, Committee on Improving the Patient Record, Division of Health Care Services. The Computer-Based Patient Record: An Essential Technology for Health Care. Washington, D.C.: National Academy Press, 1991.
Internet Architecture Board and Internet Engineering Steering Group. "The Internet Standards Process -- Revision 2." RFC 1602, IAB, IESG, March 1994.
ITU Telecom Conference. "Backgrounder: Third Generation Mobile." www.itu.int/telecom-wt-99/homepage.html (cited 24 November 1999). International Telecommunications Union, 1999.
Kahin, Brian, and James H. Keller. Coordinating the Internet. Cambridge, MA: MIT Press, 1997.
Kaner, Cem, and David Pels. Bad Software: What to Do When Software Fails. New York: Wiley Computer Publishing, 1998.
League for Programming Freedom. "Against Software Patents." Communications of the ACM, 35 (July 1992). http://www.cise.ufl.edu/~gmh/ethics/patent/against-software-patents.html.
Lee. "Perspectives on the Future of Internet Navigation." Meeting of the Committee on Internet Navigation and the Domain Name System: Technical Alternatives and Policy Implications, February 28, 2002, Harvard.
Lessig, L. Code and Other Laws of Cyberspace. New York: Basic Books, 1999.
Markoff, John. "When Privacy is more perilous than the lack of it." New York Times, 4 April 1999, Section 4, Page 3.
MITRE. Use of Free and Open Source Software in the US Department of Defense. Bedford, MA: MITRE, January 2, 2003.
National Research Council, Computer Science and Telecommunications Board. For the Record: Protecting Electronic Health Information. Washington, D.C.: National Academy Press, 1997.
National Research Council, System Security Study Committee. Computers at Risk. Washington, D.C.: National Academy Press, 1991.
Neumann, Peter G. Computer-Related Risks. New York: Addison-Wesley, 1995.
Postel, J., and J. Reynolds. "A Standard for the Transmission of IP Datagrams over IEEE 802 Networks." Updated February 1998, ftp://ftp.isi.edu/in-notes/rfc1042.txt (cited 17 December 1999).
Shah, R. C., and J. P. Kesan. "Nurturing Software: How Societal Institutions Shape the Development of Software." Communications of the ACM (forthcoming).
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=519024
Cargill, C. Information Technology Standardization: Theory, Process, and Organization. Digital Press, 1989.
Hawkins, R. "Standards for communication technologies: negotiating institutional biases in network design." In R. Mansell and R. Silverstone (eds.), Communication by Design (pp. 157-186). Oxford: Oxford University Press, 1998.
Pincus, A. J. "The role of standards in growth of the global electronic commerce." Testimony before the United States Senate, Committee on Commerce, Science and Technology, Subcommittee on Science, Technology and Space, 1999. Retrieved March 15, 2004, from http://www.ogc.doc.gov/ogc/legreg/testimon/106f/pincus1028.htm.
Schmidt, S. K., and R. Werle. Coordinating Technology: Studies in the International Standardization of Telecommunications. Cambridge, MA: MIT Press, 1998.
Shapiro, Carl, and Hal Varian. Information Rules. Boston, MA: Harvard Business School Press, 1999.
Weiss, M., and C. Cargill. "Consortia in the standards development process." Journal of the American Society for Information Science, 43(8) (1992), 559-565.
Updegrove, Andrew. "Standard Setting and Consortium Structures." StandardView, December 1995, 143-144.
David, Paul A., and Mark Shurmer. "Formal Standards-Setting for Global Telecommunications and Information Services." Telecommunications Policy, 20 (1996), 789.