Ruling the Root.

Author: Zittrain, Jonathan
Position: Book Review

Ruling the Root, Milton L. Mueller, Cambridge, Mass.: MIT Press, 2002, 301 pages.

In the spring of 1998, the U.S. government told the Internet: Govern yourself. (1) This unfocused order--a blandishment, really, expressed as an awkward "statement of policy" by the Department of Commerce, carrying no direct force of law--came about because the management of obscure but critical centralized Internet functions was at a political crossroads.

In Ruling the Root, (2) Milton L. Mueller thoroughly documents the colorful history both before and after this moment of inflection, and gives a fair appraisal both of the interests at stake and of the ways in which those interests have influenced the course of that history. It is clear that he laments the domination of latter parts of the tale by the Internet Corporation for Assigned Names and Numbers ("ICANN"), a California nonprofit created expressly to answer the U.S. government's 1998 challenge. Mueller finds ICANN to be at best a rigid, bureaucratic clamp on the innovation that had previously marked the Internet space, and at worst an instrument of old-guard "corporatist" interests bent on maintaining (or replicating from other fields such as broadcast and telephone) artificial scarcity of resources for the purpose of concentration of control. The questions left open by Mueller's inquiry are important, and to understand both these and the depth of Mueller's frustration with ICANN, it helps to reflect on the unusual way in which coordination of the global Net came about.

The Net had developed as a means of sharing information among anyone who could hew to its protocols, and the authors of those protocols were computer scientists and engineers of various stripes, engaged in a loose collective enterprise funded without expectations of direct profit by their commercial and nonprofit employers, and by the U.S. National Science Foundation and Department of Defense. Loose, yes, but also with elements of self-governance: The engineers who worked on Net protocols came to know each other, to name themselves the Internet Engineering Task Force ("IETF"), to select leaders from among themselves, and to agree upon processes by which to reach closure on contested issues so that the network could continue to develop. (3) "Rough consensus and running code" (4) was the guiding creed: the latter provided an objective metric by which to evaluate competing technical protocols; the former provided a reminder that since many advances in a given area of internetworking required everyone using that subsystem to agree upon a particular protocol, less than unanimity on its details ought to be enough to constitute "agreement" for all concerned. Moving forward was more important than moving forward perfectly.

Mueller refines this account by noting tensions between the rank-and-file engineers of the IETF and those who emerged as their leaders. Among the alphabet soup of organizations involved in Internet engineering through the 1990s with overlapping sets of participants, the Internet Architecture Board ("IAB") and the Internet Society--intended in their own respective ways to serve as political or legal leadership for the informal IETF collective--quickly ran into internal challenges to their legitimacy generally, and to their methods of governing themselves specifically. (5) Mueller highlights here what he calls the "technical cadre's allergy to democratic methods and public accountability," (6) and suggests, rightly, that this had a significant influence on ICANN's later formation.

Democratic or not, the IETF standards process has been responsible for the spectacularly successful bedrock elements on which the Internet functions, ranging from core logical layer Transmission Control Protocol/Internet Protocol ("TCP/IP") networking to e-mail interchange to clock synchronization. Each of these reflects the desire to have all networked entities interacting according to the same protocols--enabling different brands and platforms of computers to communicate with each other without bulky translation software or gateways.

In addition, certain elements of the Internet created through IETF processes--most notably, its naming and numbering schemes, which suppose the existence of unique identifiers such as www.cnn.com and 128.36.0.19 distributed among various Internet users--require some form of central coordination, if only to ensure that uniqueness. Without central coordination, two different Net destinations might advertise themselves as possessing identical numeric or named identities, and data packets would be misrouted on their way across the Net's topography to a particular destination. The coordination could, in theory, take the simple form of a commonly shared list of those identifiers already "taken" by someone, and indeed the "first-come, first-served" nature by which many domain names are available to registrants reflects that idea. But whoever maintains a common list typically has, as a technical matter, the ability simply to change the list--perhaps delisting a name whose registrant has fallen behind in yearly maintenance payments demanded by the list holder, or reassigning a name previously reserved for one registrant to another because that other has made a claim of right.
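The "commonly shared list" described above can be made concrete with a minimal sketch in Python (the class, its method names, and the example domain names are hypothetical illustrations, not drawn from the book or from any actual registry software): a first-come, first-served registry that enforces uniqueness, and whose holder can nonetheless delist or reassign entries at will.

```python
# Toy model of a central registry of unique identifiers, illustrating
# both first-come, first-served registration and the list holder's
# technical power to change the list. All names here are hypothetical.

class NameRegistry:
    def __init__(self):
        self._records = {}  # name -> registrant

    def register(self, name, registrant):
        # First-come, first-served: a taken name cannot be claimed again.
        if name in self._records:
            return False
        self._records[name] = registrant
        return True

    def lookup(self, name):
        return self._records.get(name)

    # The list holder can simply change the list:
    def delist(self, name):
        # e.g., the registrant fell behind on maintenance payments
        self._records.pop(name, None)

    def reassign(self, name, new_registrant):
        # e.g., another party has made a claim of right to the name
        self._records[name] = new_registrant


registry = NameRegistry()
registry.register("example.com", "first-comer")   # succeeds
registry.register("example.com", "latecomer")     # returns False: taken
registry.reassign("example.com", "claimant")      # holder overrides anyway
```

The sketch makes the review's point visible: uniqueness is guaranteed only because one party keeps the list, and that same party can rewrite any entry.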

When a commons is large and the decent pastures well distributed across it, there's little reason for shepherds to fight for turf. But if there's any sense of scarcity--of the commons' contents generally, or its "good areas" specifically--there must be a way of forestalling or resolving a stampede. By 1997, the Internet's supply of "good" names was thought to be drying up. Worse, certain good names were thought to be in the wrong hands. As a Web presence became a near-necessity for large businesses, the realization that corporate marquee names such as avis.com and mcdonalds.com already had been reserved, first-come, first-served, by individuals with no connections to the Avis car rental company or McDonald's hamburger restaurants caused severe and consistent consternation among famous trademark holders.

How then to create new, fertile turf such as .biz and .info when corporate interests--including those who operated the Internet service providers ("ISPs") whose networks together form the Internet's pathways--already had their hands full attempting to claim control over their corporate brand names within .com, .net, and .org?

Further, through a Byzantine tangle of cooperative agreements offered by the National Science Foundation ("NSF"), the purpose of which was to promote the development and use of networks, one company--Network Solutions--was by 1995 the registration and renewal authority for all domain names in .com, .net, and .org, sitting astride a booming and essentially monopolized business. Network Solutions also had become the technical operator of the "A root"--the "list of lists"--the authoritative set of pointers to all sublists of "second-level" domain names, i.e., those ending in .com, .au, and any other suffix. As Mueller documents, Network Solutions had made it clear to Jon Postel--a researcher and IETF leader who by acclamation among the original Net designers was...
