The Inheritance of the Digital: Ethnographic Approaches to Everyday Realities In, Of, and Through Digital Technologies.

Author: Howard, Robert Glenn

Inheritance of the Digital

In 1975, thirty-two computer hobbyists met in a garage in what would become California's Silicon Valley. This "Homebrew Computer Club" imagined a future utopia of individually owned computers that would grant everyone access to technologies that were, at the time, so expensive and technically demanding that only institutions could afford them. Bill Gates developed "software" for these hobbyists, while club members Steve Jobs and Steve Wozniak developed the "personal computer." Together they started the digital revolution that would emphasize individual access to information through small, inexpensive, individually owned devices (Howard 2012; Wozniak 1984).

In 1977, the U.S. military successfully sent "packets" of on-and-off power fluctuations between computers. Their project was born of a different vision: a distributed communication system that could survive the imagined nuclear battlefields of the Cold War. The computer code they used not only made it possible to connect computers to each other; more importantly, it allowed whole networks to be "internetted" together, so long as they adhered to the accepted protocol. That institutionally authorized protocol, Transmission Control Protocol/Internet Protocol or "TCP/IP," is still the basis of all digital networks today (Abbate 1999:130-3). Its designers imagined a digital age dominated by large institutional computing networks; TCP/IP would be the bridge through which those networks could communicate even when other means of communication had failed.
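The endurance of that protocol is easy to see: every modern operating system still speaks TCP/IP. As a minimal sketch (in Python, with an illustrative loopback echo rather than any historical software), the lines below open a TCP connection and send a few bytes; the splitting of those bytes into packets, their routing, and their reassembly are all handled by the same protocol family standardized in the 1970s.

    import socket
    import threading

    # A listening socket: the operating system's TCP/IP stack does the work.
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))   # port 0 lets the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    def echo_once() -> None:
        """Accept one connection and echo its bytes back to the sender."""
        conn, _ = server.accept()
        with conn:
            conn.sendall(conn.recv(1024))

    threading.Thread(target=echo_once, daemon=True).start()

    # Client side: TCP/IP silently splits this byte string into packets,
    # routes them, and reassembles them in order at the other end.
    with socket.create_connection(("127.0.0.1", port)) as client:
        client.sendall(b"hello, internet")
        print(client.recv(1024))  # b'hello, internet'
    server.close()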

Born of the unlikely coupling of these two very different intentions, the devices that keep us continually networked together today are the inheritance of both a vision of individual freedom and a vision of bomb-proof institutional power. With this dual ideology, a shift in the cultural meaning of information technology occurred, and participatory media became locations for the emergence of diverse, hybrid, and even conflicting voices (Turner 2006). At the same time, however, the technologies that drive our everyday network devices are quietly embedding centralizing institutional interests, at the very least in the forms of advertising and surveillance. This dual heritage has come down to us through the last forty years of sustained development of network technologies.

Until the early 1990s, "internetting" was primarily an activity for institutional computers and trained computer engineers. In 1991, an employee of the European physics laboratory CERN, Tim Berners-Lee, created an innovative way for people to share information using TCP/IP. Inspired by the anti-institutional ethos of the Homebrew Computer Club, Berners-Lee built and gave away the first web "browser," based on the cross-platform and very simple markup language Hypertext Markup Language, or "HTML." In so doing, he created what would come to be called the "World Wide Web": a web of linked pages written in this computer code.
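What such a web of linked pages looks like in practice can be sketched in a few lines of Python. The two pages below are hypothetical, and the standard-library server stands in for the far simpler software of the early Web, but the essential structure is the same: plain HTML documents joined to one another by hyperlinks.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Two hypothetical pages; each <a href> tag links one to the other,
    # forming the minimal case of a "web" of linked HTML documents.
    PAGES = {
        "/": "<html><body><h1>Home</h1><a href='/about'>About</a></body></html>",
        "/about": "<html><body><h1>About</h1><a href='/'>Home</a></body></html>",
    }

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            html = PAGES.get(self.path)
            self.send_response(200 if html else 404)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write((html or "<html><body>Not found</body></html>").encode())

    # Visiting http://127.0.0.1:8000/ in any browser lets you follow the links.
    HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()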

The next year, in 1992, the U.S. Congress passed a bill that allowed technology funded by the National Science Foundation, or "NSF," to be used for commercial purposes. Previously, technologies produced under this U.S. government program could not be used in commerce. The internet's TCP/IP backbone had been funded through the NSF, so it was now available for commercial applications. Seeing a new financial opportunity, a small startup called Mosaic Communications Corporation began searching for funding...
