Opinion: Historical Reflections

We Have Never Been Digital

Reflections on the intersection of computing and the humanities.
Figure: Honeywell advertisement, "What the heck is electronic mail?"

This column is inspired by the fashionable concept of the "digital humanities." That will be our destination rather than our starting point, as we look back at the long history of the idea that adoption of computer technology is a revolutionary moment in human history. Along the way we will visit the work of Nicholas Negroponte and Bruno Latour, whose books Being Digital and We Have Never Been Modern I splice to suggest that we have, in fact, never been digital.

The computer is not a particularly new invention. The first modern computer programs were run in 1948, long before many of us were born. Yet for decades it was consistently presented as a revolutionary force whose imminent impact on society would utterly transform our lives. This metaphor of "impact," conjuring images of a bulky asteroid heading toward a swamp full of peacefully grazing dinosaurs, presents technological change as a violent event we need to prepare for but can do nothing to avert.

Discussion of the looming revolution tended to follow a pattern laid out in the very first book on electronic computers written for a broad audience: Edmund Callis Berkeley's 1949 Giant Brains: Or Machines That Think.1 Ever since then the computer has been surrounded by a cloud of promises and predictions describing the future world it will produce.

The specific machines described in loving detail by Berkeley, who dwelled on their then-novel arrangements of relays and vacuum tubes, were utterly obsolete within a few years. His broader hopes and concerns for thinking machines, laid out in chapters on "what they might do for man" and "how society might control them," remain much fresher. For example, he discussed the potential for autonomous lawnmowers, automated translation, machine dictation, optical character recognition, an "automatic cooking machine controlled by program tapes," and a system by which "all the pages of all books will be available by machine." "What," he asked, "shall I do when a robot machine renders worthless all the skills I have spent years in developing?"

Computer systems have always been sold with the suggestion they represent a ticket to the future. One of my favorite illustrations of this comes from 1953, when W.B. Worthington, a business systems specialist, promised at a meeting of his fellows that "the changes ahead appear to be similar in character but far beyond those effected by printing." At that point no American company had yet applied a computer to administrative work, and when they did the results would almost invariably disappoint. The machines needed more people than anticipated to tend them, took longer to get running, and proved less flexible. So why did hundreds of companies rush into computerization before its economic feasibility was established? Worthington had warned that "The first competitor in each industry to operate in milliseconds, at a fraction of his former overhead, is going to run rings around his competition. There aren’t many businesses that can afford to take a chance on giving this fellow a five-year lead. Therefore, most of us have to start now, if we haven’t started already."a


Following his belief that "the ominous rumble you sense is the future coming at us," Worthington was soon to give up his staff job at Hughes Aircraft in favor of a consulting role, promoting his own expertise as a guide toward the electronic future. He had promised that "We can set our course toward push-button administration, and God willing we can get there." Similar statements were being made on the pages of the Harvard Business Review and in speeches delivered by the leaders of IBM and other business technology companies as a broad social alliance assembled itself behind the new technology.

After this initial surge of interest in computerization during the 1950s, there have been two subsequent peaks of enthusiasm. During the late 1970s and early 1980s the world was awash with discussion of the information society, post-industrial society, and the microcomputer revolution. There followed, in the 1990s, a wave of enthusiasm for the transformative potential of computer networks and the newly invented World Wide Web.


Rupture Talk and Imaginaires

Discussion of the "computer revolution" was not just cultural froth whipped up by the forces of technological change. Instead, the construction of this shared vision of the future was central to the social process by which an unfamiliar new technology became part of American work life. Patrice Flichy called these collective visions "imaginaires" and has documented their importance in the rapid spread of the Internet during the 1990s.2 Rob Kling, a prolific and influential researcher, wrote extensively on the importance of "computerization movements" within organizations and professional fields.5

Historian of technology Gabrielle Hecht called such discussion "rupture talk" in her account of the enthusiasm with which France reoriented its colonial power and engineering talent during the 1950s around mastery of nuclear technology.4 This formulation captures its central promise: that a new technology is so powerful and far-reaching it will break mankind free of history. Details of the utopian new age get filled in according to the interests, obsessions, and political beliefs of the people depicting it. That promise is particularly appealing to nations in need of a fresh start and a boost of confidence, as France then was, but its appeal seems to be universal. This dismissal of the relevance of experience or historical precedent amounts to a kind of preemptive strike on those who might try to use historical parallels to argue that the impact of the technology in question might in fact be slower, more uneven, or less dramatic than promised. Yet this fondness for rupture talk is itself something with a long history around technologies such as electric power, telegraphy, air travel, and space flight.


Enter "The Digital"

One of the most interesting of the cluster of concepts popularized in the early 1990s to describe the forthcoming revolution was the idea of "the digital" as a new realm of human experience. Digital had, of course, a long career as a technical concept within computing. It began as one of the two approaches to high-speed automatic computation back in the 1940s. The new breed of "computing machinery," after which the ACM was named, was called digital because the quantities the computer calculated with were represented as numbers. That is to say they were stored as a series of digits, whether on cog wheels or in electronic counters, and whether they were manipulated as decimal digits or the 0s and 1s of binary. This contrasted with the better-established tradition of analog computation, a term derived from the word "analogy." In an analog device an increase in one of the quantities being modeled is represented by a corresponding increase in something inside the machine. A disc rotates a little faster; a voltage rises slightly; or a little more fluid accumulates in a chamber. Traditional speedometers and thermometers are analog devices. They creep up or down continuously, and when we read off a value we look for the closest number marked on the gauge.
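To make the contrast concrete, here is a small illustrative sketch in Python (my own, not anything from the original column): the "digital" value exists only as a fixed set of digits, while the "analog" value varies continuously and is read off by finding the nearest mark on a gauge. The variable names, the temperature value, and the 0.5-degree gauge marks are all invented for the example.

    # Illustrative sketch: the same temperature held digitally (as discrete digits)
    # and read off an analog gauge (a continuous quantity matched to the nearest mark).
    # All names and values here are hypothetical.

    analog_temperature = 21.73824          # a continuously varying physical quantity

    # A digital register keeps only a fixed number of digits.
    digital_temperature = round(analog_temperature, 1)   # stored as the digits 2, 1, 7

    # Reading the analog gauge means picking the closest marked value.
    gauge_marks = [n / 2 for n in range(0, 81)]           # marks every 0.5 degrees, 0-40
    closest_mark = min(gauge_marks, key=lambda m: abs(m - analog_temperature))

    print(digital_temperature)   # 21.7 -- exact digits, nothing in between
    print(closest_mark)          # 21.5 -- nearest mark on a continuous scale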

Throughout the 1950s and 1960s analog and digital computers coexisted. The titles of textbooks and university classes would include the word "analog" or "digital" as appropriate to avoid confusion. Eventually the increasing power and reliability of digital computers, and their falling cost, squeezed analog computers out of the niches, such as paint mixing, in which they had previously been preferred. Most analog computer suppliers left the industry, although Hewlett-Packard made a strikingly successful transition to the digital world. By the 1970s it was generally no longer necessary to prefix "computer" with "digital," and consequently the word was encountered less frequently in computing circles.

"Digital" acquired a new resonance from 1993, with the launch of the instantly fashionable Wired magazine. In the first issue of Wired its editor proclaimed the "the Digital Revolution is whipping through our lives like a Bengali typhoon," just as enthusiasm was building for the information superhighway and the Internet was being opened to commercial use. Wired published lists of the "Digerati"—a short-lived coinage conservative activist and prophet of unlimited bandwidth George Gilder used to justify something akin to People‘s list of the sexiest people alive as judged on intellectual appeal to libertarian techno geeks. The magazine’s title evoked both electronic circuits and drug-heightened fervor. As Fred Turner showed in his book From Counter Culture to Cyberculture, Wired was one in a series of bold projects created by a shifting group of collaborators orbiting libertarian visionary Steward Brand.8 Brand had previously created the Whole Earth Catalog back in the 1960s and a pioneering online community known as the WELL (Whole Earth ‘Lectronic Link) in the 1980s. His circle saw technology as a potentially revolutionary force for personal empowerment and social transformation. In the early 1990s this held together an unlikely alliance, from Newt Gingrich who as House Speaker suggested giving laptops to the poor rather than welfare payments, to the futurist Alvin Toffler, U.S. Vice President Al Gore who championed government support for high-speed networking, and Grateful Dead lyricist John Perry Barlow who had founded the Electronic Frontier Foundation to make sure that the new territory of "cyberspace" was not burdened by government interference.

One of the magazine's key figures, Nicholas Negroponte, was particularly important in promoting the idea of "the digital." Negroponte was the entrepreneurial founder and head of MIT's Media Lab, a prominent figure in the world of technology whose fame owed much to a book written by Brand. Negroponte took "digital" far beyond its literal meaning to make it, as the title of his 1995 book Being Digital suggested, the defining characteristic of a new way of life. This was classic rupture talk. His central claim was that in the past things "made of atoms" had been all-important. In the future everything that mattered would be "made of bits."

As I argued in a previous column, all information has an underlying material nature.3 Still, the focus on digital machine-readable representation made some sense: the computer is an exceptionally flexible technology whose applications gradually expanded from scientific calculation to business administration and industrial control, then to communication and personal entertainment, as its speed rose and its cost fell. Each new application meant representing a new aspect of the world in machine-readable form. Likewise, the workability of modern computers depended on advances in digital electronics and on conceptual developments in coding techniques and information theory. So stressing the digital nature of computer technology is more revealing than calling the computer an "information machine."

Here is a taste of Being Digital: "Early in the next millennium, your left and right cuff links or earrings may communicate with each other by low-orbiting satellites and have more computer power than your present PC. Your telephone won’t ring indiscriminately; it will receive, sort, and perhaps respond to your calls like a well-trained English butler. Mass media will be refined by systems for transmitting and receiving personalized information and entertainment. Schools will change to become more like museums and playgrounds for children to assemble ideas and socialize with children all over the world. The digital planet will look and feel like the head of a pin. As we interconnect ourselves, many of the values of a nation-state will give way to those of both larger and smaller communities. We will socialize in digital neighborhoods in which physical space will be irrelevant and time will play a different role. Twenty years from now, when you look out of a window what you see may be five thousand miles and six time zones away …"


Like any expert set of predictions, this cluster of promises extrapolated social and technological change to yield a mix of the fancifully bold, the spot-on, and the overly conservative. Our phones do support call screening, although voice communication seems to be dwindling. Online communities have contributed to increased cultural and political polarization. Netflix, Twitter, blogs, and YouTube have done more than "refine" mass media.

As for those satellite cuff links, well, the "Internet of Things" remains a futuristic vision more than a daily reality. As the career of the "cashless society" since the 1960s has shown, an imaginaire can remain futuristic and exciting for decades without ever actually arriving.b However, when the cuff links of the future do feel the need to communicate they seem more likely to chat over local mesh networks than over precious satellite bandwidth. This prediction was perhaps an example of the role of future visions in promoting the interests of the visionary. Negroponte was then on the board of Motorola, which poured billions of dollars into the Iridium network of low-earth-orbit satellites for phone and pager communication. That business collapsed within months of its 1998 launch, and plans to burn up the satellites to avoid leaving space junk were canceled only after the U.S. Department of Defense stepped in to fund their continued operation.


Eroding the Future

Of course we never quite got to the digital future. My unmistakably analog windows show me what is immediately outside my house. Whether utopian or totalitarian, imagined future worlds tend to depict societies in which every aspect of life has changed around a particular new technology, or everyone dresses in a particular way, or everyone has adopted a particular practice. But in reality, as new technologies are assimilated into our daily routines they stop feeling like contact with an unfamiliar future and start seeming like familiar objects with their own special character. If a colleague reported that she had just ventured into cyberspace after booking a hotel online, or was considering taking a drive on the information superhighway to send email, you would question her sincerity, if not her sanity. These metaphors bundled quite different uses of information technology into a single image and distanced them from our humdrum lives. Today, we recognize that making a voice or video call, sending a tweet, reading a Web page, or streaming a movie are distinct activities with different meanings in our lives even when achieved using the same digital device.

Sociologist Bruno Latour, a giant in the field of science studies, captured this idea in the title of his 1993 book We Have Never Been Modern, published just as Negroponte began to write his columns for Wired. Its thesis was that nature, technology, and society have never truly been separable, despite the Enlightenment and the Scientific Revolution in which their separation was defined as the hallmark of modernity. Self-proclaimed "moderns" have insisted vocally on these separations while in reality hybridizing nature, technology, and society into complex socio-technical systems. Thus he asserted, "Nobody has ever been modern. Modernity has never begun. There has never been a modern world."6

Latour believed that "moderns," like Negroponte, see technology as something external to society yet also as something powerful enough to define epochs of human existence. As Latour wrote, "the history of the moderns will be punctuated owing to the emergence of the nonhuman—the Pythagorean theorem, heliocentrism … the atomic bomb, the computer. … People are going to distinguish the time ‘BC’ and ‘AC’ with respect to computers as they do the years ‘before Christ’ and ‘after Christ’."

He observed that rhetoric of revolution has great power to shape history, writing that "revolutions attempt to abolish the past but they cannot do so …" Thus we must be careful not to endorse the assumption of a historical rupture as part of our own conceptual framework. "If there is one thing we are incapable of carrying out," Latour asserted, "it is a revolution, whether it be in science, technology, politics, or philosophy…."

Our world is inescapably messy, a constant mix of old and new in every area of culture and technology. In one passage Latour brought things down to earth by discussing his home repair toolkit: "I may use an electric drill, but I also use a hammer. The former is 35 years old, the latter hundreds of thousands. Will you see me as a DIY expert ‘of contrasts’ because I mix up gestures from different times? Would I be an ethnographic curiosity? On the contrary: show me an activity that is homogenous from the viewpoint of the modern time."

According to science fiction writer William Gibson, "The future is already here—it's just not very evenly distributed."c That brings me comfort as a historian because of its logical corollary, that the past is also mixed up all around us and will remain so.d Even Negroponte acknowledged the uneven nature of change. Back in 1998, in his last column for Wired, he noted that "digital" was destined for banality and ubiquity as "Its literal form, the technology, is already beginning to be taken for granted, and its connotation will become tomorrow's commercial and cultural compost for new ideas. Like air and drinking water, being digital will be noticed only by its absence, not its presence."7


Digital Humanities

Even after once-unfamiliar technologies dissolve into our daily experience, rupture talk and metaphors of revolution can continue to lurk in odd and unpredictable places. While we no longer think of the Internet as a place called "cyberspace," the military-industrial complex seems to have settled on "cyber warfare" as the appropriate name for online sabotage. Likewise, the NSF has put its money behind the idea of "cyberinfrastructure." The ghastly practice of prefixing things with an "e" has faded in most realms, but "e-commerce" is hanging on. Like most other library schools with hopes of continued relevance, my own institution has dubbed itself an "iSchool," copying the names of Apple's successful consumer products. There does not seem to be any particular logic behind this set of prefixes, and we might all just as well have settled on "iWarfare," "cybercommerce," and "e-school." But these terms will live on, vestiges of the crisp future vision that destroyed itself by messily and incompletely coming true.

The dated neologism I have been hearing more and more lately is "the digital humanities." When I first heard someone describe himself as a "digital historian," the idea that this would be the best way to describe a historian who had built a website seemed both pretentious and oddly outdated. Since then, however, a wave of enthusiasm for "the digital" has swept through humanities departments nationwide.

According to Matthew Kirschenbaum, the term "digital humanities" was first devised at the University of Virginia back in 2001 as the name for a mooted graduate degree program. Those who came up with it wanted something more exciting than "humanities computing" and broader than "digital media," two established alternatives. It spread widely through the Blackwell Companion to Digital Humanities, issued in 2004. As Kirschenbaum noted, the reasons behind the term's spread have "primarily to do with marketing and uptake," and it is "wielded instrumentally" by those seeking to further their own careers and intellectual agendas. In this, humanists are not so different from Worthington back in the 1950s, or Negroponte and his fellow "digerati" in the 1990s, though it is a little incongruous that they appropriated "the digital" just as he was growing tired of it.

The digital humanities movement is a push to apply the tools and methods of computing to the subject matter of the humanities. I can see why young humanists trained in disciplines troubled by falling student numbers, a perceived loss of relevance, and the sometimes alienating hangover of postmodernism might find something liberating and empowering in the tangible satisfaction of making a machine do something. Self-proclaimed digital humanists have appreciably less terrible prospects for employment and grant funding than the fusty analog variety. As Marge Simpson wisely cautioned, "don't make fun of grad students. They just made a terrible life choice."

It is not clear exactly what makes a humanist digital. My sense is the boundary shifts over time, as one would have to be using computers to do something that most of one’s colleagues did not know how to do. Using email or a word processing program would not qualify, and having a homepage will no longer cut it. Installing a Web content management system would probably still do it, and anything involving programming or scripting definitely would. In fact, digital humanists have themselves been arguing over whether a humanist has to code to be digital, or if writing and thinking about technology would be enough. This has been framed by some as a dispute between the virtuous modern impulse to "hack" and the ineffectual traditional humanities practice of "yack."

As someone who made a deliberate (and economically rather perverse) choice to shift from computer science to the history of technology after earning my first master's degree, I find this glorification of technological tools a little disturbing. What attracted me to the humanities in the first place was the promise of an intellectual place where one could understand technology in a broader social and historical context, stepping back from the culture of computer enthusiasm that valued coding over contemplating and technological means over human ends.

There is a sense in which historians of information technology work at the intersection of computing and the humanities. Certainly we have attempted, with rather less success, to interest humanists in computing as an area of study. Yet our aim is, in a sense, the opposite of the digital humanists': we seek to apply the tools and methods of the humanities to the subject of computing (a goal shared with newer fields such as "platform studies" and "critical code studies"). The humanities, with their broad intellectual perspective and critical sensibility, can help us see beyond the latest fads and think more deeply about the role of technology in the modern world. Social historians have done a great job examining the history of ideas like "freedom" and "progress," which have been claimed and shaped in different ways by different groups over time. Over the past 60 years, ideas like "information" and "digital" have been similarly powerful, and they deserve similar scrutiny. If I were a "digital historian," whose own professional identity and career prospects came from evangelizing for "the digital," could I still do that work?

There are many ways in which new software tools can contribute to teaching, research, and dissemination across disciplines, but my suspicion is that the allure of "digital humanist" as an identity will fade over time. It encompasses every area of computer use (from text mining to 3D world building) across every humanities discipline (from literary theory to classics). I can see users of the same tools in different disciplines finding an enduring connection, and likewise users of different tools in the same discipline. But the tools most useful to a particular discipline, for example the manipulation of large text databases by historians, will surely become part of the familiar scholarly tool set, just as checking a bank balance online no longer feels like a trip into cyberspace. Then we will recognize, to adapt the words of Latour, that nobody has ever been digital and there has never been a digital world. Or, for that matter, a digital humanist.


Further Reading

Gold, M.K., Ed.
Debates in the Digital Humanities, University of Minnesota Press, 2012. Also at http://dhdebates.gc.cuny.edu/. Broad coverage of the digital humanities movement, including its history, the "hack vs. yack" debate, and discussion of the tension between technological enthusiasm and critical thinking.

Gibson, W.
Distrust that Particular Flavor, Putnam, 2012. A collection of Gibson’s essays and nonfiction, including his thoughts on our obsession with the future.

Latour, B.
Science in Action: How to Follow Scientists and Engineers through Society. Harvard University Press, 1987 and B. Latour and S. Woolgar, Laboratory Life: The Construction of Scientific Facts. Princeton University Press, 1986. We Have Never Been Modern is not the gentlest introduction to Latour, so I suggest starting with one of these clearly written and provocative studies of the social practices of technoscience.

Marvin, C.
When Old Technologies Were New: Thinking About Electric Communication in the Late Nineteenth Century. Oxford University Press, 1988. The hopes and fears attributed to telephones and electrical light when they were new provide a startlingly close parallel with the more recent discourse around computer technology.

Morozov, E.
To Save Everything, Click Here, Perseus, 2013. A "digital heretic" argues with zest against the idea of the Internet as a coherent thing marking a rupture with the past.

Winner, L.
The Whale and the Reactor: A Search for Limits in an Age of High Technology. University of Chicago Press, 1986. A classic work in the philosophy of technology, including a chapter "Mythinformation" probing the concept of the "computer revolution."

References

    1. Berkeley, E.C. Giant Brains or Machines That Think. Wiley, NY, 1949.

    2. Flichy, P. The Internet Imaginaire. MIT Press, Cambridge, MA, 2007.

    3. Haigh, T. Software and souls; Programs and packages. Commun. ACM 56, 9 (Sept. 2013), 31–34.

    4. Hecht, G. Rupture-talk in the nuclear age: Conjugating colonial power in Africa. Social Studies of Science 32, 6 (Dec. 2002).

    5. Kling, R. Learning about information technologies and social change: The contribution of social informatics. The Information Society 16, 3 (July–Sept. 2000), 217–232.

    6. Latour, B. We Have Never Been Modern. Harvard University Press, Cambridge, MA, 1993.

    7. Negroponte, N. Beyond digital. Wired 6, 12 (Dec. 1998).

    8. Turner, F. From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. University of Chicago Press, Chicago, 2006.

Footnotes

    a. W.B. Worthington. "Application of Electronics to Administrative Systems," Systems and Procedures Quarterly 4, 1 (Feb. 1953), 8–14. Quoted in T. Haigh, "The Chromium-Plated Tabulator: Institutionalizing an Electronic Revolution, 1954–1958," IEEE Annals of the History of Computing 23, 4 (Oct.–Dec. 2001), 75–104.

    b. A phenomenon I explore in more detail in B. Batiz-Lazo, T. Haigh, and D. Stearns, "How the Future Shaped the Past: The Case of the Cashless Society," Enterprise and Society, 36, 1 (Mar. 2014), 4–17.

    c. The sentiment is Gibson's, although there is no record of him using those specific words until after they had become an aphorism. See http://quoteinvestigator.com/2012/01/24/future-has-arrived/.

    d. Gibson himself appreciates this, as I have discussed elsewhere: T. Haigh, "Technology's Other Storytellers: Science Fiction as History of Technology," in Science Fiction and Computing: Essays on Interlinked Domains, D.L. Ferro and E.G. Swedin, Eds., McFarland, Jefferson, N.C., 2011, 13–37.
