
Computing For Humans

By Moshe Y. Vardi, Communications Editor-in-Chief

Gottfried Wilhelm Leibniz (1646–1716) has been called the "Patron Saint of Computing." Leibniz is most famous for the development—independently of Isaac Newton—of the infinitesimal calculus. (In fact, it is his mathematical notation we use today, rather than Newton’s.) He was also a prolific inventor of mechanical calculators and developed the binary number system.

Leibniz conceived of a universal mathematical language, lingua characteristica universalis, in which all human knowledge can be expressed, and calculational rules, calculus ratiocinator, carried out by machines to derive all logical relationships. Leibniz’s goal was nothing short of prophetic: "Once the characteristic numbers are established for most concepts, mankind will then possess a new instrument that will enhance the capabilities of the mind to a far greater extent than optical instruments strengthen the eyes."

This definition of computing, as an "instrument for the human mind," captures, I believe, the essence of our field. On one hand, our discipline is a technical one, focusing on hardware, software, and their theoretical foundations. On the other hand, the artifacts we build are meant to enhance the human mind. This duality of our field is witnessed by the two pioneers we lost last October: Steve Jobs and Dennis Ritchie, who passed away within a week of each other.

Dennis MacAlistair Ritchie (September 9, 1941–October 12, 2011) was the techies’ techie, as the creator of the C programming language and the codeveloper of the Unix operating system. The C language paved the way for C++ and Java, while Unix was the basis for many of today’s most widely used operating systems. Before the development of C and Unix, programming—especially systems programming—was tightly connected to the underlying hardware. C and Unix, in contrast, were highly portable. The citation for the 1983 Turing Award that Ritchie received together with Ken Thompson refers succinctly to "their development of generic operating systems theory." There is no computer scientist who is not familiar with C and Unix, but it is unlikely your cousin has heard of them, unless she is also a computer scientist. Undoubtedly, however, your cousin is familiar with Steve Jobs.

Steven Paul "Steve" Jobs (February 24, 1955–October 5, 2011) was the cofounder of Apple and the founder of NeXT and Pixar. His death received a tremendous amount of worldwide news coverage and is addressed by three articles in this issue of Communications. It is hard to think of anyone in recent memory whose passing received so much global attention. This level of interest is by itself worthy of observation. As Jaron Lanier makes clear in his essay, "The Most Ancient Marketing," Jobs was very much not an engineer. In fact, the title of one of the many essays published in the wake of Jobs’ death is "Why Jobs Is No Edison." Yet, it is difficult to point to anyone who had as much impact on computing over the last 30 years as Jobs. In fact, as Genevieve Bell points out in her essay, "Life, Death, and the iPad: Cultural Symbols and Steve Jobs," his impact goes beyond the world of computing, well into the realm of culture. (For a discussion of Jobs’ business strategy, see Michael A. Cusumano’s "Technology Strategy and Management" column.)

Undoubtedly, Jobs’ uniqueness was his relentless and singular focus on the human side of computing. To start with, the Apple I and Apple II were personal computers, and the Mac’s claim to fame was its user interface. The sequence of products that revolutionized computing over the past 10 years, the iPod, iPhone, and iPad, was unique in its focus on user experience. In fact, the very term "user experience" was coined at Apple in the mid-1990s, and the success of Apple’s products over the past decade has made it quite fashionable.

Yet the user has not always been and is probably still not at the center of our discipline. A quick perusal of ACM’s Special Interest Groups shows that their general focus tends to be quite technical. In fact, one often encounters among computing professionals an attitude that regards the field of human-computer interaction as "soft," implying it is less worthy than the "harder" technical areas. In my own technical areas, databases and formal methods, I almost never encounter papers that pay attention to usability issues.

The almost simultaneous departure of Jobs and Ritchie should remind us of the fundamental duality of computing. As Leibniz prophesied, computing is "an instrument for the human mind." Let us keep the human at the center!

Moshe Y. Vardi, EDITOR-IN-CHIEF
