
Computing Is History

By Thomas J. Misa

Communications of the ACM, Vol. 58 No. 10, Pages 35-37
DOI: 10.1145/2814845



With cloud, big data, supercomputing, and social media, it's clear that computing has an eye on the future. But these days the computing profession also has an unusual engagement with history. Three recent books articulating the core principles or essential nature of computing place the field firmly in history. Purdue University has just published an account of its pioneering effort in computer science.4 Boole, Babbage, and Lovelace are in the news, with bicentennial celebrations in the works. Communications readers have been captivated by a specialist debate over the shape and emphasis of computing's proper history.a And concerning the ACM's role in these vital discussions, our organization is well situated with an active History Committee and full visibility in the arenas that matter.

Perhaps computing's highly visible role in influencing the economy, reshaping national defense and security, and creating an all-embracing virtual reality has prompted some soul searching. Clearly, computing has changed the world—but where has it come from? And where might it be taking us? The tantalizing question of whether computing is best considered a branch of the mathematical sciences, one of the engineering disciplines, or a science in its own right remains unresolved. History moves to center stage in Subrata Dasgupta's It Began with Babbage: The Genesis of Computer Science.1 Dasgupta began his personal engagement with history in conversation with Maurice Wilkes and David Wheeler. Babbage, Lovelace, Hollerith, Zuse, Aiken, Turing, and von Neumann, among others, loom large in his pages.
