
Why Logical Clocks Are Easy

By Carlos Baquero, Nuno Preguiça

Communications of the ACM, Vol. 59 No. 4, Pages 43-47



Any computing system can be described as executing sequences of actions, with an action being any relevant change in the state of the system. For example, reading a file into memory, modifying the contents of the file in memory, or writing the new contents to the file are relevant actions for a text editor. In a distributed system, actions execute in multiple locations; in this context, actions are often called events. Examples of events in distributed systems include sending or receiving messages, or changing some state in a node. Not all events are related, but some events can cause and influence how other, later events occur. For example, a reply to a received email message is influenced by that message, and possibly by earlier messages received.

Events in a distributed system can occur close together, with different processes running on the same machine, for example; at nodes inside a datacenter; geographically spread across the globe; or even at a larger scale in the near future. The relations of potential cause and effect between events are fundamental to the design of distributed algorithms. These days hardly any service can claim not to have some form of distributed algorithm at its core.
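The potential cause-and-effect relation described above is Lamport's "happened before" order, and the simplest mechanism that tracks it is a Lamport logical clock. The sketch below is illustrative, not code from the article; the class and method names are assumptions for this example. Each node keeps a counter, increments it on every local event, stamps outgoing messages with it, and on receipt advances past the incoming timestamp, so that if event a can influence event b, then clock(a) < clock(b).

```python
class LamportClock:
    """A minimal sketch of one node's Lamport logical clock."""

    def __init__(self):
        self.time = 0

    def local_event(self):
        # Any internal event increments the local counter.
        self.time += 1
        return self.time

    def send_event(self):
        # A send is itself an event; its timestamp travels with the message.
        self.time += 1
        return self.time

    def receive_event(self, msg_time):
        # On receive, advance past both the local clock and the message's
        # timestamp, so the receive is ordered after the matching send.
        self.time = max(self.time, msg_time) + 1
        return self.time


# Two nodes: a sends a message, b receives it.
a, b = LamportClock(), LamportClock()
t_send = a.send_event()            # timestamp 1 on node a
t_recv = b.receive_event(t_send)   # timestamp 2 on node b
assert t_send < t_recv
```

Note the one-way guarantee: happened-before implies a smaller timestamp, but a smaller timestamp does not imply happened-before; capturing the converse as well is what vector clocks add.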


CACM Administrator

The following letter was published in the Letters to the Editor of the July 2016 CACM (http://cacm.acm.org/magazines/2016/7/204035).
--CACM Administrator

I would like to congratulate Carlos Baquero and Nuno Preguiça for their clear writing and the good examples they included in their article "Why Logical Clocks Are Easy" (Apr. 2016), especially on a subject that is not easily explained. I should say the subject of the article is quite far from my usual area of research, which is, today, formal methods in security. Still, we should reflect on Baquero and Preguiça's extensive use of the concept of "causality." That concept has been used in science since ancient Greece, where it was developed by the atomists, then further, to a great extent, by Aristotle, through whose writings it reached the modern world.

The concept of causality was criticized by David Hume in the 18th century. Commenting on Hume, Bertrand Russell (in his 1945 book A History of Western Philosophy) said, "It appears that simple rules of the form 'A causes B' are never to be admitted in science, except as crude suggestions in early stages." Much of modern science is built on powerful equations from which many causal relationships can be derived, the implication being the latter are only explanations or illustrations for the relationships expressed by the former.

Causal laws are not used in several well-developed areas of computer science, notably complexity theory and formal semantics. In them, researchers write equations or other mathematical or logical expressions. At one point in their article, Baquero and Preguiça redefined causality in terms of set inclusion. Leslie Lamport's classic 1978 paper "Time, Clocks, and the Ordering of Events in a Distributed System" (cited by Baquero and Preguiça) seems to use the concept of causality for explanations, rather than for the core theory. In several papers, Joseph Y. Halpern and Judea Pearl have developed the concept of causality in artificial intelligence, but their motivations and examples suggest application to "early stage" research, just as Bertrand Russell wrote.

I submitted this letter mainly to prompt thinking on what role the causality concept should play in the progress of various areas of computer science. Today, it is used in computer systems, software engineering, and artificial intelligence, among other areas. Should we thus aim for its progressive elimination, leaving it a role only in the area of explanations and early-stage intuitions?

Luigi Logrippo
Ottawa-Gatineau, Canada
