In the discussion of recent decades concerning the relationship between human cognition and machine computation, one perspective predominates—starting from the point of view of the computation, we probe the possible applications to cognition. That is, we describe cognition in computational terms. My own view, outlined in previous blog posts, is that computation fashions artifacts that the philosophy of computer science can view in terms of human affairs and humanities perspectives. Hence, I interrogate the ontology of algorithms [Hill 2016a], ask whether nature uses data [Hill 2016b], and ask whether fiction is model theory [Hill 2016c].
Recently, in Communications, we've seen a couple of articles that elaborate the human-computer analogy from this latter point of view, sketching a computing phenomenon (narrower than computation itself) and then interpreting it in quotidian human affairs, as literally as possible. While these ideas are not new, the way these works pose the concepts aligns with my view—for which the authors cannot be held responsible, of course.
Peter Denning, in the September 2017 "Multitasking Without Thrashing," gives suggestions for more efficient human workflow [Denning 2017]. Thrashing occurs when there are "too many tasks in progress at the same time," so that the effort of recalling to mind the state of a suspended task exceeds the progress made on that task before the next suspension.
Alvaro Videla, in the October 2017 "Metaphors We Compute By," suggests that programmers should select appropriate metaphors for meaning amplification, that is, for clear explanation and understanding [Videla 2017]. Among the examples he gives are paths through a graph as options for directions on a map, and distribution of data across a hierarchy of processors as gossip, or better yet, as epidemic.
Such comparisons are engaging and entertaining, and may be enlightening. Many programmers will have thought of examples.
To Nest or Not
When someone drops into Alice's office, and then she gets a text message, and then someone calls on her office phone, she is performing LIFO processing if each successive interruption puts the outstanding ones in a nested wait status. She is likely to take care of the phone call, then the text, and then return to the visitor. But she could impose a different and perhaps more courteous regimen, FIFO, in which she attends to the visitor, then checks the text message, and then returns the missed phone call. (The authors cited above mention this metaphor.)
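Alice's two disciplines for handling interruptions can be sketched as the familiar stack and queue. The interruption names below are illustrative, taken from the scenario above; a minimal sketch:

```python
from collections import deque

# Interruptions in order of arrival: the visitor, then the text, then the call.
interruptions = ["visitor", "text message", "phone call"]

# LIFO: each new interruption suspends the outstanding ones (a stack).
stack = list(interruptions)
lifo_order = []
while stack:
    lifo_order.append(stack.pop())  # most recent interruption handled first

# FIFO: interruptions wait their turn (a queue).
queue = deque(interruptions)
fifo_order = []
while queue:
    fifo_order.append(queue.popleft())  # earliest interruption handled first

print(lifo_order)  # ['phone call', 'text message', 'visitor']
print(fifo_order)  # ['visitor', 'text message', 'phone call']
```

The LIFO order matches Alice's nested handling; the FIFO order matches the more courteous regimen.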
When Bob tells his assistant where to get refreshments for the meeting, he may choose among several methods to pass the location parameter. If he specifies a bakery, he is passing by value; if he hands her a coupon that he got from a doughnut shop, he is passing by reference; if he sends her to the secretary for a recommendation, he is passing by name.
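Bob's three methods can be rendered as a sketch, with pass-by-name modeled in the usual way as a thunk re-evaluated at each use. All names here are hypothetical, invented for illustration:

```python
def by_value(location: str) -> str:
    # The caller hands over a fixed copy; nothing done here affects the caller.
    return f"going to {location}"

def by_reference(coupon: dict) -> None:
    # The caller shares the object itself; mutations are visible to the caller.
    coupon["redeemed"] = True

def by_name(get_location) -> str:
    # The argument is a thunk, re-evaluated each time it is used,
    # like asking the secretary for a fresh recommendation.
    return f"going to {get_location()}"

print(by_value("bakery"))  # going to bakery

coupon = {"shop": "doughnut shop", "redeemed": False}
by_reference(coupon)
print(coupon["redeemed"])  # True: the shared coupon was changed

recommendation = {"place": "bakery"}
thunk = lambda: recommendation["place"]
print(by_name(thunk))  # going to bakery
recommendation["place"] = "deli"  # the secretary changes her mind
print(by_name(thunk))  # going to deli
```

The last two lines show what distinguishes pass-by-name: the destination is looked up anew on every errand.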
An abstract data type is a job description, consisting of structured resources and tasks to perform with them. A greedy algorithm for putting leftover foods in containers may be easy but might not get everything stored. Someone putting receipts in chronological order might switch between Insertion Sort and Selection Sort.
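The leftovers example can be sketched as a first-fit greedy heuristic: place each item in the first container with room, never rearranging. The sizes and capacities below are invented for illustration:

```python
def greedy_pack(items, capacities):
    """First-fit greedy packing: easy, but may strand items."""
    remaining = list(capacities)
    placed, unplaced = [], []
    for size in items:
        for i, cap in enumerate(remaining):
            if size <= cap:
                remaining[i] -= size      # tuck the item into this container
                placed.append((size, i))
                break
        else:
            unplaced.append(size)         # no container had room left
    return placed, unplaced

# Two containers of capacity 4 and 6; leftovers of size 3, 3, and 4.
placed, unplaced = greedy_pack([3, 3, 4], [4, 6])
print(placed)    # [(3, 0), (3, 1)]
print(unplaced)  # [4] -- stranded, though 4 alone and 3+3 would have fit
```

The greedy pass leaves the size-4 item out, even though putting it in the first container and both size-3 items in the second stores everything, which is just the shortfall the metaphor describes.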
This is fun. We could go on all day. But what do these instances mean? What are they instances of, again? We find certain human tasks being performed according to algorithms that we had not "surfaced." What does that tell us about the universe, or about computation, or about ourselves? These are the questions of the humanities, and, in particular, of philosophy, which takes an interesting observation and runs with it.
Let's just sketch some possible answers to the question, "What do these instances tell us?" Are they instances of human behavior that models or imitates classic computing applications, or vice versa? And let's duck the overarching question of computation as a cognitive metaphor, staying within the scope of more particular phenomena, the applications mentioned.
- Computer apps came first. Computing techniques inspired our problem-solving methods. We have learned how to do sorting, searching, and memory paging from Turing Machines, in effect. This seems unpersuasive. Semaphores and locks existed before operating systems. Salesmen were travelling long before Travelling Salesman. So computing has simply articulated symbolic problem-solving methods for long-standing tasks that come with ready nomenclature, such as "greedy" and "lock."
- Human acts lead the way. Our own problem-solving methods drive our designs for computing techniques. The symbolic computational paradigm was created (by us) rather than discovered. When Turing, Post, and others developed definitions of computation early in the 20th century, they were following a path already laid down in their heads. This raises the tantalizing but obscure prospect of other paradigms of which we are not aware.
- Something else brings about both. Computing is universal, and Nature, or evolution, has cultivated in us a way of problem-solving based on primordial techniques. The computational paradigm was discovered. This raises the metaphysical question of where, outside of us and our artifacts, these procedures reside.
Is this the exercise taking place? (1) Identify a computational method or trick; (2) extract its forms or predicates; (3) carry those forms and predicates to daily life; (4) pass them over human acts until the right interface appears, and plug them in—there it is, nice and square. For alternative accounts, we can look to the various theories that challenge this procedural view. The phenomenological treatment of metaphor [Theodorou] holds that metaphor and other devices ground meaning well outside of discrete semantics, and are not subject to analysis into forms, predicates, or propositional claims. All of this merely hints at the active philosophical discussions underway on the ontology of abstractions, the epistemology of symbols, and other questions.
[Denning 2017] Denning, Peter. 2017. Multitasking Without Thrashing. Communications. 60:9, September 2017.
[Hill 2016a] Hill, Robin K. 2016. What an Algorithm Is. Philosophy & Technology 29:1. DOI:10.1007/s13347-014-0184-5.
[Hill 2016b] Hill, Robin K. 2016. Does Nature Use Data? July 18, 2016. Blog@CACM.
[Hill 2016c] Hill, Robin K. 2016. Fiction as Model Theory. December 30, 2016. Blog@CACM.
[Theodorou] Theodorou, S. Metaphor and Phenomenology. Internet Encyclopedia of Philosophy. Access date 18 November 2017. ISSN 2161-0002.
[Videla 2017] Videla, Alvaro. 2017. Metaphors We Compute By. Communications. 60:10, October 2017.
Robin K. Hill is adjunct professor in the Department of Philosophy, and in the Wyoming Institute for Humanities Research, of the University of Wyoming. She has been a member of ACM since 1978.