How important are skills in computational thinking for computing app constructors and for computing users in general? If we can teach our children early on to smile, talk, write, read, and count through frequent and repetitive use of patterns in well-chosen examples, is it also possible for us, assuming we have the skills, to teach our children to construct computing applications? Do we first need to teach them anything about computational thinking before we set out to teach them how to construct computing apps? If not, how important will computational skills be for us all, as Jeannette Wing suggests in her blog@cacm post "Computational Thinking, 10 Years Later" (Mar. 2016, reprinted on p. 10) and Viewpoint column "Computational Thinking" (Mar. 2006)?
Many competent and successful computing app constructors and users never hear a word about computational thinking but still manage to acquire sufficient construction and user skills through frequent and repetitive use of patterns in well-chosen examples. Can computing app constructors learn their skills from frequent and repetitive use of the patterns in well-chosen examples without computational thinking? Current results, as reflected in app stores, seem to show they can.
What specific skills define computational thinking? Do constructors learn them only through frequent and repetitive use of patterns in well-chosen examples? Time will tell. Does this mean skills in computational thinking are of little or no value? Not at all. But it does mean they need to be learned the same way skills in smiling, talking, writing, reading, and counting are learned: at the appropriate time.
What if experts first had to demonstrate their own practical skills in computational thinking before they could promote the theoretical virtues of such disciplines? Would that avoid the danger of the less computationally savvy among us being led astray by well-intentioned fervor?
What if Jeannette Wing published the steps she uses to identify, specify, design, and code, say, a phone or tablet app to display an album of photos and supporting text? Who would benefit most? Would it be only Microsoft employees or many more of us constructors and users out in the world? Would we even learn the skills of computational thinking?
All practice and no theory? No, practice and theory. But practice first please, then theory.
Fred Botha, Sun City West, AZ
I commend Kate Matsudaira's wonderful article "Delegation as Art" (May 2016), with its spot-on guidance for addressing challenges in mentoring and delegating our co-workers and students; like Matsudaira, I love to imagine my teammates asking themselves, "What would Geoff say?" Matsudaira also did a superb job explaining her suggestions in ways that make them applicable to disciplines well beyond engineering.
However, one key challenge in managing software engineers Matsudaira did not address is that senior engineers are often expected to mentor team members while simultaneously being responsible for delivering the very projects on which the mentees are working. Matsudaira did say mentoring and delegation require letting people find their own way and make mistakes, even as project success is often measured by speed of delivery and perfection. As a result, mentoring success sometimes comes at the expense of project success, and vice versa.
It would benefit us all if our managers and project managers had a better understanding of the value and process of mentoring. To this end, I will be sharing the article with fellow leaders in my organization and recommend you share it with yours as well.
Geoffrey A. Lowney, Issaquah, WA
Computer science as a discipline has been remarkably bad at marketing itself. With current record enrollments in American universities, this is not a problem, but consider how it might play out in an uncertain future. People who use computers, cellphones, tablets, and automated highway toll payment devices, as well as multiple websites and services, on a daily basis lack a clear idea of what computer scientists actually do, or that they are indeed professionals, like lawyers, accountants, and medical doctors.
Parents concerned about the earning potential of their children and of their children's future spouses forever try to address the conundrum of what academic path to take, as in, say, medical school, law school, or business school.
I thus propose a simple semantic change, and the far-reaching organizational change it implies. Master's programs in computer science shall henceforth graduate "computists" and rebrand themselves as "computing schools." Future parents and parents-in-law should be able to choose among, say, medical school, computing school, business school, and law school. Existing Master's curricula in computer science would be extended to five or six semesters, with mandatory courses in all areas of applied computing, from the perspectives of building systems and selecting, evaluating, adapting, and applying existing systems in all facets of computing, based on scientific methods and precise metrics.
When this basic framework is established, computing schools would stop accepting the Graduate Record Examinations (http://www.ets.org/gre) and switch to their own dedicated entrance exam, in the same way the LSAT is used for law school and the MCAT for medical school. The conferred degree would be called, say, Professional Master of Computing.
It might take a decade or more to change the perception of the American public, but such change is essential and should be embraced as quickly as possible and on a national basis. ACM is best positioned and able to provide the leadership needed to move this important step forward for the overall discipline of computer science.
James Geller, Newark, NJ
I would like to congratulate Carlos Baquero and Nuno Preguiça for their clear writing and the good examples they included in their article "Why Logical Clocks Are Easy" (Apr. 2016), especially on a subject that is not easily explained. I should say the subject of the article is quite far from my usual area of research, which is, today, formal methods in security. Still, we should reflect on Baquero and Preguiça's extensive use of the concept of "causality." That concept has been used in science since ancient Greece, where it was developed by the atomists, then further, to a great extent, by Aristotle, through whose writings it reached the modern world.
The concept of causality was criticized by David Hume in the 18th century. Commenting on Hume, Bertrand Russell (in his 1945 book A History of Western Philosophy) said, "It appears that simple rules of the form 'A causes B' are never to be admitted in science, except as crude suggestions in early stages." Much of modern science is built on powerful equations from which many causal relationships can be derived, the implication being the latter are only explanations or illustrations for the relationships expressed by the former.
Causal laws are not used in several well-developed areas of computer science, notably complexity theory and formal semantics. In them, researchers write equations or other mathematical or logical expressions. At one point in their article, Baquero and Preguiça redefined causality in terms of set inclusion. Leslie Lamport's classic 1978 paper "Time, Clocks, and the Ordering of Events in a Distributed System" (cited by Baquero and Preguiça) seems to use the concept of causality for explanations, rather than for the core theory. In several papers, Joseph Y. Halpern and Judea Pearl have developed the concept of causality in artificial intelligence, but their motivations and examples suggest application to "early stage" research, just as Bertrand Russell wrote.
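The set-inclusion reading of causality that Baquero and Preguiça describe can be made concrete with vector clocks: one event causally precedes another exactly when the set of events known at the first is contained in the set known at the second, which reduces to a pointwise comparison of per-process counters. A minimal sketch, assuming three processes and using function names of my own (not the article's):

```python
def happened_before(a, b):
    """True iff clock a causally precedes clock b: pointwise <=, and not equal."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def merge(a, b):
    """Pointwise maximum: the receiver's clock after delivering a message."""
    return [max(x, y) for x, y in zip(a, b)]

# Three processes; each increments its own entry on a local event.
a = [1, 0, 0]            # event at process 0
b = merge(a, [0, 0, 0])  # process 1 receives process 0's message...
b[1] += 1                # ...then performs a local event: [1, 1, 0]
c = [0, 0, 1]            # independent event at process 2

assert happened_before(a, b)      # a causally precedes b
assert not happened_before(a, c)  # neither precedes the other:
assert not happened_before(c, a)  # a and c are concurrent
```

The comparison is purely a relation between the recorded sets of known events; no causal "law" enters the definition, which is perhaps the point.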
I submitted this letter mainly to prompt thinking on what role the causality concept should play in the progress of various areas of computer science. Today, it is used in computer systems, software engineering, and artificial intelligence, among others, probably. Should we thus aim for its progressive elimination, leaving it a role only in the area of explanations and early-stage intuitions?
Luigi Logrippo, Ottawa-Gatineau, Canada
Communications welcomes your opinion. To submit a Letter to the Editor, please limit yourself to 500 words or fewer, and send to email@example.com.
©2016 ACM 0001-0782/16/07
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or fee. Request permission to publish from firstname.lastname@example.org or fax (212) 869-0481.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2016 ACM, Inc.