We’ve all contributed to the crisis in computer science described in Maria Klawe and Ben Shneiderman’s "Viewpoint" ("Crisis and Opportunity in Computer Science," Nov. 2005). Instead of pushing the CS frontier with new discovery, we’ve acquiesced, becoming a "mature" science while turning the applied-science productivity crank.
For 50 years, we’ve built programs that put legions of telephone operators, drafters, clerks, and eventually managers out of work. We must have believed we ourselves were bulletproof, even as we built tools to enhance coder productivity that reduced demand for our skills. We packaged our experience into reusable libraries and published methodologies that devalued that experience. We simplified programming to remove barriers to entry for low-skilled and newly trained workers in low-wage countries. None of us should be surprised at the result.
Attracting more students to CS won’t restore the demand we’ve killed through the relentless pursuit of productivity. Increasing the supply of devalued labor and embedding CS into other disciplines only worsens the situation. To the extent we succeed in distilling a four-year curriculum into a few courses, we devalue our skills and experience further still.
To escape the crisis, we must resist the notion that all things computational have been discovered, doing original new research that expands our science and increases demand for our work. We must apply computation in new ways to support, rather than replace, non-routine and creative thought, so endeavors that were once too complex, abstract, or uncertain become tractable. We must lift knowledge workers, including ourselves, to higher levels of performance at the same time we nibble away at their (and our) routine tasks. If we fail, automation will overtake us, too. We will make human endeavor obsolete, without removing the need to eat.
Kurt Guntheroth
Seattle, WA
Maria Klawe and Ben Shneiderman (Nov. 2005) addressed the crisis in CS by proposing to broaden the field’s role into other areas. They could have made their argument more interesting by mentioning Edsger Dijkstra’s idea of "radical novelty" and explaining why CS must not remain the same type of mental activity it was four decades ago.
Historical examples of radical novelty include: constructing digital computers (von Neumann architecture); discovering abstract machinery with both textual and mechanical states (algorithms/programs); impersonating an abstract machine with another abstract machine (procedural/recursive calls); multiplexing processors (time sharing/concurrency); adopting context-free grammar to describe programming languages (Algol 60); and computer programming as an art (literate programming). The result is not only practical knowledge and social and economic benefit but a new way of thinking about the field.
To address the crisis, we must look beyond CS even as we seek the answer deeper within it.
Andy K.Y. Poon
Toronto, Canada
Testing Wiki Credibility
The "Inside Risks" column "Wikipedia Risks" by Peter Denning et al. (Dec. 2005) was off base. Even the best-edited and most-authoritative encyclopedia includes errors. The proper test of Wikipedia content is how it is used over time. One problem with the suggested improvement to the Wiki process—including more formal content and expert review procedures—is at least as old as Imperial Rome where people asked quis custodes ipsos custodes?, or "who watches the watchmen?" The current Wiki approach to adding content is based on the dual foundations of caveat emptor, or "let the buyer beware," and the Oracle at Delphi. Bravo Wikipedia.
John Leary
Roanoke, VA
No One Programs from Scratch
In his "Viewpoint" ("Academic Dishonesty and the Internet," Oct. 2005), Kenneth A. Ross did not fully distinguish the idea of dishonesty from the idea of the efficient accomplishment of some objective. For most commercial purposes, we would rather hire someone who might contract out software development, producing the result more quickly and more professionally.
Ross defined the idea of disobedience but not of dishonesty, yet punished disobedience as if it were dishonesty. The fact that his academic department had to update its academic honesty policy represents strong evidence that its prior policy had not been violated.
Instead of making their policies more stringent, academic departments must weigh the realities of programming, including that none of us programs anything from scratch; we cut and paste everything together from the available parts. Calling that dishonest would lead to calling the entire software industry dishonest. Calling it dishonest might even be viewed as failing to prepare students for jobs in the modern software industry.
Mike Brenner
Piscataway, NJ
Emphasize Positive Feedback in Personalization
Extending the lucid review of personalization techniques in Gediminas Adomavicius and Alexander Tuzhilin’s "Personalization Techniques: A Process-Oriented Perspective" (Oct. 2005), I’d like to say that personalization is a social servo-mechanism. The sensed inputs, control strategy, and consequent environmental modification—in this case personal and group behavior—all have social meaning regarding interactions among humans (and among machines).
Though Adomavicius and Tuzhilin concentrated on economic consumption, personalization techniques apply to any social setting. For example, a political advocacy portal might use them to more effectively deliver information (or propaganda). In a related example, a university computer science department might employ graduate students to build an online personalized recruiting system. Similar comments apply to interest groups and religions, even cults like us Firefly fanatics.
Surprisingly, the social servo-mechanism model applies even to the most traditional of data processing applications; for example, payroll and billing systems form feedback loops cycle by cycle, manipulated by management to encourage or discourage certain behaviors.
The so-called virtuous cycle is simply a positive feedback control strategy where every output is intended to stimulate more of the same in the environment. Using personalization to maintain market share might follow a negative feedback control strategy, where each output aims to offset drift in the environment away from the desired state (such as selectively drawing customer attention to certain products), a fine strategy for, say, a site selling wine.
Personalization itself becomes one more form of "transfer function." Positive feedback in conventional control theory is viewed with suspicion because it leads in the limit to an out-of-control situation (such as the screeching noise of feedback in a public address system). Much of what appears to be positive feedback in social systems is only the initial strategy for achieving a desired state; conventional negative feedback then kicks in to keep the system stable—a so-called mature market.
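The contrast between the two strategies can be made concrete. The following is a minimal sketch, with invented gains and setpoints, of how a positive feedback rule amplifies whatever it senses while a negative feedback rule steers the system back toward a desired state—the "mature market" settling point described above.

```python
# Illustrative sketch only: gains, setpoints, and step counts are invented.

def positive_feedback(state: float, gain: float = 0.3) -> float:
    """Each output stimulates more of the same: the state grows without bound."""
    return state + gain * state

def negative_feedback(state: float, setpoint: float = 100.0, gain: float = 0.3) -> float:
    """Each output offsets drift away from the desired state (the setpoint)."""
    return state + gain * (setpoint - state)

if __name__ == "__main__":
    s_pos, s_neg = 10.0, 10.0
    for step in range(10):
        s_pos = positive_feedback(s_pos)   # runs away, like PA-system screech
        s_neg = negative_feedback(s_neg)   # settles toward the setpoint
        print(f"step {step}: positive={s_pos:8.1f}  negative={s_neg:6.1f}")
```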
Meanwhile, Adomavicius and Tuzhilin failed to address what may be the single most important aspect of personalization—security. If personalization encompasses the rules that adjust system behavior to match the user, then surely they must include the subset of rules dedicated to what that user may or may not do, even what that user may or may not see. This subset defines the security rules for that person.
Our security models fail to achieve the appropriate level of personalized empowerment, particularly on the Internet. For example, why can just anybody deliver email to my inbox simply by knowing my address? A small change in the Internet email architecture from "default permit" of delivery to "default deny" would virtually eliminate the spam epidemic. The fault, in no small way, lies with the Access Control List security model.
A proactive security strategy using default-deny personalization is the only obvious way out of the brier patch of today’s Internet. It requires top-notch personalization.
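To illustrate what default-deny personalization of email delivery might look like, here is a minimal sketch; the addresses and the allow-list are invented for illustration and do not describe any real mail system.

```python
# Hypothetical default-deny inbox: delivery succeeds only if the recipient
# has personally authorized the sender; everything else is refused.

ALLOWED_SENDERS = {"alice@example.org", "billing@example.com"}  # invented allow-list

def accept(sender: str, allowed: set[str] = ALLOWED_SENDERS) -> bool:
    """Default deny: unknown senders are refused rather than delivered."""
    return sender in allowed

for sender in ("alice@example.org", "spammer@bulkmail.example"):
    verdict = "delivered" if accept(sender) else "refused"
    print(f"{sender}: {verdict}")
```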
Robert J. DuWors
Bellevue, WA
Authors Respond:
Personalization is not limited to economic contexts, as DuWors correctly points out. We illustrated our discussion with e-commerce examples, but our framework is general enough that it can be applied to other personalization contexts.
Control theory, which still needs to be explored in the personalization literature, further reinforces the idea that personalization should be viewed as an iterative process with integrated feedback mechanisms. However, the virtuous cycle is not limited to positive-feedback control strategies. The purpose of the feedback loop is to optimize the personalization strategy in light of the selected measure(s). Therefore, control strategies may involve positive or negative feedback, depending on the specific signals provided by these measures.
Finally, while we agree that security is an important issue in personalization, it is orthogonal to the subject of our article, because the need for the personalization process is not eliminated when even the best security controls are in place.
Gediminas Adomavicius
Minneapolis
Alexander Tuzhilin
New York
Why the Internet Is Good for Democracy
Eli M. Noam’s thesis in "Why the Internet Is Bad for Democracy" (Oct. 2005) was fundamentally false. His reasoning was generally analogous to the rebuttal of a clever child, who, when admonished by his parents to eat his vegetables because they are good for him, ripostes that not all vegetables are healthy all the time.
Noam began by saying that Internet observers often commit the error of composition but failed to provide an example. He then identified the error of inference, the example being that if the Internet is good for democracy in Iran, Libya, and North Korea, it does not necessarily mean it would be better for Denmark, Germany, and the U.S. A moment’s thought leads to the conclusion that in the former a far greater impact would be expected, given that democracy already exists in the latter.
Noam then adopted a scattershot approach, spraying and mixing various arguments. Among them were that more political activists will use the Net for their causes, but so too will more of their rivals, producing stalemate. And that more truth will be available through the Net, but so too will (even) more hogwash, and the truth will be lost. Nonsense. The premises may be true, but the conclusions are false. Finding the truth has always been difficult and will remain so. But the point is that a person who wishes to get at the truth will be able to discover it more quickly with the Net than without it.
Noam then said that email may provide the illusion of access to public officials who tend to respond to numbers, regardless of whether they come in the form of email or postal mail. He followed with the unsubstantiated claim that the Net undermines political parties and stability, even in democracies. He also said that the most stable democracies are characterized by a certain slowness of change, citing Britain. While it is true that Britain is an example of a stable democracy, it is also true that few countries have seen more dramatic changes over the past 100 years than Britain.
None of Noam’s arguments lead to the conclusion that the Internet is bad for democracy. In fact, the opposite is generally true. The Internet is credited by some with, for example, having helped Mikhail Gorbachev survive the failed 1991 Soviet coup and Victor Yushchenko triumph in the 2004 Orange Revolution in Ukraine. Perhaps most telling is the fact that a giant dictatorship like China ruthlessly censors the Net.
Alex Simonelis
Montreal