In "Designing the Perfect Auction" (Aug. 2008), Hal R. Varian noted that such auctions have many practical and obvious applications, including in Web advertising, cooperative robotics, digital business ecosystems, digital preservation, and network management. Auctions, by means of complementary community currencies, can also radically shift the way we conceive of scientific cooperation. As we advocated in our paper "Selecting Scientific Papers for Publication via Citation Auctions" (IEEE Intelligent Systems, Nov./Dec. 2007), replacing peer review with an auction-based approach would benefit science in general. The better a submitted paper, the more complementary scientific currency its author(s) would likely bid to have it published. If the bid truly reflected the paper’s quality, the author(s) would be rewarded in this new scientific currency; otherwise, the author(s) would lose the currency.
For all scientists, citations are a form of currency available worldwide, unlike the legal national currencies, which are scarce, especially in the third world. Auctions using citations as currency ("citation auctions") would encourage scientists to better control the quality of their submissions, since those who are careless risk being dropped from the system. Scientists would also likely be more motivated to prepare worthwhile talks concerning their accepted papers and invite discussion of their results by their peers. Scientists would also likely focus on fewer papers and market them better. Citation auctions could thus greatly improve scientific research, helping it shift from peer review as the reigning selection method toward a continuously improving process of selection based on auctions.
Calculating the value of a work of art or historical document is clearly difficult, and projecting that value into the future is even more difficult. The same holds when trying to calculate the current and possible future value of a scientific work. In a sort of back-to-basics movement, recalling science in the 18th and 19th centuries, that calculation could now be updated through citation auctions. Peer review would continue, though in its proper place in the scientific production chain—before selection for publication—rather than as the sole selection step.
This distributed-algorithmic mechanism would provide an interesting theoretical framework for incorporating incentives into algorithmic design, with bidding using an uncertain valuation of a work’s quality, senior scientists helping their younger counterparts enter the scientific system, the marketing of scientific work through recommender systems, the avoidance of citation inflation, the creation of banks of citations, and improved auction mechanisms.
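The mechanism described in this letter might be sketched in code as follows. This is a minimal, hypothetical illustration only: the class names, the single-round ranking, and the settlement rule (bid refunded plus earnings if the paper earns at least as many citations as were bid, bid forfeited otherwise) are assumptions for the sketch, not details from the original proposal.

```python
# Hypothetical sketch of a citation auction: authors wager citation
# "currency" on their own submissions. All names and rules here are
# illustrative assumptions, not the authors' actual mechanism.

from dataclasses import dataclass

@dataclass
class Scientist:
    name: str
    citations: int  # citation-currency balance

@dataclass
class Submission:
    author: Scientist
    bid: int  # citations wagered on the paper's quality

def run_auction(submissions, slots):
    """Accept the highest-bidding papers, escrowing each winner's bid."""
    ranked = sorted(submissions, key=lambda s: s.bid, reverse=True)
    accepted = ranked[:slots]
    for sub in accepted:
        sub.author.citations -= sub.bid  # bid is held in escrow
    return accepted

def settle(submission, citations_earned):
    """After publication: if the paper earned at least its bid in
    citations, refund the escrowed bid and credit the earnings;
    otherwise the escrowed bid is forfeited."""
    if citations_earned >= submission.bid:
        submission.author.citations += submission.bid + citations_earned

alice = Scientist("Alice", citations=100)
bob = Scientist("Bob", citations=100)
subs = [Submission(alice, bid=40), Submission(bob, bid=10)]
accepted = run_auction(subs, slots=1)   # only the higher bid is accepted
settle(accepted[0], citations_earned=55)  # bid justified: balance grows
```

In this toy run, the careless bidder risks losing escrowed currency while the well-calibrated bidder is rewarded, which is the incentive the letter argues would improve submission quality.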
Josep L. de la Rosa and Boleslaw K. Szymanski, Troy, NY
Not Only in the U.S.A.
We all know about the internationalization of computer applications, making them easily translatable into a variety of languages, dialects, and currencies. But what about the internationalization of the editorial content of Communications?
The recent redesign (beginning July 2008) prompts me to suggest another change to address something that has been niggling at me for years. Communications articles often seem to assume that all readers are in the U.S. An example is the otherwise excellent "Envisioning the Future of Computing Research" by Ed Lazowska (Aug. 2008) in which Lazowska referred to such institutions as "the National Science Foundation" and "the National Academy of Engineering." A couple of tweaks by an editor would have turned it into "the U.S. National Science Foundation" and "the U.S. National Academy of Engineering," acknowledging that not all readers think of these bodies as their own national institutions. Lazowska also invited participation in the Computing Community Consortium, which is funded by the U.S. NSF, all of whose current council members appear to be based in the U.S. It would be useful to know whether the invitation extends to all ACM members or just to those in the U.S.
"Internationalizing" Communications content would allow all readers to quickly evaluate its articles for personal relevance—yet another benefit from the magazine’s redesign.
Jamie Andrews, London, Ontario, Canada
Moaning About the Dearth of Native Talent
I must take issue with Eric Roberts’s straw-man argument in his "Counterpoint" in the "Viewpoint" "Technology Curriculum for the Early 21st Century" (July 2008). In the real world, Microsoft might hire a candidate from Bangalore, then wait for more candidates from Bangalore, even while whining that there are no qualified candidates in the U.S.
All companies look to control costs, especially fixed ones, even at the expense of short-term return, since, projecting into the future, the marginal return is less likely to stay positive for more highly compensated employees. The desire to control fixed costs also contributes to demand for consultant positions, as they are eliminated more easily.
I know from personal experience how different reality is from the picture Roberts painted. I have no problem with companies trying to find the lowest-cost qualified labor but am disgusted by disingenuous moaning about the dearth of native talent.
Wayne Warren, San Antonio, TX
A Message Even in Knuth’s Typography
I was introduced to Donald E. Knuth’s masterwork The Art of Computer Programming in the late 1980s upon my arrival at college, and while I never fully mastered it, I found it to be a handy tool for accomplishing things that just weren’t possible on the PC-based word processors of the time.
I was struck by the irony of Knuth’s quote "I couldn’t stand to see my books so ugly" while spelling the name of his software "TeX" as he reminisced in the concluding part of Edward Feigenbaum’s interview with him "Donald Knuth: A Life’s Work Interrupted" (Aug. 2008). Microsoft products have evolved to the point where it’s now possible to render the correct spelling—"TEX"—with subscript capital E and condensed, kerned character spacing and still manage to email it intact.
Whenever I stumble across "TeX," I recall being scolded about that spelling in the introduction to its instruction manual.
Michael Pelletier, Merrimack, NH
Include These Programming Voices Too
Though Peter J. Denning’s take on programming in his "Profession of IT" "Viewpoint" "Voices of Computing" (Aug. 2008) hit the mark, I’d like to acknowledge the importance of two other roles (voices):
Maintainer. Tries to understand, correct, and improve the "product," though years later may pay dearly for poorly designed coding and a lack of documentation; and
Operations manager. Ensures everyday user service based on the availability of programs—the point of producing programs in the first place.
Requiring programmers to serve some of their apprenticeship as maintainers would help them understand what is important in the conception, design, implementation, and operation of programs. College and university courses that more accurately reflect all aspects of a program’s lifespan—from conception to decommissioning—would certainly contribute to their professional development.
Brian Kirk, Painswick, U.K.
How to Know When Important Details Are Omitted
Though the issue Mark Guzdial explored in "Paving the Way for Computational Thinking" (Aug. 2008) is important, it carries an equally important caveat. There is no reason to assume, a priori, that every important concept in computation has a natural counterpart in precomputational thinking; some, indeed, do not.
This theme of making thinking about computation more natural has come up many times and, to my knowledge, always carried a tacit assumption that it can be taught in a way that is natural to newcomers. Guzdial’s examples of students’ propensity to omit an *else* clause in conditional statements illustrate the point. This is a case of giving ambiguous instructions and assuming the instruction-follower will correctly infer and carry out the appropriate action.
This cannot be fixed by making computers better at guessing how to resolve ambiguous or incomplete instructions. Developing the skill to recognize when important details are omitted and make them explicit is an indispensable part of computational thinking. Moreover, it is largely a new concept to students and thus not easily made natural to them.
There are, of course, aspects of computational thinking that can be made more natural, and doing so is a valuable goal when achievable. But any such attempt must be guided by constant vigilance about what can and what cannot be made natural. Otherwise, the results degenerate into just dumbing-down the material, making it easier, perhaps, but also misleading.
Rodney M. Bates, Wichita, KS
.HK Danger ‘Under Control’
We were surprised by the McAfee, Inc. research findings reported in the "News" item "Dangerous Web Domains" (Aug. 2008) and would like to add the following:
Old data. McAfee seemed to be describing the situation in 2008 but collected its data in 2007. While it said that 9.9 million Web sites were tested, most of the malicious ones were tested months before and may no longer exist;
New controls. Since March 2007, the Hong Kong Internet Registration Corporation Limited (HKIRC) has worked closely with the Office of the Telecommunications Authority of the Government of the Hong Kong Special Administrative Region, the Hong Kong Police, and the Hong Kong Computer Emergency Response Team Coordination Centre to monitor and control suspicious Web sites using the .hk domain; and
Less phishing. Beginning in 2007, HKIRC adopted measures against suspicious Web sites. The number of reports of phishing and spamvertising using .hk thus decreased 92%, from an average of 38 per day in 2007 to three per day in 2008 (January to May).
In view of these measures, HKIRC deems the situation under control.
Hong Kong Internet Registration Corporation Limited and Hong Kong Domain Name Registration Company Limited; www.hkirc.hk/
The news item said malicious activity that might be associated with the .hk domain doesn’t necessarily take place in Hong Kong or China: "The owner of a domain name could theoretically situate his or her business anywhere." McAfee declined to respond.—Ed.
No Best Way to Build a Mental Model
The six-bullet software design process Robert L. Glass outlined as a trial-and-error activity in his "Practical Programmer" column "Software Design and the Monkey’s Brain" (June 2008) is better described as a sophisticated analysis-and-design activity that includes a trial-and-error strategy; after all, the purpose of the activity is to analyze a problem and create an automated solution for it.
One root of the less-than-optimal progress in software (and software tools) lies in the column’s second bullet item—"Build a mental model of a proposed solution to the problem." Nobody knows the one best way to build a mental model of a software solution. Supporting this conclusion are a large number of software strategies and artifacts, like structured programming, object-oriented programming systems, fourth-generation languages, network/hierarchical/relational DBMSs, FORTRAN, COBOL, C, C++, and Java, some of which endure and some of which simply go extinct.
Alex Simonelis, Montreal