In the old days, academicians were (at least, supposedly) "disinterested" seekers of truth. Since the nationalization of scientific research in the U.S., they have, unfortunately, become parties at interest to the political process. The way to get grants and, therefore, the way to remain employed in an academic role, is to toe the line of politically correct science.
In his article, "Ocean and Climate Modeling" (Apr. 2000, p. 81), Albert Semtner stands firmly in this new tradition by repeating uncritically the Clinton-Gore position on so-called "global warming."
To wit: "... it is increasingly clear that the Earth's climate is warming in response to greenhouse gases produced by fossil fuel consumption and deforestation." No, it is decreasingly clear. So much so that 17,000 U.S. scientists signed a petition opposing ratification of the Kyoto Treaty.
A full treatment of this issue is obviously beyond the scope of a "Forum" letter. I would refer curious parties to the Web site of The Science and Environmental Policy Project (www.sepp.org) for a more detailed refutation of Semtner's claims.
The primary deceptive tactic used by Semtner and others is the selective presentation of data. That is, they present only those data that support their claims and ignore the rest (of which, in this case, there is plenty).
For example, a supporter of global warming invited me to search the Web under "Vostok Ice Cores." I found a graph indicating a 140,000-year (!) warming trend, now nearing its peak. That is, the planet is just now reaching the temperature it was 140,000 years ago. I am not aware of any evidence that Neanderthals used internal combustion engines.
Consider also the period known as the Medieval Climate Optimum (so called because it occurred around 1,000 years ago), when the Earth was significantly warmer than it is today. So much so, in fact, that farming was common in Greenland.
A little of this deception can be inferred from Semtner's own statements. "The four warmest years in the last 600 ... occurred during the 1990s." Even if we accept this (and I don't; Semtner is ignoring a huge amount of data), it raises another question. If we are on a 600-year warming trend, what started it? There was no fossil fuel consumption or deforestation in 1400.
Semtner then goes on to say, "Mechanisms other than the greenhouse effect ... cannot explain the temperature increases of the past 100 years." Well, which is it? 100 or 600? Because there was no greenhouse effect 600 years ago. Finally, the honest claim to make would be "I [Semtner] cannot explain the temperature increases of the past 100 years using mechanisms other than the greenhouse effect." That is a very different matter from saying there can be no other explanation.
Mark Wallace
Los Angeles, CA
Albert Semtner Responds:
Mark Wallace questions the objectivity of scientists studying climate with funding from government agencies. I can assure him and other readers that funding still occurs through peer review by other scientists and not according to preconceived political opinions of people in national agencies. These national agencies fund studies in areas endorsed by the National Academy of Sciences and approved by Congress. Conclusions of studies are independently reached and subject to peer review.
My article had two main points: that it is possible to build reliable models of the climate system that can reproduce diverse aspects of observed climate, including El Niño and longer-period oscillations affecting people and economies, and that enormous parallel computing power is required to make detailed and accurate predictions with these models. I tried to underscore the importance of climate study by mentioning the fact that recent observed surface temperature of the northern hemisphere is significantly higher than in previous centuries. I also stated my opinion as a climate scientist with 30 years' professional experience that it is increasingly clear the latter is due to the buildup of greenhouse gases in the atmosphere. Since Wallace questions my opinion, I shall elaborate further.
I think it is important to take politics out of the discussion and rely instead on the scientific method of testing techniques and hypotheses. In particular, an opinion poll widely circulated to people with technical backgrounds asking whether they wished to oppose the Kyoto Treaty on the basis of a one-sided position paper is not particularly relevant. It is helpful also to exclude climate changes before the era of modern instruments, because causes and effects are harder to pin down. Sticking with the 20th century and especially the last few decades, there are numerous observations of the climate system and of potential mechanisms of climate change that can be used first to validate climate models constructed from physical laws and then to test the hypothesis that CO2 noticeably influences climate. This has been done repeatedly by numerous climate research groups throughout the world, and the inescapable conclusion is that only when CO2 increases are included in the simulations do the predicted temperatures closely track the observed warming of recent decades.
Observed and predicted warmings strikingly exceed those expected from natural variability alone. Although one cannot rule out an unknown causal mechanism still unaccounted for in comprehensive models, this is increasingly unlikely, because the models are based on physical and mathematical laws, tested against observations, and independently constructed from first principles by numerous research groups. This is the basis for my opinion, which is shared by a large majority of specialists in the field of climate modeling and consistent with findings of the National Research Council in its recent report, "Reconciling Observations of Global Temperature Change," published by National Academy Press. Mounting evidence over the last several years continues to reinforce the likelihood of significant CO2-induced climate change.
I read guest editor Henry Lieberman's special section introduction, "Programming By Example," and the following article "Novice Programming Comes of Age" (Mar. 2000) by David Canfield Smith et al. with great interest. I agree there is a problem with current conventional programming languages, which contain arcane syntactic notations and forms, leading to misconceptions. However, I question the feasibility of the proposed programming by example (PBE) tool "Creator" as a widely acceptable basis for the programming curriculum or, for that matter, as a substitute for conventional languages.
In regard to learning programming using conventional languages, I went through an experience similar to the authors'. When I first started programming, I questioned the way programs were represented in a textual and linear form. Programs were divided into several groups of statements situated one after the other, thus giving the impression that every statement flows sequentially. However, in a program, when one group of statements is executed, another group may be skipped. Also, it is possible that one group of statements is executed more than once. The textual representation of programming did not help me realize the actual flow of the program and led to much confusion. A pictorial representation of the program could have eliminated this confusion.
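The confusion described here can be made concrete. In the following small sketch (my own hypothetical illustration, not taken from the articles), the three groups of statements sit one after another on the page, yet one group is skipped entirely and another may run many times:

```python
def classify(n):
    """Textually linear, but execution skips and repeats groups."""
    steps = 0
    if n < 0:           # group A: skipped entirely when n >= 0
        n = -n
    else:               # group B: skipped when n < 0
        n = n + 1
    while n > 1:        # group C: executed zero or more times
        n = n // 2
        steps += 1
    return steps

# Reading top to bottom suggests every line runs once in order;
# the actual flow for classify(8) visits group B, then group C three times.
print(classify(8))   # -> 3
```

A flowchart-style, pictorial rendering of the same function would show the branch and the loop as explicit paths, which is precisely the information the linear text hides from a novice.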
Understanding control flow was the most challenging part of programming for me. After 23 years of educating novice programmers, I continue to find students experiencing the same difficulty. Not much has been done in conventional programming languages, whether procedural or object-oriented, to tackle this problem.
While PBE is a great idea and Creator is a successful model, the question of how it will achieve its goal of being widely accepted remains. Is it the goal of Creator to take over the beginner's programming curriculum and replace the existing conventional languages?
In the past several years, I have not seen a drastic change with regard to programming languages. Any current changes in programming languages are built on the existing languages. For example, C++, Java, and others are all built on C, which itself is built on Algol and Fortran with the backing of a big enterprise or a major institution. It may be argued that even though the design intention of C was for system programming, it has received wide acceptance and has moved into the business and scientific communities. Why has this occurred? Is it a miracle, good luck, performance and production (Unix), or other reasons? The question we need to ask is, How can Creator achieve similar success and become widely accepted?
If Creator cannot win by revolution, then perhaps a transitional period could be a remedy. As the authors indicate, there is a conservative community in the programming field that simply won't accept such a drastic change. It is also important to note that this conservative community is not fading away, so a gradual or evolutionary change is indicated.
Within programming languages, we have seen poorly chosen notations adopted because they were the only option at the time, due to the lack of technology, other constraints, or even personal taste. Yet these notations continue to be carried from language to language, as a bad gene continues from one generation to the next. In the past 50 years, there has been no systematic reexamination of these languages. Based on this observation, it would be prudent to incorporate the good features of current conventional languages into Creator as an option. This would ease the transition from traditional programming to the new PBE standard.
In addition, there should be a massive campaign advocating the benefits and broad feasibility of Creator as a general programming tool, not just as a simulation tool for children's games. For example, presenting the solution to a common programming problem, such as finding an average or a minimum, or sorting, in both PBE and a conventional language would give readers a better chance to compare the two.
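To illustrate the conventional-language half of such a comparison (the PBE half would be constructed by demonstration in Creator), a minimal sketch of the average and minimum problems mentioned above might look like this; the function names and data here are my own, chosen only for the example:

```python
def average(values):
    """Sum the values, then divide by the count."""
    total = 0
    for v in values:        # the loop body repeats once per element
        total += v
    return total / len(values)

def minimum(values):
    """Track the smallest element seen so far."""
    best = values[0]
    for v in values[1:]:
        if v < best:        # this branch runs only on a new minimum
            best = v
    return best

data = [7, 2, 9, 4]
print(average(data))   # -> 5.5
print(minimum(data))   # -> 2
```

Even this short fragment exhibits the implicit loops and branches that a novice must mentally trace, which is exactly the burden a by-example system claims to remove.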
I disagree that nothing has been done to address the problem of novice programmers. In fact, there has been an ongoing effort, predating PBE, to resolve this problem. Two such examples are Karel the Robot (1981; Karel++, the object-oriented version, 1997) and my own VPCL (Visual Plan Construct Language, 1989). Empirical studies of novice programmers indicate that students who study these languages do substantially better in conventional languages. Therefore, I fully agree with the assertion that working with Creator results in a better understanding of conventional languages. The only drawback is that students dislike having to learn these additional languages before learning conventional ones.
Old Westbury, NY
I had problems understanding the first part of Tim Bass's article on intrusion detection systems. I finally figured out that he was using "ID" to mean "intrusion detection" instead of "identification." While there is a long history of research communities hijacking common abbreviations to mean something else (ATM, for example), articles written for outsiders risk exactly this kind of confusion when they redefine common abbreviations.
This reminds me of when I was living in northern New Mexico and got strange looks from people from further out west when I said it was normally an hour drive from LA to SF, but some people did it in 45 minutes. After a while, they finally figured out that I was talking about Los Alamos and Santa Fe. But the power of their belief that LA meant Los Angeles was just too strong for them to think of an alternative meaning.
I had a similar problem with Bass's article, except that reading "identification" made just enough sense that I didn't obviously reject that interpretation.
Falls Church, VA
Because I regularly supervise theses involving ERP (enterprise resource planning), I was pleased to see a special section devoted to ERP (Apr. 2000). However, I was disappointed by the section's one-sided orientation, as nearly all its articles focused on organizational issues, whereas package functionality, industrial engineering aspects, and software engineering characteristics got far less attention. In my view, Communications has not yet completely made the transition to a journal that primarily serves the needs of its readers. In a reader-oriented journal, a balanced mix of articles, tutorials, and reviews should be preferred over the best collection of articles that report current research results.
Rommert J. Casimir
Tilburg, The Netherlands
I enjoyed the issue highlighting the ERP phenomenon. Kuldeep Kumar and Jos van Hillegersberg refer to Blumenthal's 1969 proposal for an integrated architecture, with the words "as early as 1969 ..."
It might be of interest that John Simmons, the director of J. Lyons & Co. in charge of administration, published his Master Plan in April 1961. It proposed not merely the installation of more computer power but the planning of a series of interrelated divisional business applications linked together by the use of the computer.
The project was ambitiously designed in three stages that would take three to five years to complete and cost on the order of 1 million pounds sterling. The plan called for Lyons to continue to computerize business processes until a fully integrated system had been accomplished with all linkages [1-3]. A schematic chart of the Master Plan and its linkages is shown on page 131 of Bird's book.
The Master Plan was to be carried out by systems development teams assigned to each of the company's four trading divisions. Bird provides an interesting account of the struggle to implement the Master Plan and analyzes the reasons for its ultimate abandonment.
©2000 ACM 0002-0782/00/0700 $5.00
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.