Science has been growing new legs of late. The traditional "legs" (or "pillars") of the scientific method were theory and experimentation. That was then. In 2005, for example, the U.S. President's Information Technology Advisory Committee issued a report, "Computational Science: Ensuring America’s Competitiveness," stating: "Together with theory and experimentation, computational science now constitutes the ‘third pillar’ of scientific inquiry, enabling researchers to build and test models of complex phenomena." The report offered examples such as multi-century climate shifts, multidimensional flight stresses on aircraft, and stellar explosions.
This "third leg" of science has become common coinage (run a Web search on the phrase!). Recently, however, it has been joined by a "fourth paradigm" (or "leg"), referring to the use of advanced computing capabilities to manipulate and explore massive datasets. The decoding of the human genome in 2001, for example, was a triumph of large-scale data analysis. Now science allegedly has four legs, and two of them are computational!
I find myself uncomfortable with science sprouting a new leg every few years. In fact, I believe that science still has only two legs—theory and experimentation. The "four legs" viewpoint seems to imply the scientific method has changed in a fundamental way. I contend it is not the scientific method that has changed, but rather how it is being carried out. Does it matter how many legs science has? I believe it does! It is as important as ever to explain science to the lay public, and it becomes more difficult to explain when it grows a new leg every few years.
Let us consider the first leg: theory. A scientific theory is an explanatory framework for a body of natural phenomena. A theory can be thought of as a model of reality at a certain level of abstraction. For a theory to be useful, it should explain existing observations as well as generate predictions, that is, suggest new observations. In the physical sciences, theories are typically mathematical in nature, for example, the classical theory of electromagnetism in the form of Maxwell’s Equations. What is often ignored is the fact that any application of a mathematical theory requires computation. To make use of Maxwell’s Equations, for example, we need to solve them in some concrete setting, and that requires computation—symbolic or numeric. Thus, computation has always been an integral part of theory in science.
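As a concrete (and purely illustrative) sketch, consider solving Maxwell’s Equations numerically in the simplest setting imaginable: a one-dimensional finite-difference time-domain (FDTD) update in free space. The grid size, step count, and Gaussian source below are hypothetical choices made only for illustration.

```python
# A deliberately minimal 1D FDTD update of Maxwell's curl equations in free
# space (normalized units, Courant factor 0.5, reflecting boundaries).
# Grid size, step count, and the Gaussian source are hypothetical choices.
import numpy as np

nz, steps = 200, 400
ez = np.zeros(nz)   # electric field samples along the grid
hy = np.zeros(nz)   # magnetic field samples along the grid

for t in range(steps):
    # discretized Ampere's law: advance E from the spatial difference of H
    ez[1:] += 0.5 * (hy[:-1] - hy[1:])
    # soft Gaussian source injected at the center of the grid
    ez[nz // 2] += np.exp(-0.5 * ((t - 30.0) / 10.0) ** 2)
    # discretized Faraday's law: advance H from the spatial difference of E
    hy[:-1] += 0.5 * (ez[:-1] - ez[1:])

print(ez[:5])   # "using" the theory here is, in the end, arithmetic
```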
What has changed is the scale of computation. Once carried out by hand, computation has come over time to require ever more advanced machinery. "Doing" theory today requires highly sophisticated computational-science techniques carried out on cutting-edge high-performance computers.
The nature of the theories has also changed. Maxwell’s Equations constitute an elegantly simple model of reality. There is no analogue, however, of Maxwell’s Equations in climate science. The theory in climate science is a highly complex computational model. The only way to apply the theory is via computation. While previous scientific theories were typically framed as mathematical models, today’s theories are often framed as computational models. In systems biology, for example, one often encounters computational models such as Petri Nets and Statecharts, which were originally developed in the context of computer science.
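To give a flavor of what such a computational model looks like, here is a minimal, illustrative sketch of a Petri net and its firing rule. The places, transitions, and token counts are hypothetical toy choices, not taken from any actual biological model.

```python
# A toy Petri net: places hold tokens (say, molecule counts) and transitions
# fire by consuming tokens from their input places and producing tokens in
# their output places. The net below (A + B -> C and back) is hypothetical.
marking = {"A": 2, "B": 1, "C": 0}
transitions = {
    "bind":    ({"A": 1, "B": 1}, {"C": 1}),
    "release": ({"C": 1}, {"A": 1, "B": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

if enabled("bind"):
    fire("bind")
print(marking)   # {'A': 1, 'B': 0, 'C': 1}
```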
Computation has likewise always been an integral part of experimentation. Experimentation typically means carrying out measurements, and the analysis of those measurements has always been computational. Again, what has changed is the scale. The Compact Muon Solenoid experiment at CERN’s Large Hadron Collider generates 40 terabytes of raw data per second, a volume one cannot hope to store and process in full. Handling such a volume requires advanced computation; the first level of data filtering, for example, is carried out on fast, custom hardware using FPGAs. Analyzing the still-massive amount of data that survives the successive levels of filtering requires sophisticated data-analysis techniques.
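As a rough, hypothetical illustration of multi-level filtering (emphatically not CMS’s actual trigger logic), the sketch below applies a cheap coarse cut and then a costlier refined cut to synthetic "event" data; the distribution and both thresholds are invented for illustration only.

```python
# Multi-level filtering on synthetic data: a coarse, cheap cut (analogous to a
# hardware trigger) followed by a more expensive refined cut on the survivors.
# The event distribution and both thresholds are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
events = rng.exponential(scale=10.0, size=1_000_000)   # fake per-event energies

level1 = events[events > 50.0]            # coarse threshold cut
level2 = level1[np.log1p(level1) > 4.2]   # refined cut on what survives

print(len(events), len(level1), len(level2))
```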
So science is still carried out as an ongoing interplay between theory and experimentation. The complexity of both, however, has increased to such a degree that they cannot be carried out without computation. There is no need, therefore, to attach new legs to science. It is doing fine with two legs. At the same time, computational thinking (a phrase popularized by Jeannette Wing) thoroughly pervades both legs. Computation is the universal enabler of science, supporting both theory and experimentation. Today the two legs of science are thoroughly computational!
Moshe Y. Vardi, EDITOR-IN-CHIEF