Nearly two centuries ago, the English chemist Humphry Davy wrote, "Nothing tends so much to the advancement of knowledge as the application of a new instrument. The native intellectual powers of men in different times are not so much the causes of the different success of their labors, as the peculiar nature of the means and artificial resources in their possession." Davy's observation that advantage accrues to those who have the most powerful scientific tools is no less true today. In 2013, Martin Karplus, Michael Levitt, and Arieh Warshel received the Nobel Prize in chemistry for their work in computational modeling. The Nobel committee said, "Computer models mirroring real life have become crucial for most advances made in chemistry today," and "Computers unveil chemical processes, such as a catalyst's purification of exhaust fumes or the photosynthesis in green leaves."
Whether describing the advantages of high-energy particle accelerators (such as the Large Hadron Collider and the 2013 discovery of the Higgs boson), powerful astronomy instruments (such as the Hubble Space Telescope, which yielded insights into the universe's expansion and dark energy), or high-throughput DNA sequencers enabling the exploration of metagenomic ecology, ever-more-powerful scientific instruments continually advance knowledge. Each such scientific instrument, as well as a host of others, is critically dependent on computing for sensor control, data processing, international collaboration, and access.