Wake Up and Smell the Coffee: Evaluation Methodology For the 21st Century

Evaluation methodology underpins all innovation in experimental computer science. It requires relevant workloads, appropriate experimental design, and rigorous analysis. Unfortunately, methodology is not keeping pace with the changes in our field.

Anton, A Special-Purpose Machine For Molecular Dynamics Simulation

The ability to perform long, accurate molecular dynamics (MD) simulations involving proteins and other biological macromolecules could in principle provide answers to some of the most important currently outstanding questions in the fields of biology, chemistry, and medicine. A wide range of biologically interesting phenomena, however, occur over timescales on the order of a millisecond, several orders of magnitude beyond the duration of the longest current MD simulations. We describe a massively parallel machine called Anton, which should be capable of executing millisecond-scale classical MD simulations of such biomolecular systems. The machine, which is scheduled for completion by the end of 2008, is based on 512 identical MD-specific ASICs that interact in a tightly coupled manner using a specialized high-speed communication network. Anton has been designed to use both novel parallel algorithms and special-purpose logic to dramatically accelerate those calculations that dominate the time required for a typical MD simulation. The remainder of the simulation algorithm is executed by a programmable portion of each chip that achieves a substantial degree of parallelism while preserving the flexibility necessary to accommodate anticipated advances in physical models and simulation methods.
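In a typical MD code, the dominant cost is the evaluation of range-limited pairwise nonbonded forces at every timestep. Purely as an illustration of that kind of inner loop (this is not Anton's actual pipeline; production codes and Anton itself use neighbor lists, spatial decomposition, and long-range electrostatics), the following naive Python sketch computes cutoff Lennard-Jones forces; all parameter values and names are illustrative assumptions.

import numpy as np

def lj_forces(positions, cutoff=1.0, epsilon=1.0, sigma=0.3):
    """Naive O(N^2) cutoff Lennard-Jones forces; illustrative values only."""
    n = len(positions)                      # positions: (N, 3) array
    forces = np.zeros_like(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r_vec = positions[i] - positions[j]
            r = np.linalg.norm(r_vec)
            if 0.0 < r < cutoff:
                sr6 = (sigma / r) ** 6
                # Magnitude of -dU/dr for the 12-6 Lennard-Jones potential.
                f_mag = 24.0 * epsilon * (2.0 * sr6 ** 2 - sr6) / r
                f = f_mag * r_vec / r
                forces[i] += f              # Newton's third law: equal and opposite
                forces[j] -= f
    return forces

Because every particle pair within the cutoff must be visited at every timestep, this is precisely the kind of computation where special-purpose pipelines and the parallel algorithms mentioned above pay off.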

The Emergence of a Networking Primitive in Wireless Sensor Networks

The wireless sensor network community approached networking abstractions as an open question, allowing answers to emerge with time and experience. The Trickle algorithm has become a basic mechanism used in numerous protocols and systems. Trickle brings nodes to eventual consistency quickly and efficiently while remaining remarkably robust to variations in network density, topology, and dynamics. Instead of flooding a network with packets, Trickle uses a "polite gossip" policy to control send rates so each node hears just enough packets to stay consistent. This simple mechanism enables Trickle to scale to 1000-fold changes in network density, reach consistency in seconds, and require only a few bytes of state, while imposing a maintenance cost of only a few transmissions per hour. Although originally designed for disseminating new code, Trickle has proven to have much broader applicability, including route maintenance and neighbor discovery. This paper provides an overview of the research challenges wireless sensor networks face, describes the Trickle algorithm, and outlines several ways it is used today.
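Trickle's core is a randomized, adaptive timer: each node keeps an interval, a count of consistent messages heard during that interval, and a redundancy constant k; it transmits at a random point in the second half of the interval only if it has heard fewer than k consistent messages, doubles the interval (up to a maximum) when the interval ends, and resets the interval to its minimum when it detects an inconsistency. The Python sketch below mirrors that description; the class, method, and parameter names are illustrative choices, not taken from any particular implementation.

import random

class TrickleTimer:
    """Minimal sketch of the Trickle timer logic described above."""

    def __init__(self, i_min=1.0, i_max=3600.0, k=2):
        self.i_min = i_min        # smallest interval, in seconds
        self.i_max = i_max        # largest interval, in seconds
        self.k = k                # redundancy constant
        self.interval = i_min
        self._begin_interval()

    def _begin_interval(self):
        self.counter = 0
        # Transmission point chosen uniformly from the second half of the interval.
        self.t = random.uniform(self.interval / 2, self.interval)

    def heard_consistent(self):
        # A neighbor transmitted state consistent with our own: count it.
        self.counter += 1

    def heard_inconsistent(self):
        # A neighbor is out of date (or ahead of us): shrink the interval
        # so the new information propagates quickly.
        if self.interval > self.i_min:
            self.interval = self.i_min
            self._begin_interval()

    def timer_fired(self, send):
        # "Polite gossip": transmit only if too few neighbors already have.
        if self.counter < self.k:
            send()

    def interval_expired(self):
        # Nothing changed: back off by doubling the interval, up to i_max.
        self.interval = min(2 * self.interval, self.i_max)
        self._begin_interval()

Because the interval keeps doubling while the network is consistent, a quiescent network settles into a few transmissions per hour, while a single inconsistent message collapses the interval and lets new state spread in seconds.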

What Makes an Interface Feel Organic?

Movies can make us forget that we are sitting in a cinema among strangers, looking at images projected onto a wall. Instead, we feel as though we are observing real people in real situations and we become emotionally involved in the narrative. User interfaces can trigger a similar suspension of disbelief: we forget we are […]

Introduction

Throughout the history of computing, developments in human-computer interaction (HCI) have often been preceded by breakthroughs in display and input technologies. The first use of a cathode ray tube (CRT) to display computer-generated (radar) data, in the Canadian DATAR [6] and MIT’s Whirlwind projects of the early 1950s, led to the development of the trackball […]
