The 20th century ended with the vision of a smart dust—a network of wirelessly connected devices whose size would match that of a dust particle, each one a self-contained package equipped with sensing, computation, communication, and power. The smart dust held the promise to bridge the physical and digital worlds unobtrusively, blending realms that were previously considered well separated. Applications involved scattering hundreds or even thousands of smart dust devices to monitor environmental quantities in scenarios ranging from habitat monitoring to disaster management.
A few years later, Jeff Kramer wrote an article in Communications entitled “Is Abstraction the Key to Computing?” (Apr. 2007). Kramer elucidated the key role of abstraction in solving computing problems. Like many before him, he argued that the ability to focus on the essence of problems, to think abstractly, and to discern recurring patterns is a fundamental asset in mastering the complexity of computing systems. The conscious use of abstraction has played, and continues to play, a key role in the quest to concretely realize the smart dust.
Roadblocks on the way to making a reality of the vision of a smart dust were indeed many, including for example reducing the physical size of devices, achieving low-power wireless communications, and realizing efficient embedded software. Many success stories exist as of today, which eventually led us to what can be argued to be the first concrete realization of the smart dust, that is, what we call the Internet of Things. These successes built on the same abstractions used in more traditional computing systems, including layered network designs and store-and-forward packet switching.
Abstractions may, however, unnecessarily separate system aspects that are intimately related, or unintentionally conceal crucial operating parameters. Inherent resource limitations and the dynamics of low-power wireless links compounded the difficulty of the key challenges in this area. Among these, few would dispute that achieving deterministic network behavior in networks of low-power embedded devices stands out. Deterministic behavior is essential in many applications, including real-time control loops and robotics, yet traditional networking abstractions fail to capture many of the aspects that are pivotal to achieving it.
Distilling the essence of efforts in this area, including the following paper, one observes that these issues are often best addressed by breaking abstraction boundaries or by creating new ones. Layered designs were the first abstraction to go. With synchronous transmissions, low-power wireless protocols embrace concurrent channel access: their tenet is that collisions are not necessarily destructive. Protocols built on synchronous transmissions can thus exploit the broadcast nature of low-power wireless communication instead of hiding it behind the artificial, and somewhat unnecessary, abstraction of point-to-point links that most wireless stacks adopt. Time-triggered, coordinated channel access then becomes an instrument for designs that achieve deterministic end-to-end latency in multi-hop low-power wireless networks.
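To make the determinism argument concrete, consider a minimal, slot-based model of a synchronous-transmissions flood in the spirit of Glossy-style protocols. This is an illustrative sketch, not code from the Zero-Wire paper: all names are invented here, and the model simply treats concurrent transmissions of the same packet as non-destructive, which is the tenet described above.

```python
# Toy, time-slotted model of a synchronous-transmissions flood.
# Nodes that received the packet in slot t all retransmit the SAME
# packet concurrently in slot t + 1; because the transmissions are
# identical and tightly synchronized, overlapping transmissions are
# modeled as non-destructive rather than as collisions.

def flood_latency(adjacency, initiator):
    """Return, for each node, the slot in which it first receives the packet."""
    received = {initiator: 0}
    frontier = {initiator}
    slot = 0
    while frontier:
        slot += 1
        next_frontier = set()
        for node in frontier:
            for neighbor in adjacency[node]:
                if neighbor not in received:
                    received[neighbor] = slot
                    next_frontier.add(neighbor)
        frontier = next_frontier
    return received

# A 4-hop line network: 0 - 1 - 2 - 3 - 4
line = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
slots = flood_latency(line, initiator=0)
# Each node is reached in a number of slots equal to its hop distance:
# latency depends only on topology, not on contention or routing state.
```

The point of the toy model is that, once collisions stop being destructive, end-to-end latency collapses to hop distance times slot length, which is exactly the kind of deterministic bound that layered, contention-based designs struggle to provide.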
The authors take a further step and rethink the dear old store-and-forward packet switching, a cornerstone of traditional network designs. They conceive the notion of a symbol-synchronous bus, which effectively makes a multi-hop wireless network behave like a single wire. Synchronous transmissions occur not for entire packets, but at the level of the individual symbols that encode a finite amount of information, say a few bits, on the wireless channel. Interestingly, they demonstrate this concept with optical wireless communications in place of the more commonplace radio-frequency transmissions, further harvesting advantages such as the inherent resilience to interference and the abundance of unregulated spectrum.
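A back-of-the-envelope comparison shows why relaying at symbol granularity matters for latency. The sketch below is a simplified model under assumed numbers (5 hops, 64 symbols per packet, 1 µs per symbol, one symbol of delay per hop); the actual figures and overheads in the Zero-Wire evaluation differ.

```python
# Illustrative latency model: store-and-forward relays whole packets
# hop by hop, while a symbol-synchronous bus relays each symbol as
# soon as it is decoded, pipelining the packet across hops.

def store_and_forward_latency(hops, symbols, symbol_time):
    # Each hop must receive the full packet before forwarding it.
    return hops * symbols * symbol_time

def symbol_synchronous_latency(hops, symbols, symbol_time):
    # Pipelined: each extra hop adds only one symbol of delay, so the
    # last symbol arrives after (symbols + hops - 1) symbol times.
    return (symbols + hops - 1) * symbol_time

hops, symbols, symbol_time = 5, 64, 1e-6  # assumed: 5 hops, 64 symbols, 1 µs/symbol
sf = store_and_forward_latency(hops, symbols, symbol_time)   # 320 µs
ss = symbol_synchronous_latency(hops, symbols, symbol_time)  # 68 µs
```

Under this model, multi-hop latency grows additively with hop count instead of multiplicatively, which is what lets a multi-hop network approximate the timing behavior of a single wire.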
The real-world assessment of Zero-Wire’s performance is a remarkable effort in itself, as it makes the authors’ contribution concrete and tests it against real-world dynamics. Confronted with the lack of large-scale public testbeds suited to measuring the performance of their specific work, the authors built their own facility and made everything they created or used available for the community to build on. The performance they report, including sub-millisecond end-to-end latency, provides evidence that their design choices, including rethinking the common store-and-forward technique, ultimately pay off.
I expect the design ideas embedded within the concrete realization of Zero-Wire to inspire many in the years to come. Besides the specific technical contribution, this paper is an example to follow and an invitation for others to think out of the box, challenge established concepts, and take nothing for granted.