Cloud computing, a term that once elicited significant hesitation and criticism, is now the de facto standard for running always-on services and batch-computation jobs alike. In recent years, the cloud has become a significant enabler for the IoT (Internet of Things). Network-connected IoT devices—in homes, offices, factories, public infrastructure, and just about everywhere else—are significant sources of data that must be handled and acted upon. The cloud has emerged as an obvious support platform because of its cheap data storage and processing capabilities, but can this trend of relying exclusively on cloud infrastructure continue indefinitely?
For the applications of tomorrow, computing is moving out of the silos of far-away datacenters and into everyday life. This trend has been called edge computing (https://invent.ge/2BIhzQR), fog computing (https://bit.ly/2eYXUxj), and cloudlets (http://elijah.cs.cmu.edu/), among other designations. In this article, edge computing serves as an umbrella term for the trend. While cloud computing infrastructures proliferated because of flexible pay-as-you-go economics and the ability to outsource resource management, edge computing is growing to satisfy the needs of richer applications by enabling lower latency, higher bandwidth, and improved reliability. Privacy concerns and legislation that require data to be confined to a specific physical infrastructure are further driving factors for edge computing.