Research Highlights

Technical Perspective: The Realities of Home Broadband


Compared with other consumer purchases, buying residential broadband service seems relatively simple: pick among a small number of plans, each characterized by a single speed number or maybe two, and then compare similar plans by price. (Assuming, of course, that there is a choice of plans and providers, a topic for another day.) If the Internet service seems slow, just upgrade to the next tier. Unfortunately, as the authors illustrate in this important contribution, reality is somewhat more complicated, with different technologies and providers delivering more or less of the promised "headline" speed, different packet latencies, and varying predictability of performance.

Regulators have recognized that consumer choice requires more than the presence of multiple competing providers. Consumers must also have the ability to make informed choices. This is particularly important for residential broadband Internet access, as the expenditure is a non-trivial part of the family budget and switching providers is often difficult and time-consuming, with long-term contracts, installation fees, and, in some cases, ISP technicians drilling holes through home walls. In the U.S., this insight partially motivated the FCC Open Internet regulatory proceeding, which has transparency as one of its key tenets.1 The Open Internet order mandates the pre-sale disclosure of key performance metrics to allow consumers to pick the appropriate plan.

The measurement data analyzed in the following paper also grew out of an FCC effort, the Measuring Broadband America program. This program had a rather interesting effect on the marketplace: an ISP whose measured performance met or exceeded the promised throughput ran prime-time TV commercials touting the findings compared to a competitor. A year later, the FCC published a follow-up report,2 using the same basic methodology, and two of the under-performing companies mentioned in the paper had significantly improved their actual performance relative to their advertised rates, illustrating the old adage that you manage what you measure.

The work presented here can also help provide a quantitative foundation for two long-running, related discussions in the network research and policy communities. First, for about 20 years, a favorite panel and conference-dinner topic has been whether differentiated or guaranteed quality-of-service (QoS) classes are necessary for the new application of the day, whether voice calls or video streaming, or whether a single "best effort" service suffices. So far, ISPs have offered residential Internet access only as a best-effort service; however, they are also starting to use the same IP-based delivery for various specialized, or managed, services, such as voice calling and IPTV, that is, the delivery of video content over IP. Second, independent observers must be able to gauge whether so-called over-the-top services, which can only use the best-effort service, can indeed compete on an equal footing with these ISP-provided services.

The following paper provides an excellent example of how open data, created with the resources and coordination that typically only a government entity can provide and combined with deep data analysis and additional experiments and measurements, can yield insights into a topic of interest to policymakers, researchers, and consumers.

The FCC Measuring Broadband America program was founded on principles of openness and transparency, and the raw FCC-gathered measurement data is freely available to other researchers, so that others can both replicate the results reported and investigate other aspects. For example, the data contains several other active measurements that try to predict VoIP and video streaming performance.

The authors point out that much remains to be done to improve our understanding of broadband services. They note that components such as buffers in home routers and WiFi networks, as well as protocol effects, can reduce end-to-end performance. As raw access speeds increase from a few Mb/s to possibly a Gb/s, these impairments may well become dominant and will, among other problems, lead to disappointed consumers who find that paying more for a better service does not actually yield higher-quality video or faster-loading Web pages. The authors also hint at the problem of network reliability. Even where raw speed is sufficient to meet home needs, the ability to consistently make phone calls equivalent in quality to the old "landline" ones, or to work from home without worrying about losing connectivity during the day, will depend on reliability, making for challenging future measurement projects. Also, wireless services, which are beyond the scope of the paper, raise many new challenges. The FCC and other national regulatory bodies are starting to gather data for those networks.
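The protocol and buffering effects mentioned above can be made concrete with two textbook relationships: a single TCP flow cannot exceed its window size divided by the round-trip time, and a full router buffer adds queueing delay equal to the buffer size divided by the link rate ("bufferbloat"). A minimal sketch, using illustrative numbers rather than any measured values:

```python
# Sketch: two impairments that can cap broadband performance regardless
# of the advertised access speed. The formulas are standard; the inputs
# below are illustrative assumptions, not measurements.

def tcp_throughput_ceiling(window_bytes, rtt_s):
    """A single TCP flow cannot exceed window / RTT (bytes per second)."""
    return window_bytes / rtt_s

def queueing_delay(buffer_bytes, link_rate_bps):
    """Worst-case latency added by a full router buffer ("bufferbloat")."""
    return buffer_bytes * 8 / link_rate_bps

# A 64 KiB receive window over a 50 ms path caps a flow at about
# 10.5 Mb/s, no matter how fast the access link is.
ceiling_bps = tcp_throughput_ceiling(64 * 1024, 0.050) * 8
print(f"TCP ceiling: {ceiling_bps / 1e6:.1f} Mb/s")

# A 256 KiB buffer draining at 10 Mb/s adds roughly 210 ms of latency
# when full, enough to degrade a voice call.
delay_s = queueing_delay(256 * 1024, 10e6)
print(f"Bufferbloat delay: {delay_s * 1000:.0f} ms")
```

Note how neither number improves when the advertised access speed rises; this is why such impairments can come to dominate as headline rates approach a Gb/s.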

In the long run, consumers should not have to worry about the technical details of their Internet service, just like they do not worry (in most developed countries, at least) about whether their electricity service has enough amperage to power their newest gadget. Until that time, the work presented here will help consumers choose, policymakers protect consumers, and providers improve their services.
