
Broadband picture may not be so bleak

A new study disputes the claim that Internet data rates in the U.S. are only half as high as advertised; study’s authors call for better data.

In March, the Federal Communications Commission released its National Broadband Plan, in which it reported that “the actual download speed experienced on broadband connections in American households is approximately 40-50% of the advertised ‘up to’ speed to which they subscribe.” That finding, which the FCC had previously cited, caused some consternation among bloggers and op-ed writers, to say nothing of broadband subscribers.

But a new study by MIT researchers calls it into question. Most of the common methods for measuring Internet data rates, the researchers conclude, underestimate the speed of the so-called access network — the part of the Internet that Internet service providers control. The number of devices accessing a home wireless network, the internal settings of a home computer, and the location of the test servers sending the computer data can all affect measurements of broadband speed.

The researchers don’t cast their findings as supporting any particular policy positions. But they do argue that everyone with an interest in the quality of broadband access — governments, service providers, subscribers, and market analysts — should be more precise about what they’re measuring and how. “If you are doing measurements, and you want to look at data to support whatever your policy position is, these are the things that you need to be careful of,” says Steve Bauer, the technical lead on the MIT Internet Traffic Analysis Study (MITAS). “For me, the point of the paper is to improve the understanding of the data that’s informing those processes.”

In addition to Bauer, the MITAS team includes William Lehr, an economist, and David Clark, a senior research scientist at the Computer Science and Artificial Intelligence Laboratory who from 1981 to 1989 was the Internet’s chief protocol architect. The researchers analyzed a half-dozen different systems for measuring the speed of Internet connections, from free applications on popular websites to commercial software licensed by most major Internet service providers (ISPs). Both MITAS and MIT’s Communications Futures Program, which also supported the study, receive funding from several major telecommunications companies.

In each case that the study examined, the underestimation of the access networks’ speed had a different cause. The study that the FCC relied upon, for instance, analyzed data for broadband subscribers with different “tiers of service”: Subscribers paid differing fees for differing data rates. But the analysts didn’t know which data corresponded to which tier of service, so they assumed that the subscription tier could be inferred from the maximum measured rate. The MITAS researchers show that, in fact, the subscribers in lower tiers sometimes ended up getting higher data rates than they had paid for. In the study cited by the FCC, exceptionally good service for a low tier may have been misclassified as exceptionally bad service for a higher tier.
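To see how that inference can misfire, consider a small Python sketch (the tiers, advertised rates, and measurements below are invented for illustration; they are not drawn from the FCC's data):

```python
# Hypothetical illustration: inferring a subscriber's tier from the
# maximum measured rate misclassifies low-tier lines that outperform
# their plan. Advertised "up to" rates in Mbit/s (assumed values).
TIERS = {"basic": 5.0, "premium": 15.0}

def infer_tier(measured_rates_mbps):
    """The flawed method: assign the highest tier whose advertised
    rate is at or below the best rate ever measured."""
    peak = max(measured_rates_mbps)
    candidates = [t for t, adv in TIERS.items() if adv <= peak]
    return max(candidates, key=TIERS.get) if candidates else "basic"

# A "basic" (5 Mbit/s) subscriber whose line briefly delivered 16 Mbit/s
measurements = [4.8, 5.1, 16.0, 4.9]
tier = infer_tier(measurements)
typical = sorted(measurements)[len(measurements) // 2]

print(f"inferred tier: {tier}")           # premium (wrong)
print(f"typical rate:  {typical} Mbit/s") # 5.1
print(f"fraction of advertised: {typical / TIERS[tier]:.0%}")  # 34%
```

In this toy example, a basic-tier line delivering better-than-advertised service gets labeled a premium line running at about a third of its advertised rate, which is exactly the kind of misclassification the researchers describe.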

In other tests, inaccurately low measurements were the result of an idiosyncrasy of the Transmission Control Protocol (TCP), which governs how Internet-connected computers exchange data. With TCP, the receiving computer indicates how much data it is willing to accept at any point in time; the sending computer won’t exceed that threshold. For some common computer operating systems, however, the default setting for that threshold is too low to keep a fast broadband connection busy.
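That threshold is TCP’s receive window, and its effect can be approximated with a standard back-of-the-envelope rule: a single connection can carry at most one window of data per round trip. A minimal sketch, assuming a 64-kilobyte default window and a 50-millisecond round trip (both numbers invented for illustration):

```python
# Rough per-connection ceiling: one receive window per round trip.
window_bytes = 64 * 1024  # assumed default TCP receive window (64 KB)
rtt_seconds = 0.05        # assumed round-trip time to the test server

ceiling_mbps = window_bytes * 8 / rtt_seconds / 1e6
print(f"single-connection ceiling: ~{ceiling_mbps:.1f} Mbit/s")  # ~10.5
```

With those assumed numbers, a 20 Mbit/s subscription would test at roughly half its advertised rate no matter how fast the access network actually is.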

In practice, many applications get around this constraint by opening multiple TCP connections at once. But if an Internet speed test is designed to open only one TCP connection between two computers, the computers can’t exchange nearly as much data as they would if they opened multiple connections. Their data rates end up looking artificially low.
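Extending the same back-of-the-envelope arithmetic, each parallel connection gets its own receive window per round trip, so the aggregate ceiling scales with the number of connections (an idealization that ignores congestion and any shared bottleneck):

```python
window_bytes = 64 * 1024  # assumed default TCP receive window (64 KB)
rtt_seconds = 0.05        # assumed round-trip time

single_mbps = window_bytes * 8 / rtt_seconds / 1e6
for n in (1, 2, 4):
    # Aggregate ceiling with n parallel connections
    print(f"{n} connection(s): ~{n * single_mbps:.1f} Mbit/s")  # 10.5, 21.0, 41.9
```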

In yet another case, Bauer was running a popular speed test on his own computer. Much of the time, he was getting rates close to those advertised by his ISP; but one afternoon, the rate fell precipitously. For days, the test had been pairing Bauer’s computer in Cambridge with a test server in New York. But on the afternoon in question, the New York server was overburdened with other requests, so it redirected Bauer to the nearest free server it could find — in Amsterdam. The long sequence of links, including a transatlantic link, between his computer and the test server probably explains the difference in data rates, Bauer says. His ISP’s access network may not have been any more congested than it had been during the previous tests.
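The same window-per-round-trip arithmetic suggests one plausible mechanism for the drop Bauer saw: a longer path means a longer round trip, which lowers the per-connection ceiling even if no link along the way is congested. The round-trip times below are illustrative guesses, not measurements from the study:

```python
window_bytes = 64 * 1024                      # assumed receive window (64 KB)
rtts = {"New York": 0.02, "Amsterdam": 0.10}  # assumed round-trip times (s)

for server, rtt in rtts.items():
    ceiling = window_bytes * 8 / rtt / 1e6
    print(f"{server}: ~{ceiling:.1f} Mbit/s ceiling")  # ~26.2 vs ~5.2
```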

This points to the difficulty of using a single data rate to characterize a broadband network’s performance, another topic the MITAS researchers address in their paper. “What is it that people care about if they want to compare a metric of merit?” Lehr asks. “If you’re watching lots of movies, you’re concerned about how much data you can transfer in a month and that your connection goes fast enough to keep up with the movie for a couple hours. If you’re playing a game, you care about transferring small amounts of traffic very quickly. Those two kinds of users need different ways of measuring a network.”
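A toy calculation, with invented numbers, makes the distinction concrete: the movie watcher’s figure of merit is dominated by the sustained rate, while the gamer’s is dominated by the round-trip time.

```python
rate_mbps = 10.0  # assumed sustained data rate
rtt_s = 0.05      # assumed round-trip time

# Movie watcher: how much can be transferred in a 30-day month?
monthly_gb = rate_mbps * 1e6 / 8 * 30 * 24 * 3600 / 1e9
print(f"monthly transfer: ~{monthly_gb:.0f} GB")  # ~3240 GB

# Gamer: time to deliver a 2 KB update (one round trip plus serialization)
small_s = rtt_s + 2_000 * 8 / (rate_mbps * 1e6)
print(f"2 KB update: ~{small_s * 1000:.0f} ms")   # ~52 ms, almost all RTT
```

Doubling the sustained rate barely changes the 2 KB figure, while halving the round-trip time nearly halves it; a single speed number captures neither user's experience.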

The researchers have submitted their report to both the FCC and the Federal Trade Commission and will present a version of it at the Telecommunications Policy Research Conference in Arlington, Va., in October. “This report from Dave, Steve Bauer, and Bill Lehr is the first comparative study that I’ve seen,” says FCC spokesman Walter Johnson. As Johnson points out — and the MITAS researchers acknowledge in their paper — the FCC is currently in the early stages of a new study that will measure broadband speeds in 10,000 homes, using dedicated hardware that bypasses problems like TCP settings or the limited capacity of home wireless networks. “What we’re doing right now,” Johnson says, “is a follow-up to the broadband plan, recognizing that we need better data.”

