NDT Data in NTIA Indicators of Broadband Need

The National Telecommunications and Information Administration (NTIA) recently released a new public map, the Indicators of Broadband Need. By pulling together different sources of data, this excellent, publicly available resource helps communities plan how and where to improve broadband services for their residents. Historically, many factors have made it difficult for communities in the US to address digital inequities through federal subsidies, notably the well-publicized inaccuracies of federal data sources on broadband deployment from the FCC. That process is changing, and hopefully improving, at the FCC. But the landscape of assessing who does and doesn’t receive quality, affordable Internet service is further complicated by the conflation of multiple measurement data sources covering different aspects of Internet connectivity and user experience. The data layers in the Indicators of Broadband Need provide a chance to step back and examine all currently available sources: what they measure, how they differ, and what aspects of Internet service are not yet being measured but should be. The Internet is a complex system, and the reality is that no single measurement methodology or data source is sufficient to measure its performance.

Data Sources Included in NTIA Maps

Both the public indicators map and the NTIA’s National Broadband Availability Map (NBAM) platform pull together publicly available data sources on broadband, including the American Community Survey from the US Census Bureau, the FCC’s Form 477 provider reports, aggregate speed test data from Ookla and M-Lab, and aggregate application-level speed measurements from Microsoft. The NBAM platform has been available to state and tribal governments’ broadband offices for some time, so it’s great to see the Indicators of Broadband Need resource made available to the public.

One of the new data layers in both the Indicators of Broadband Need map and the NBAM tool is from Measurement Lab (M-Lab). NTIA is showing the median upload and download rates by US county between January and June 2020, as measured by the Network Diagnostic Tool (NDT). The inclusion of NDT aggregate data from M-Lab is the result of our collaboration with and support of NTIA since 2018, when Congress directed NTIA to update the National Broadband Availability Map with third-party datasets related to broadband services in the US, in coordination with the Federal Communications Commission (FCC).

To get this data into the Indicators of Broadband Need map and the NBAM platform, NTIA is using methods and queries adapted from those we use to produce our statistics API service, which we soft-launched earlier this year. The statistics API currently provides aggregated statistics by geography based on our own set of recommendations for how to best use NDT data in policy and advocacy analyses. Recognizing that people and organizations often do not have the expertise to analyze large datasets like ours themselves, we developed these recommendations and the statistics API in an effort to represent this data properly, and to provide correct context on how it compares and contrasts with other, similar-looking datasets, such as the FCC’s Form 477 data and median speed measurements from Ookla.
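As an illustration of the kind of aggregation behind those statistics, here is a minimal sketch that computes median NDT download speeds by US state over the NTIA map’s time range, using M-Lab’s public BigQuery views. The view name follows our published documentation, but the geography field names are assumptions to verify against the current schema, and this is not the exact query NTIA used.

```python
# A minimal sketch, not NTIA's exact query: median NDT download speed by
# US state for January-June 2020 from M-Lab's public BigQuery dataset.
# The view name is documented by M-Lab; the geography field names below
# are assumptions to verify against the current schema before use.
from google.cloud import bigquery

QUERY = """
SELECT
  client.Geo.Subdivision1ISO3166Code AS state,
  APPROX_QUANTILES(a.MeanThroughputMbps, 100)[OFFSET(50)] AS median_download_mbps,
  COUNT(*) AS tests
FROM `measurement-lab.ndt.unified_downloads`
WHERE date BETWEEN DATE '2020-01-01' AND DATE '2020-06-30'
  AND client.Geo.CountryCode = 'US'
GROUP BY state
ORDER BY median_download_mbps
"""

client = bigquery.Client()  # requires Google Cloud credentials and a billing project
for row in client.query(QUERY).result():
    print(f"{row.state}: {row.median_download_mbps:.1f} Mbps median over {row.tests:,} tests")
```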

To make the most of all of this data, it is important to consider the nuances of each dataset, specifically how they relate to one another and to the 25/3 broadband standard (and future standards). Though Ookla and NDT both measure “speeds” and “latency,” each test and each platform has a different methodology and server topology. Their results report similar metrics, but they technically don’t measure the same things. Similarly, the measurements Microsoft collects when people access its online services or updates represent yet another methodology, reported with the same metrics of upload and download speeds.

Having these data in one tool, alongside other publicly available sources and the FCC’s Form 477 provider reports, is an excellent resource for communities planning where to focus infrastructure money from state and federal programs. That said, it’s important to understand that different measurements tell us different things. Beyond the methodologies, the terms themselves are nuanced, and the whole landscape of assessing the speeds, latency, and quality of service that people receive has been highly politicized. Our goal at M-Lab is to explain what NDT measurements are, and how they compare and contrast with other data sources.

In brief summary, the NTIA Indicators of Broadband Need and NBAM include the following data sources covering January - June 2020:

  • FCC Form 477 - Census blocks where no providers report fixed broadband maximum advertised speeds >= 25/3
    • Self-reported, advertised max speeds, not measurements
  • Ookla Speed Tests - Census tracts where median speeds from fixed broadband connections are above or below 25/3
    • Multi-stream measurement designed to measure full access link capacity, ideally to servers within the last mile provider network
    • Aggregated metrics include only tests with GPS quality location
  • NDT Speed Tests - Counties where median speeds of both mobile and fixed broadband connections combined are above or below 25/3
    • Single-stream test designed to measure TCP’s baseline performance to servers outside the last mile network
    • Location determined by IP address geolocation
  • Microsoft Speed Measurements - Percentage of users by county and zip code above or below 25/3
    • Measurements taken when users of Microsoft applications or services access those services
    • Location (zip code) determined by reverse IP address lookup

So let’s dig into the details of the various datasets in the NTIA Indicators of Broadband Need map to understand their differences and context, as well as our understanding of how each relates to the 25/3 national broadband standard.
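Before that, it’s worth noting that the threshold comparison itself is simple: a geography falls below the standard when its median download is under 25 Mbps or its median upload is under 3 Mbps. The toy sketch below uses made-up county values and is not the map’s actual implementation; it only shows the comparison each dataset’s medians are held against.

```python
# Toy illustration of the 25/3 threshold check applied to median speeds.
# The county values are hypothetical; the NTIA map applies the same idea
# to medians computed from each underlying dataset.
BROADBAND_DOWN_MBPS = 25.0
BROADBAND_UP_MBPS = 3.0

def meets_broadband_standard(median_down_mbps: float, median_up_mbps: float) -> bool:
    """True when both medians meet or exceed the 25/3 standard."""
    return median_down_mbps >= BROADBAND_DOWN_MBPS and median_up_mbps >= BROADBAND_UP_MBPS

county_medians = {"Example County A": (87.4, 11.2), "Example County B": (18.9, 2.1)}
for county, (down, up) in county_medians.items():
    status = "meets" if meets_broadband_standard(down, up) else "below"
    print(f"{county}: {down}/{up} Mbps -> {status} 25/3")
```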

Measuring the National Broadband Standard - 25 Mbps Download, 3 Mbps Upload

It’s useful to start with a base understanding of the 25/3 broadband standard and how the FCC measures it annually. You may be familiar with the Form 477 provider reports, but the agency also runs an annual measurement study to assess connectivity across the service tiers of participating providers. The Measuring Broadband America (MBA) program is how the FCC measures whether providers are meeting the 25/3 standard, and it is the program linked from the FCC’s Broadband Speed Guide and Household Broadband Guide for more information about speeds related to the standard. The MBA program is the closest thing to a federal measurement standard for assessing the 25/3 broadband standard. The MBA report is limited in scope to ISPs who voluntarily participate, and it provides aggregate measurements of their different service tiers across a nationwide sample rather than for specific geographic areas of the US. In 2019, for example, 10 ISPs participated in the study, across 12 access technologies. Data was collected from 2,931 volunteer panelists selected to be geographically diverse across the four Census regions of the US. Measurements for the annual study are collected over a two-month period, followed by analysis and reporting 1.

The MBA program places a router in the homes of participating subscribers, across various service plans and access media. The router software runs a variety of measurements over a defined period of time for each annual study, and the aggregated results are published in a report covering that year’s results. The measurements included in the MBA program reports are conducted from consumer subscription locations to servers hosted “outside the network boundaries of the participating ISPs.” 2 This is called an “off-net” topology, referring to how servers are located in relation to the location where the tests are initiated. Additionally, some ISPs participating in the program host “on-net” test servers within their networks, which are used within the MBA program to “provide additional validity checks and insight into broadband service performance within an ISP’s network.” 3 The MBA program runs multiple measurement tests, which are described in the Technical Appendix of the latest MBA report 4. Of note for this discussion, the download and upload speed measurements use three concurrent TCP connections, as we’ll discuss in more detail below.

The structure of the MBA program can help us understand how the broadband standard is framed at the federal level in terms of measurement, but it’s also only part of the landscape of Internet measurement in the US. The program’s measurement and reporting focus on assessing ISP speed tiers using a geographic sampling method, so it provides a good general overview of the state of service for participating providers and their service tiers. But most communities want more specific data, and more data generally, about the measured state of Internet service in their city, county, or state, for all ISPs in their region. And while the FCC’s Form 477 provider reports give us the advertised maximum speeds for providers at the Census block level, it’s well known that this data is not representative in many cases, as it is self-reported by providers without validation. The FCC is currently working to update the data collection process that undergirds its broadband maps through the Broadband Data Collection program 5. This is where crowdsourced measurement platforms like Ookla and M-Lab have filled a gap in the general understanding of the state of Internet service, by enabling anyone to measure their Internet connection. These platforms differ, and their tests use different methods that we’ll discuss next, but both take an approach similar to the FCC MBA program in that they use a dedicated server platform that clients use for the sole purpose of measuring the connection. Other measurements, such as those provided by Microsoft in the NTIA Indicators of Broadband Need map, take a different approach, conducting measurements when applications access hosted services like Windows Update 6.

Different platforms and their measurements tell us different things

While the MBA program, Ookla, and NDT provide similar metrics (download & upload speed, and latency), their platform topologies and measurement methodologies are very different. Platform topology refers to where servers are located in relation to the people running tests, and measurement methodology refers to how each test is designed to work and what it measures.

With NTIA’s data sources in mind, we can begin to identify similarities and differences between the platforms and measurements of the MBA program, Ookla, and M-Lab. First, the servers used by the MBA program and by M-Lab’s NDT are located off-net, in what might also be called the “middle mile” of the general topology of the Internet. Ookla encourages third parties hosting its servers to locate them as close to their users as possible, or on-net. And while people running an Ookla test may select a specific server to test against, M-Lab and the MBA don’t allow the person running the test to select an individual server.

Both NDT on the M-Lab platform and the MBA program’s speed measurements are conducted between the client (in the MBA’s case, the router within the participant’s home network) and a single server. As of at least May 2021, each test run from Ookla’s web, desktop, and mobile platforms uses multiple servers to spread the test load 7. MBA and Ookla measurements are conducted to the geographically closest server(s) with the least latency, while M-Lab’s NDT measurements are run to the geographically closest server regardless of latency.
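To make that distinction concrete, the sketch below contrasts the two selection policies in the abstract. The client location, candidate servers, and round-trip times are hypothetical, and this is not the actual selection logic of any of these platforms; it only shows that “geographically closest” and “least latent” can point to different servers.

```python
# Illustrative only: "geographically closest" vs. "least latent" server
# selection. The client location, server list, and RTTs are hypothetical.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

client = (40.7, -74.0)                           # hypothetical client coordinates
servers = [                                      # (name, (lat, lon), measured RTT in ms)
    ("server-nearby-congested", (40.8, -74.1), 32.0),
    ("server-farther-fast", (40.0, -75.2), 11.0),
]

closest = min(servers, key=lambda s: haversine_km(*client, *s[1]))
least_latent = min(servers, key=lambda s: s[2])
print("Geographically closest:", closest[0])     # NDT-style selection
print("Least latent:", least_latent[0])          # MBA/Ookla-style selection
```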

In addition to the differences in platform topology among the MBA, Ookla, and M-Lab, there are also differences in the methodology each test uses. Ookla’s Speedtest.net uses multiple “streams” or connections, now spread across multiple nearby, least-latent servers, in an attempt to measure the maximum access link capacity of the ISP’s network. M-Lab’s NDT test uses a single stream or connection to measure the end-to-end path from the person running the test to the geographically closest off-net server on our platform. And the MBA program uses a multi-stream test to a single off-net server that is geographically closest and least latent.

Both single-stream and multi-stream measurements provide valuable insight into a user’s Internet performance. Multi-stream measurements, such as those run by Ookla and the MBA, open multiple data streams over a user’s connection. This approach is designed to emulate a modern browser. It can also partially mask data delivery problems, such as when one set of streams picks up unused capacity left by another set of streams that are performing poorly due to packet loss or congestion elsewhere in the network. But by overcoming these sorts of problems, multiple streams are able to return measurements closer to link capacity, or the maximum amount of data that can fill the link.

Single-stream measurements do not attempt to emulate a browser, but they do reflect the performance of the basic building block of nearly all applications: the single stream itself. Since separate streams cannot compensate for each other’s problems, single-stream measurements are far more sensitive to data delivery issues such as externally caused packet loss, and therefore reflect an unaugmented model of TCP’s behavior. Single-stream measurements like NDT are not a measurement of link capacity, but a measurement of TCP’s performance 8. In this sense, NDT is a baseline measurement of a connection’s performance. At M-Lab we encourage the use of both single- and multi-stream measurements, and would welcome the contribution of a multi-stream test to our open source platform in the future.
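One way to build intuition for why the stream count matters is the classic Mathis et al. model of steady-state TCP throughput, roughly MSS/RTT × 1.22/√p for packet loss rate p. It is only a rough model, and not the methodology of any of the tests above, but it shows how N parallel streams can deliver roughly N times the single-stream rate under the same loss, up to the capacity of the link.

```python
# Back-of-the-envelope Mathis model of steady-state TCP throughput:
#   rate ~= (MSS / RTT) * 1.22 / sqrt(loss_rate)
# Illustrative only -- not how NDT, Ookla, or the MBA tests compute results.
from math import sqrt

def mathis_throughput_mbps(mss_bytes: int = 1460, rtt_s: float = 0.030,
                           loss_rate: float = 0.001) -> float:
    """Approximate single-stream TCP throughput in Mbps for a given loss rate."""
    return (mss_bytes * 8 / rtt_s) * (1.22 / sqrt(loss_rate)) / 1e6

LINK_CAPACITY_MBPS = 100.0              # hypothetical access link
single = mathis_throughput_mbps()       # ~15 Mbps at 30 ms RTT and 0.1% loss
for streams in (1, 3, 4):
    # In this simple model each stream sees the same loss rate independently,
    # so aggregate throughput scales with stream count, capped by the link.
    aggregate = min(streams * single, LINK_CAPACITY_MBPS)
    print(f"{streams} stream(s): ~{aggregate:.0f} Mbps on a {LINK_CAPACITY_MBPS:.0f} Mbps link")
```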

The table below compares the FCC MBA program, Form 477 provider reports, M-Lab’s NDT test, and Ookla’s Speedtest.net test.

|  | FCC MBA Report | FCC Form 477 | M-Lab NDT | Ookla Speedtest |
| --- | --- | --- | --- | --- |
| Server Location | All servers in the report are off-net; tests are conducted to servers with the least latency 9 | Not applicable | Off-net; tests are conducted to the geographically closest server | All servers are encouraged to be on-net; tests are conducted to servers with the least latency 10 |
| Servers Used per Test | 1 | Not applicable | 1 | 4+ |
| # TCP Streams | 3 | Not applicable | 1 | 4+ |
| Measurement Specification/Description | Multi-stream TCP measurement estimating link capacity | Not applicable | Single-stream TCP measurement of Bulk Transport Capacity | Multi-stream TCP measurement estimating link capacity |
| Geographic Precision | Not applicable; the MBA program reports on measured service tiers of participating ISPs, though the Census block for tests from each participating household is identified | Census block | Aggregable by any geography, though geographies smaller than county or city are not recommended | Provided as Shapefile and Apache Parquet map tiles, aggregated in a grid of roughly 610.8 meter by 610.8 meter tiles 11 |
| Metrics | Weighted average and median download & upload speeds per service tier; % of speeds experienced for at least 80% of the daily peak use period; latency & packet loss | ISP-reported maximum upload and download link capacity by access media (fiber, cable, DSL, satellite, etc.) | Download & upload speed, latency | Per tile: weighted average download/upload speeds, average latency, # tests, # devices |
| Mobile vs. Fixed | Fixed & mobile measured separately | Fixed & mobile reported separately | Fixed & mobile results combined; aggregable using third-party datasets | Fixed & mobile provided as separate tile sets & shapefiles |
| Provider Information | Participating providers and service tiers | Provider Name, DBA (“Doing Business As”) Name, Holding Company Name | ASN (Autonomous System Number/Name) | Not provided in Ookla for Good aggregations; may be available in other Ookla product offerings |
| Data Collection/Access | Tests initiated by a premise device at participating households; validated data and raw collected data are published 12 13 | ISP-contributed; free and open access to aggregate data; collection method left to the ISP | User-contributed; free and open access to individual data points; open source server run by M-Lab | User-contributed; free and open access to aggregate data; closed-source server that is available for others to run |

Microsoft’s Application Level Measurements

The measurements from Microsoft in the NTIA tools take a different approach from the other sources discussed above. Instead of a person initiating a test in a browser, or a device placed on site that runs tests, Microsoft “estimate[s] broadband usage by combining data from multiple Microsoft services. … Every time a device receives an update or connects to a Microsoft service, we can estimate the throughput speed of a machine.” 14 The measurements from Microsoft are at the “application level,” meaning a measurement of your computer’s interaction with Microsoft’s specific services and applications. This is different from all the other measurements discussed above, which look at the bandwidth and latency of the connection to one or more servers. While those methods are useful for assessing speeds to and from servers within the middle mile or the ISP’s network, measuring the performance and availability of an online application or service is measuring something different.
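At its core, an application-level estimate like this comes down to dividing the bytes delivered for a service interaction by the time the transfer took, with the results then aggregated by county and zip code. The sketch below shows only that arithmetic, with made-up numbers; it is not Microsoft’s actual methodology or pipeline.

```python
# Conceptual illustration only: an application-level throughput estimate is
# bytes delivered for a service interaction divided by elapsed time.
# This is just the arithmetic, not Microsoft's actual methodology.
def estimated_throughput_mbps(bytes_transferred: int, elapsed_seconds: float) -> float:
    return (bytes_transferred * 8) / elapsed_seconds / 1e6

# Hypothetical example: a 250 MB update delivered in 95 seconds.
print(f"~{estimated_throughput_mbps(250_000_000, 95.0):.1f} Mbps")  # ~21.1 Mbps
```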

Which tests tell us whether we’re getting the service we’re paying for?

The answer to this question is what everyone wants from speed tests, but it’s also not really possible to get this answer from any of the tests we’ve discussed. Most Internet service plans provide a connection “up to” the speeds listed in the service tier you purchase. If an Internet service provider’s Terms of Service (ToS) mentions speeds at all, they are likely “promised speeds ‘up to’ the connection listed in your plan” 15. Unless you’re purchasing a business-class Internet plan with a Service Level Agreement (SLA) guaranteeing specific speeds, getting slower speeds than the ones advertised by your ISP does not break any regulations or consumer protections defined by either the government or the provider’s own agreement with you as a subscriber. So in a general sense, no test measures whether you’re “getting what you pay for” from your ISP. But if ISPs did commit to the upload and download speeds that they advertise in their ToS, what test(s) would be appropriate to measure those metrics?

ISPs provide a service connecting you to the Internet via a network that they maintain, and the maximum “up to” speeds over that network would be best measured by a multi-stream test to one or more servers within each ISP’s network. Ookla’s test measurements are a good example. NDT from M-Lab, on the other hand, isn’t intended to be a measurement of an Internet connection’s maximum capacity but, as described above, provides a baseline measurement of how well the underlying transport protocol, TCP, is performing.

So which metric should we use? There is a disconnect among providers, regulators, and consumer advocates in deciding which of these metrics defines “what you are paying for.” Many argue for the significance of measuring the link capacity of the access network, while others believe that measuring the maximum potential alone does not provide sufficient data to assess need. Rather than debate the pros and cons of each approach, M-Lab supports the idea that it may not be possible to choose just one metric to define healthy Internet performance. For this reason, we’re pleased to see that NTIA has integrated multiple datasets into the Indicators of Broadband Need map. Speed continues to be a heavily debated metric, but at least this way we know our options.

Analysis Matters

Understanding each measurement data source and what it intends to measure is critical when comparing or contrasting data from different sources, as we’ve discussed here. Reports that aggregate results from the large number of tests on M-Lab, Ookla, or other platforms should take this into account. It’s also important to remember that each of these datasets is massive, and presents all the same pitfalls as any other kind of big data. For example, counting the NDT and Ookla tests from January 1, 2020 to June 30, 2020 (the time range used in NTIA’s Indicators of Broadband Need map) yields 46,321,992 NDT download tests, 44,601,955 NDT upload tests, and 60,641,490 Ookla tests 16. If the data is not treated with attention to detail, context, and form, it is possible, and quite easy, to draw incorrect conclusions that could have large implications. Incorrect analyses do not imply inaccuracies in the data itself, but they do weaken the validity of any resulting conclusions. For this reason, M-Lab has published a set of basic recommendations for researchers to follow when integrating NDT data into an application or citing it in a report or study.
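For reference, the rough NDT counts above can be approximated with a query along the following lines against M-Lab’s public BigQuery views. This is a simplified sketch rather than the exact queries cited in the footnote: NTIA’s queries used county and territory geometries, while this one filters on the client’s country code only, so the totals will differ somewhat.

```python
# Simplified sketch of counting NDT tests in BigQuery for January-June 2020.
# Not the exact queries cited in footnote 16, which used county and territory
# geometries; this filters on the client's country code only.
from google.cloud import bigquery

COUNT_QUERY = """
SELECT 'download' AS direction, COUNT(*) AS tests
FROM `measurement-lab.ndt.unified_downloads`
WHERE date BETWEEN DATE '2020-01-01' AND DATE '2020-06-30'
  AND client.Geo.CountryCode = 'US'
UNION ALL
SELECT 'upload' AS direction, COUNT(*) AS tests
FROM `measurement-lab.ndt.unified_uploads`
WHERE date BETWEEN DATE '2020-01-01' AND DATE '2020-06-30'
  AND client.Geo.CountryCode = 'US'
"""

client = bigquery.Client()  # requires Google Cloud credentials and a billing project
for row in client.query(COUNT_QUERY).result():
    print(f"{row.direction}: {row.tests:,} NDT tests")
```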

With crowdsourced data from Ookla or NDT, it’s possible to add to the picture of what broadband service looks like across all communities, with the caveat that analyses need to account for more variability in the data. In contrast, measurement initiatives that coordinate the conditions under which measurements are run, using a device placed on selected network connections, control for much of the variability that crowdsourced test results contain. The FCC’s MBA program uses this method, as described above. Similarly, other measurement initiatives like RIPE Atlas employ software that runs tests from a dedicated device, and M-Lab now provides Murakami, a software-based, automated test runner meant to be installed on a dedicated on-premises device. Murakami takes the approach of supporting any open source test, and currently supports both NDT and Ookla.

Expanding Our View of Broadband Data

From M-Lab’s perspective, the inclusion of many broadband data sources, and the clear presentation of the differences in what they represent, is a net gain for the Internet research space; the more data, the better. Broadband offices and staff of state and tribal governments also have access to many more layers in the full NBAM tool from NTIA. It’s great to see that improving broadband service for all Americans is now a priority, and with more data and more nuanced analyses, we hope communities can begin to get a clearer picture of all aspects of Internet service and content delivery. We hope this post provides context and a deeper understanding of the NDT dataset and how it should be viewed in relation to the others in the NTIA Indicators of Broadband Need. M-Lab applauds the NTIA for creating a tool that enables a broader assessment of the landscape of broadband needs across the country.

If you’d like more information about this article, contact us!

  • Lai Yi Ohlsen - laiyi@measurementlab.net
  • Chris Ritzo - critzo@measurementlab.net

For general information about M-Lab, including support for our tests and data, please email support@measurementlab.net.


  1. FCC Office of Engineering and Technology, Measuring Broadband America Measuring Fixed Broadband. Accessed 2021-06-22 at: https://www.fcc.gov/general/measuring-broadband-america-measuring-fixed-broadband 

  2. FCC Office of Engineering and Technology, Technical Appendix to the Tenth MBA Report, page 24. Accessed 2021-06-22 at: http://data.fcc.gov/download/measuring-broadband-america/2020/Technical-Appendix-fixed-2020.pdf 

  3. FCC Office of Engineering and Technology, Technical Appendix to the Tenth MBA Report, page 24. Accessed 2021-06-22 at: http://data.fcc.gov/download/measuring-broadband-america/2020/Technical-Appendix-fixed-2020.pdf 

  4. FCC Office of Engineering and Technology, Technical Appendix to the Tenth MBA Report, page 28. Accessed 2021-06-22 at: http://data.fcc.gov/download/measuring-broadband-america/2020/Technical-Appendix-fixed-2020.pdf 

  5. FCC Broadband Data Collection Program, Accessed 2021-06-28 at: https://www.fcc.gov/BroadbandData 

  6. Kahan, John; Lavista Ferres, Juan. United States Broadband Usage Percentages Dataset. Accessed 2021-06-23 at: https://github.com/microsoft/USBroadbandUsagePercentages#readme 

  7. Connelly, Brian. How Ookla Ensures Accurate, Reliable Data: A Guide to Our Metrics and Methodology (Updated for 2021). Accessed 2021-06-23 at: https://www.speedtest.net/insights/blog/how-ookla-ensures-accurate-reliable-data-2021/ 

  8. In a single multi-stream application, a single problematic stream that delays one critical resource can impair total application completion times. On the other hand, even if single stream performance is below full link rate, multi-stream applications are likely to be able to fill the link, with less risk of being blocked by any late content. https://datatracker.ietf.org/doc/html/rfc793 

  9. FCC Office of Engineering and Technology, Technical Appendix to the Tenth MBA Report, page 27, Test Node Selection. Accessed 2021-06-22 at: http://data.fcc.gov/download/measuring-broadband-america/2020/Technical-Appendix-fixed-2020.pdf 

  10. Ookla, The Speedtest Server Network, Host a Speedtest Server. Accessed 2021-06-23 at: https://www.ookla.com/speedtest-servers 

  11. Ookla, Speedtest by Ookla Global Fixed and Mobile Network Performance Map Tiles. Accessed 2021-06-28 at: https://github.com/teamookla/ookla-open-data#readme 

  12. FCC Office of Engineering and Technology, Validated Data - Measuring Fixed Broadband - Tenth Report. Accessed 2021-06-28 at: https://www.fcc.gov/reports-research/reports/measuring-broadband-america/validated-data-measuring-fixed-broadband-tenth 

  13. FCC Office of Engineering and Technology, Measuring Broadband Raw Data Releases - Fixed. Accessed 2021-06-28 at: https://www.fcc.gov/oet/mba/raw-data-releases 

  14. Kahan, John; Lavista Ferres, Juan. United States Broadband Usage Percentages Dataset. Accessed 2021-06-23 at: https://github.com/microsoft/USBroadbandUsagePercentages#readme 

  15. Cooper, Tyler. How To File an FCC or FTC Complaint About Your Internet. BroadbandNow website, published 2021-04-07. Accessed 2021-06-28 at: https://broadbandnow.com/guides/how-to-file-fcc-ftc-internet-complaint 

  16. To count the number of Ookla tests between 2020-01-01 and 2020-06-30, Ookla public data was loaded into BigQuery tables, and tests within the US were counted using geometry of the US national outline provided by BigQuery Public Datasets. The complete query used can be accessed at: https://console.cloud.google.com/bigquery?sq=754187384106:7ebf5785dbb34e149713c43a159b819f. To count the number of NDT tests in the same time range, NTIA provided the two queries used to extract median speed data per US County and US Territories. Small modifications to these queries were made to sum daily test counts of upload and download tests, and the final counts for both were added together. The queries used may be accessed at: US Counties, US Territories 
