Raising the Standard: The Broadband Speed Debate
Next Tuesday (March 12, 2024) I have the honor of moderating an esteemed panel, ‘Raising the Standard: The Broadband Speed Debate’, at Connected America (CA) in Dallas, Texas. Our topic is the evergreen issue of the broadband speed standard. In an unexpected but welcome coincidence, the FCC recently released its draft Section 706 Report, its statutorily required inquiry into internet availability and deployment.
The FCC’s report is notable for several reasons. I won’t spend too much time on it here, but the Commission proposes to raise its fixed speed benchmark to 100/20 Mbps, to use its Broadband Data Collection (BDC) deployment data for the first time to measure availability, to establish a long-term speed goal of 1 Gbps/500 Mbps, and to conclude that broadband “is not being deployed in reasonable and timely fashion”. For many broadband advocates and stakeholders, the report’s findings are welcome but overdue.
For our panel, the release of the draft report, to be voted on at the FCC’s March 14 meeting, provides a timely opportunity to discuss the impact of raising the standard on federal policy, on the allocation of federal funding, and on affordability. Based on the makeup of the panel, I expect a spirited and entertaining exchange.
However, before we tackle those weighty issues, I want to take a step back and address an issue that has bothered many for years: how we measure broadband service. The broadband speed standard has been hotly debated for years and is often the default unit of measurement when assessing availability and establishing service requirements for grant programs. We’ve been chasing speeds for years, and speed alone still doesn’t accurately capture the state or health of internet access in our country.
On a positive note, the greater public better understands the need for speed and its impact on their ability to work, their educational opportunities, and their access to the economy and society.* So, are we done with the singular focus on megabits per second? Shouldn’t we move on to a more thoughtful dialogue about how we should measure broadband availability and consider reliability, latency, jitter, affordability, and perhaps a few other criteria?
*The report finds: “As of December 2022, the mean download speed for all residential fixed broadband subscriptions was 439 Mbps while the median residential download speed was 300 Mbps, and nearly 79% of all residential subscriptions had a download speed of at least 100 Mbps.”
At the local level, especially in rural areas, often the biggest complaint isn’t slow internet but reliability: service isn’t available at certain times of the day or is unpredictable (randomly goes down). Reliability may be criticized as a criterion because it is difficult to define and measure, and it is often affected by customer equipment and devices.
At Treasury, the Capital Projects Fund (CPF) and State and Local Fiscal Recovery Funds (SLFRF) guidance included reliability as a project requirement and as a criterion for considering the actual customer experience, meaning the speeds customers were “actually and consistently” getting. In its FAQs, Treasury further encourages states to use their discretion to evaluate whether a service is reliable, and it suggests criteria and methods states could use to evaluate reliability. This establishes precedent and provides guidance for using reliability to measure service.
We’d have to address the variables that come with speed and latency tests, like customers’ home equipment, devices, and device software. But most of these factors are now discoverable and can be accounted for by newer, more robust speed tests.
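To make the measurement point concrete, here is a minimal sketch of how repeated test probes could be rolled up into indicators beyond a single headline speed. It assumes some testing tool already produces per-probe results; the Probe fields, the summarize() helper, and the chosen metrics are illustrative assumptions, not any agency’s methodology.

```python
import statistics
from dataclasses import dataclass

@dataclass
class Probe:
    """One measurement sample; the fields are illustrative, not tied to any specific test tool."""
    download_mbps: float
    upload_mbps: float
    latency_ms: float
    succeeded: bool  # did the connection respond at all during this probe?

def summarize(probes: list[Probe]) -> dict:
    """Aggregate repeated probes into service indicators beyond best-case speed."""
    ok = [p for p in probes if p.succeeded]
    if not ok:  # nothing ever connected: report zero reliability and stop
        return {"reliability_pct": 0.0}
    latencies = [p.latency_ms for p in ok]
    return {
        # Share of probes that got any response: a crude reliability/uptime proxy.
        "reliability_pct": 100.0 * len(ok) / len(probes),
        # Medians, not peaks: closer to what customers "actually and consistently" get.
        "median_down_mbps": statistics.median(p.download_mbps for p in ok),
        "median_up_mbps": statistics.median(p.upload_mbps for p in ok),
        "median_latency_ms": statistics.median(latencies),
        # Jitter approximated here as the spread (standard deviation) of latency samples.
        "jitter_ms": statistics.stdev(latencies) if len(latencies) > 1 else 0.0,
    }
```

Run on a schedule over a month, a summary like this would distinguish a connection that peaks at 300 Mbps at 3 a.m. from one that actually holds up during the evening.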
Of course, affordability ranks near the top of the list of barriers to access. Because affordability has been a concern for many years, we are developing better data to quantify and measure it. That will make it easier to incorporate into a methodology for evaluating service.
Weaving in new criteria will certainly complicate the process of updating a benchmark standard for availability. But it’s not impossible, and it could give us a more accurate picture of broadband access in the US. In North Carolina, as in many other states, we learned from advocates and stakeholders that meaningful access includes reliability, which motivated us to adopt a goal of universal service defined as “reliable, affordable high-speed internet service” for all.
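As a thought experiment, a multi-criteria benchmark could be expressed as simply as the check below. The 100/20 Mbps figure mirrors the draft report’s proposed benchmark; every other threshold here is a hypothetical placeholder chosen only to illustrate the layering, not a recommendation.

```python
def meets_benchmark(down_mbps: float, up_mbps: float, reliability_pct: float,
                    latency_ms: float, monthly_price: float,
                    monthly_median_income: float) -> bool:
    """Illustrative multi-criteria availability test; thresholds other than 100/20 are assumed."""
    affordable = monthly_price <= 0.02 * monthly_median_income  # assumed 2%-of-income cap
    return (down_mbps >= 100 and up_mbps >= 20    # proposed fixed-speed benchmark (draft report)
            and reliability_pct >= 99.0            # assumed uptime floor
            and latency_ms <= 100                  # assumed latency ceiling
            and affordable)
```

Even a toy version like this makes the trade-off visible: each added criterion shrinks the set of locations that count as served, which is exactly the political problem discussed below.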
I’m skeptical that the FCC will make this type of change, for legal, logistical, and political reasons. For one, it will, as it typically does when sidestepping an uncomfortable issue that doesn’t advance its agenda, point to the statutory limitations of Section 706 of the Telecommunications Act. This law doesn’t specifically authorize the Commission to define and measure reliability, affordability, and the other criteria mentioned here.
Second, as the Commission pointed out in its draft report: “Our decision not to adopt a symmetrical 100/100 Mbps benchmark is heavily influenced by the standards that Congress established for determining inadequately served locations for the BEAD Program.” Dramatically upping the standard now may be disruptive. However, the report doesn’t expand on how raising the standard beyond the one established in the Infrastructure Investment and Jobs Act would affect, negatively or positively, BEAD or other grant programs. It’s almost as if the sole evidence presented to support the Commission’s decision is: “It’s complicated.”
Finally, organizations, public or private, need to show forward movement. It is very likely that adopting additional criteria, some of which may be viewed as subjective or region-dependent, would show that “advanced telecommunications capability” is not available to a far greater number of Americans than currently reported. That is not necessarily the fault of the FCC, but we all know what happens to messengers. (To its credit, the report does conclude that broadband is not being “deployed to all Americans in a reasonable and timely fashion.”)
This raises another issue: the growing obsolescence of the FCC in standard setting and broadband policy. Not to be too critical of the Commission, but its rationale, in part, for raising the speed standard is:
"To make this determination interpret the definition by examining trends in providers’ speed offerings (that is, what they are deploying to American households), what speeds are required to use various common applications, and data regarding what speeds consumers are adopting when they have the option to purchase various speeds. We believe that looking at these factors, along with other relevant programs and recent Congressional action, remains helpful to evaluate the benchmark." (emphasis added)
The report mentions other federal grant programs’ standards several times. Instead of leading on this issue, the FCC seems to be following, if not begrudgingly acquiescing to, other federal and state program policies. BEAD, CPF, USDA, and many state broadband grant programs established a higher speed standard years ago (see Minnesota’s Border-to-Border grant program).
Despite its statutory authority, the FCC failed to lead and abdicated standard setting to others. Those others, like federal grant-making agencies and states, were tired of waiting. It may take an act of Congress to move service and deployment goal setting to a policy-making body, such as NTIA, and leave inquiries and measuring to a regulatory body.
But states aren’t waiting to be told. Now that states have greater knowledge and money, they’ll likely set the standard and determine when all of their citizens have access to reliable, affordable “advanced telecommunications” service.
The discussion should not focus on speed. Speed does not address capacity or capability; it really only distinguishes copper and coax from fiber. The solution is fiber-based, combined with wireless to reach the difficult areas. Copper and coax cannot be segmented and prioritized, so they are incapable of delivering next-generation services. The FCC made a critical mistake in categorizing 10/1 Mbps as “broadband” during the Obama administration. Billions of dollars were wasted on inferior services that could have been used to deploy fiber infrastructure. Pleading ignorance was hardly an excuse, as fiber had been actively deployed in the major metros since the 1990s.
Vice President - Government at Vivacity/EX2. Former State Broadband Director for the State of Arizona
8 months ago: Looking forward to watching this one!
Vice President of Technology & Communications at Irby Utilities
8 months ago: Sounds like an interesting panel. I hope you get into quality of experience as the real goal, understanding that by and large network speed/capacity is a proxy for quality. This is because if someone is using their connection to its limit, there will be contention for traffic, which leads to poor experience.