Chapter 11: From Broadcast.com to Today: Why Live Streaming Still Faces a 30-Second Delay

Streaming technology in the 1990s was mostly either packet-based UDP (User Datagram Protocol) or, fairly rarely (though not for lack of trying), multicast. Both methods break a signal into pieces (packets) and send them to a destination server or user. Packets can end up travelling via different paths and arrive out of order, late, or not at all. Reassembly and replay take time, introducing latency of 30 seconds or even longer.
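That reassembly step is the heart of the delay. As a rough illustration (not any particular player’s implementation), here is a minimal Python sketch of a jitter buffer that holds out-of-order packets until they can be released in sequence:

```python
import heapq

def reassemble(packets):
    """Reorder packets by sequence number using a small jitter buffer.

    `packets` is a list of (seq, payload) tuples in arrival order.
    Holding packets in the buffer models the delay a player adds
    before playback. (Illustrative sketch; real players use
    time-based buffers with loss deadlines.)
    """
    buffer, output, next_seq = [], [], 0
    for seq, payload in packets:
        heapq.heappush(buffer, (seq, payload))
        # Release everything that is now contiguous with what we've played.
        while buffer and buffer[0][0] == next_seq:
            output.append(heapq.heappop(buffer)[1])
            next_seq += 1
    # Flush whatever arrived late but is still playable.
    while buffer:
        seq, payload = heapq.heappop(buffer)
        if seq >= next_seq:  # skip duplicates of already-played data
            output.append(payload)
            next_seq = seq + 1
    return output

# Packets 0..4 arrive out of order; the buffer restores playback order.
arrived = [(0, "a"), (2, "c"), (1, "b"), (4, "e"), (3, "d")]
print(reassemble(arrived))  # -> ['a', 'b', 'c', 'd', 'e']
```

Every packet the buffer waits for is time the listener spends behind the live event, which is why deeper buffers mean higher latency.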

If UDP was blocked by a firewall or network configuration (common in corporate or restrictive environments), RealNetworks’ RealPlayer would fall back to TCP streaming, usually on port 80 (HTTP), a technique known as HTTP tunneling. Although TCP guaranteed packet delivery, retransmissions and acknowledgments added overhead and increased latency even further.
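That fallback chain can be sketched as a simple decision function. The probe callables below are hypothetical stand-ins for real connectivity tests, not RealPlayer’s actual API:

```python
def negotiate_transport(udp_probe, tcp_probe):
    """Mimic the 1990s-era fallback chain: try UDP first, then fall
    back to TCP on port 80 (HTTP tunneling) if the firewall blocks it.

    `udp_probe` and `tcp_probe` are callables returning True on a
    successful connection attempt (hypothetical placeholders).
    """
    if udp_probe():
        return "udp"          # lowest latency, but often firewalled
    if tcp_probe():
        return "http-tunnel"  # reliable, but retransmissions add delay
    raise ConnectionError("no viable transport")

# A corporate firewall that drops UDP but allows outbound HTTP:
print(negotiate_transport(lambda: False, lambda: True))  # http-tunnel
```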

The Multicast Backbone (MBone), established in 1992, was an experimental network designed to carry IP multicast traffic over the Internet. Multicast can experience higher latency than unicast UDP because network devices must do additional processing to manage multicast group memberships and distribute packets to multiple recipients.

For most applications, this doesn’t matter. Today’s Content Delivery Networks (CDNs) and edge caching of popular content can deliver on-demand programming almost instantaneously. Think Netflix. And, to be perfectly fair, does it matter if Bob in Dallas gets his on-demand program delivered slower than Sally in Seattle? It’s not live, after all.

And let’s talk about live. Does Bob know or care that Sally’s ‘live’ program is ahead of his? After all, it’s not like they are actually at the live event and would notice the difference.

For live events, whether it’s a rocket launch, a political event, or a football game, live streams today are still delivered using packet-based protocols. The packets get reassembled at the user’s device, and it still often takes 30 seconds or longer for those packets to get put back together and the program to play. If you’re not actually there, this doesn’t matter.

This lag might not matter when binge-watching a show, but for live events, where fans crave instant engagement, latency is a deal-breaker. Sports betting, for instance, hinges on split-second decisions. A 30-second delay can mean the difference between a winning bet and a useless one. Similarly, fantasy sports players and analysts need real-time player stats as the action unfolds, not 30 seconds later when the game has already moved on.

Of course, if you’re at the live event, why would you even want to watch or listen to a stream of it? In many cases, you wouldn’t. But suppose you’re at a college or pro football game with eighty thousand people in the stands, and you want to listen to the local radio station’s play-by-play color commentary. Modern stadium architectures often impede AM and FM signal reception.

Beyond commentary, fans expect real-time access to in-depth stats, instant replay angles on demand, and even AI-generated insights piped directly to their devices. The expectation isn't just to watch the game but to experience it with enriched data that keeps up with the on-field action.

Also, who wants to listen to a stream that is thirty seconds or longer out of sync with the action taking place in front of you? I remember one of the first college games streamed and the wildly enthusiastic reaction from an alumnus listening in Japan. He didn’t care that the stream had latency; he was just delighted to be able to listen at all!

In recent years, many stadiums have attempted to deploy Wi-Fi infrastructures and partner with tech companies to enable local, real-time streaming: think instant play-by-play commentary or live replays synced with the action. Yet, these efforts often stumble. Network congestion remains a beast: during the 2024 NFL season, some venues saw Wi-Fi grind under 20 TB of data per game, leaving streams buffering at critical moments.

Interference from personal hotspots and IoT gadgets further muddies the signal, pushing latency beyond the 30-second mark fans dread. Add in clunky authentication, like minute-long captive portal logins at places like SoFi Stadium, and spotty coverage in older venues with concrete jungles, and you’ve got a recipe for fan frustration. Far from seamless, these Wi-Fi woes mean the roar of the crowd often outpaces the stream in your ear.

The impact of these connectivity failures extends beyond just play-by-play audio. Major sportsbooks have invested in live in-game betting, but delays caused by overloaded networks mean odds can shift before a fan’s wager even registers. Sports broadcasters and leagues have experimented with interactive elements, such as multiple camera angles or real-time analytics, but without low-latency infrastructure, these features fall flat.

RealNetworks used UDP-based streaming for latency-sensitive services in the 1990s, but today’s standard on-demand streaming platforms rely primarily on TCP-based HTTP streaming to ensure smooth and reliable playback. UDP remains prominent primarily in real-time, interactive, or latency-critical applications rather than conventional on-demand services.

In the world of streaming, technological advancements have brought us 4K video, real-time gaming, and even immersive AR experiences. Yet, a stubborn issue persists: latency. Live streams often lag 30 seconds or more behind the actual event. For most content, this delay is negligible. But for live sports, betting, and time-sensitive events, it’s a critical flaw.

It’s not just about listening or watching; it’s about engagement. The rise of interactive streaming experiences, such as Twitch-style watch parties for sports, relies on real-time delivery. When a goal is scored or a touchdown is made, fans expect to react together, not on a 30-second delay. Streaming technology needs to bridge this gap if it hopes to meet the demands of modern sports and entertainment audiences.

This article explores not only the causes of this latency but also examines which platforms use what technologies to address (or not address) this challenge, and why these decisions are made.

How Streaming Platforms Handle Latency

At broadcast.com, as previously discussed, we used both RealNetworks’ player and server technology as well as Microsoft’s. We experimented with multicast quite a bit, hoping it would help solve one of our biggest problems, streaming costs, and, secondarily, build a robust distribution infrastructure. In short, however, it had too many problems and never delivered on the promise.

As for lag time, keep in mind this was a time when most people were still on dial-up modems. If you were a geek like Mark or myself, you had an ISDN line with an expensive router. Our biggest streams were the occasional 300 kbps, but most were 28.8 and 56 kbps streams. Still, lag time back then was not terribly different than it is now: just smaller streams being reassembled instead of today’s big, fat ones. The quality, though, was often poor and nothing remotely like today’s 4K streams.

Different platforms face unique challenges with latency, and their choice of technology often reflects their priorities, whether it’s global scalability, low latency, or user experience.

YouTube Live

YouTube Live employs HTTP-based streaming protocols, primarily HTTP Live Streaming (HLS) and Dynamic Adaptive Streaming over HTTP (DASH), to deliver content to a wide array of devices. These protocols are designed for scalability and reliability, ensuring that streams function seamlessly across various platforms. However, their chunk-based architecture can introduce latency ranging from 10 to 30 seconds, which may affect real-time viewer engagement. (https://support.google.com/youtube/answer/7444635)

To address latency concerns, YouTube has implemented Low-Latency modes for both HLS and DASH in specific live-streaming scenarios. These enhancements have successfully reduced delays to approximately 5 to 7 seconds, thereby improving the interactivity between streamers and their audiences.
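A back-of-the-envelope model shows where those numbers come from. The function below is illustrative, not YouTube’s actual player logic; it assumes a player waits for a fixed number of segments before starting playback:

```python
def min_hls_latency(segment_seconds, buffered_segments, encode_delay=1.0):
    """Rough lower bound on glass-to-glass HLS latency (illustrative).

    A player typically will not start until `buffered_segments` full
    segments have downloaded, and a segment cannot be fetched until
    the encoder has finished producing it.
    """
    return encode_delay + segment_seconds * buffered_segments

# Classic chunked HLS: 6-second segments, 3 buffered -> ~19 s behind live.
print(min_hls_latency(6, 3))  # 19.0

# Low-latency mode with 2-second parts and 2 buffered -> ~5 s,
# in line with the 5-7 second figure cited above.
print(min_hls_latency(2, 2))  # 5.0
```

The model makes the trade-off explicit: shrinking segments cuts latency, but it also shrinks the cushion available to ride out network hiccups.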

If you’re there, though, at a game, 5-7 seconds is a loooong time (much less 30 or more):

  • In five seconds, an NFL wide receiver can sprint 50 yards downfield, turning a short pass into a dramatic play.
  • In five to seven seconds the ball snaps, the quarterback scrambles, evades pressure, and launches a deep pass into the end zone.
  • Five seconds can decide a stolen base, from the pitcher's release, the catcher's throw, to the runner sliding safely under the tag.
  • In seven seconds, an entire fast break sequence can take place: steal, pass, sprint, and alley-oop dunk.

Who would want to be at the game watching, while listening to the play-by-play, long after the action is over and the next play has already started?

Despite today’s advancements, low-latency streaming is not yet the default setting across the entire platform, indicating ongoing efforts to balance latency reduction with overall stream stability and quality. (https://support.google.com/youtube/answer/7444635)

Summary:

  • Technology Used: HLS (HTTP Live Streaming), DASH (Dynamic Adaptive Streaming over HTTP)
  • Why: YouTube focuses on scale and reliability. By relying on HTTP-based protocols, YouTube ensures its streams work across a broad range of devices. However, these protocols, with their chunk-based architecture, inherently introduce delays up to 30 seconds.
  • Latency Solution Efforts: YouTube has introduced Low-Latency HLS and DASH for certain live-streaming events, reducing delays to as low as 5-7 seconds, but this isn’t yet standard across the platform.

Twitch

Twitch employs Low-Latency HTTP Live Streaming (HLS) to facilitate real-time interaction between streamers and viewers, achieving delays as short as 2-3 seconds. This is crucial for maintaining the platform's interactive experience, enabling timely audience engagement during live broadcasts. (https://blog.twitch.tv/en/2021/10/25/low-latency-high-reach-creating-an-unparalleled-live-video-streaming-network-at-twitch/)

However, prioritizing low latency can lead to increased buffering, particularly for users with unstable or slower internet connections. The reduced buffering window inherent in low-latency streaming allows less time to compensate for network inconsistencies, potentially resulting in playback interruptions. To mitigate this, Twitch offers streamers the option to disable low-latency mode, thereby increasing the buffer duration to enhance playback stability for viewers with suboptimal connections. (https://help.twitch.tv/s/article/low-latency-video)

Summary:

  • Technology Used: Low-Latency HLS
  • Why: Twitch’s focus is on real-time engagement between streamers and viewers, making latency a critical issue. By implementing Low-Latency HLS, Twitch has reduced delays to as little as 2-3 seconds in many cases.
  • Challenges: The trade-off is in buffering. Users with poor connections may experience more interruptions as the platform prioritizes low delay over smooth playback.

Zoom

Zoom utilizes Web Real-Time Communication (WebRTC) technology to facilitate its video conferencing services. WebRTC is an open-source framework that enables real-time audio, video, and data sharing directly between browsers and devices without the need for additional plugins. This technology is designed to support low-latency communication, which is essential for applications like video conferencing where timely interaction is critical. (https://www.wowza.com/blog/what-is-webrtc)

However, WebRTC's peer-to-peer architecture presents scalability challenges. While it excels in facilitating direct communication between a limited number of participants, scaling this model to accommodate large audiences, such as in webinars or live broadcasts, can be problematic. The direct connection approach can strain network resources and lead to performance issues as the number of participants increases. (https://www.wowza.com/blog/what-is-webrtc)

To overcome these limitations, platforms like Zoom implement additional infrastructure, such as Selective Forwarding Units (SFUs), to manage media streams more efficiently. SFUs act as intermediaries that receive media streams from participants and selectively forward them to others, reducing the number of direct connections each participant must manage. This approach helps maintain low latency while improving scalability, allowing platforms to support larger meetings and webinars effectively. (https://www.liveswitch.io/blog/how-to-successfully-scale-your-webrtc-application-in-2021)
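The difference an SFU makes can be shown with simple connection counting. This is an illustrative model, not Zoom’s actual topology:

```python
def per_client_streams(n, topology):
    """Media streams each client must manage in an n-way call
    (illustrative model).

    'mesh' -- full peer-to-peer WebRTC: send to and receive from
              every other participant.
    'sfu'  -- Selective Forwarding Unit: one upload to the SFU,
              one download per other participant.
    """
    if topology == "mesh":
        return (n - 1) + (n - 1)  # uploads + downloads
    if topology == "sfu":
        return 1 + (n - 1)
    raise ValueError(f"unknown topology: {topology}")

print(per_client_streams(10, "mesh"))  # 18
print(per_client_streams(10, "sfu"))   # 10
```

The win is biggest on the upload side, usually the scarcest resource on home connections: nine simultaneous uploads in a ten-person mesh collapse to a single upload through the SFU.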

Summary:

  • Technology Used: WebRTC (Web Real-Time Communication)
  • Why: For Zoom, low latency is essential. Its reliance on WebRTC, a peer-to-peer communication protocol, allows near-instantaneous video and audio delivery for meetings.
  • Challenges: WebRTC isn’t scalable for massive audiences due to its reliance on direct connections, making it unsuitable for platforms like YouTube or Netflix.

Netflix

Netflix employs Dynamic Adaptive Streaming over HTTP (MPEG-DASH) in conjunction with the Transmission Control Protocol (TCP) to deliver its vast library of on-demand content. MPEG-DASH is an adaptive bitrate streaming technique that enables high-quality streaming of media content over the Internet delivered from conventional HTTP web servers. This combination ensures that each data packet is transmitted accurately, thereby minimizing buffering and optimizing visual quality for viewers. (https://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP)

In the context of on-demand streaming, latency, the delay between a user's action and the response, is less critical compared to live streaming scenarios. A delay of even 30 seconds is inconsequential for Netflix users, as the content is pre-recorded and does not require real-time interaction. Consequently, Netflix prioritizes reliability and high-quality playback over low latency, ensuring a seamless viewing experience for its subscribers.
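The adaptive part of MPEG-DASH boils down to picking the highest rung of a bitrate ladder that measured throughput can sustain. The ladder and safety margin below are hypothetical values for illustration, not Netflix’s actual encoding ladder:

```python
def pick_bitrate(throughput_kbps, ladder=(235, 750, 1750, 4300, 15000),
                 safety=0.8):
    """Pick the highest bitrate rung that fits within a safety margin
    of measured throughput -- the core idea behind adaptive bitrate
    selection in DASH players. Ladder values are hypothetical.
    """
    budget = throughput_kbps * safety
    chosen = ladder[0]  # always keep the lowest rung playable
    for rung in ladder:
        if rung <= budget:
            chosen = rung
    return chosen

print(pick_bitrate(6000))  # 4300 (top rung needs more headroom)
print(pick_bitrate(500))   # 235
```

Because the player re-evaluates this choice every few segments, quality degrades gracefully instead of stalling, which is exactly the reliability-over-latency trade Netflix wants.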

Summary:

  • Technology Used: DASH with TCP
  • Why: Netflix prioritizes reliability and quality over low latency because it specializes in on-demand content. Using DASH with TCP ensures every packet of data is delivered accurately, minimizing buffering and optimizing visual quality.
  • Latency Relevance: For Netflix, a 30-second delay is irrelevant since their content isn’t live.

Sports Streaming Platforms (e.g., DAZN, ESPN+)

Sports streaming platforms, such as DAZN and ESPN+, prioritize delivering live content with minimal delay to enhance viewer engagement and maintain the excitement of real-time events. To achieve this, they employ technologies like Low-Latency HTTP Live Streaming (LL-HLS), Dynamic Adaptive Streaming over HTTP (DASH), and proprietary protocols designed to reduce latency. For instance, DAZN has expanded its partnership with LTN to deliver low-latency live streaming channels to betting locations, utilizing LTN's managed IP network that offers 99.999% reliability and sub-300ms latency. (https://www.sportsvideo.org/2024/12/10/dazn-ltn-expand-partnership-to-deliver-11-low-latency-live-streaming-channels-to-betting-locations/)

Despite these technological advancements, maintaining consistent low-latency streams for millions of simultaneous viewers presents significant technical and economic challenges. Scaling infrastructure to handle high traffic volumes without compromising stream quality requires substantial investment in robust content delivery networks (CDNs) and efficient encoding processes. Additionally, varying network conditions across different regions can lead to buffering or reduced video quality, impacting the overall viewer experience. (https://www.sportspro.com/insights/gcore-cdn-tech-streaming-ultra-low-latency/)

Summary:

  • Technology Used: Low-Latency HLS, DASH, Proprietary Protocols
  • Why: Sports streaming platforms face the dual challenge of maintaining high-quality streams while minimizing delay. Many use Low-Latency HLS or proprietary technologies designed to reduce chunk sizes and optimize CDN delivery.
  • Challenges: Achieving consistency for millions of viewers watching simultaneously remains a technical and economic hurdle.

Real-Time Betting Platforms

Real-time betting platforms prioritize ultra-low latency to ensure fair play and maintain the integrity of live wagering. To achieve this, they commonly utilize the Real-Time Transport Protocol (RTP) and Web Real-Time Communication (WebRTC). RTP facilitates the rapid transmission of audio and video data, while WebRTC enables peer-to-peer communication directly between browsers, minimizing delay. These technologies are essential for delivering the immediacy required in live betting scenarios. (https://www.wowza.com/blog/streaming-protocols)
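RTP’s suitability for low latency comes from its lightweight fixed header, which carries exactly the fields a receiver needs to detect loss and schedule playout. A minimal parser for that 12-byte header (as laid out in RFC 3550) might look like this sketch:

```python
import struct

def parse_rtp_header(packet):
    """Parse the fixed 12-byte RTP header (RFC 3550) into the fields a
    receiver uses to reorder and time media.
    """
    if len(packet) < 12:
        raise ValueError("truncated RTP packet")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,        # RTP version (2 for RFC 3550)
        "payload_type": b1 & 0x7F, # identifies the codec in use
        "sequence": seq,           # detects loss and reordering
        "timestamp": ts,           # schedules playout timing
        "ssrc": ssrc,              # identifies the media source
    }

# Build a sample header: version 2, payload type 96, seq 4660, ts 1, ssrc 42.
pkt = struct.pack("!BBHII", 0x80, 96, 4660, 1, 42)
print(parse_rtp_header(pkt)["sequence"])  # 4660
```

With sequence numbers and timestamps available per packet, a receiver can start playing immediately and simply conceal the occasional loss, rather than waiting on TCP-style retransmission.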

However, scaling these technologies to accommodate large audiences presents significant challenges. WebRTC, for instance, was originally designed for direct communication between a limited number of participants, and scaling it for mass streaming requires complex infrastructure and substantial computational resources. This complexity can lead to increased costs and potential performance issues, making it less suitable for mainstream streaming platforms that serve vast audiences.

To overcome these limitations, some platforms are exploring hybrid approaches that combine WebRTC with other streaming technologies or implementing advanced architectures to enhance scalability. Despite these efforts, achieving the necessary scale for widespread adoption in mainstream streaming remains a complex and resource-intensive endeavor. (https://blog.phenixrts.com/a-better-way-to-scale-webrtc)

Summary:

  • Technology Used: RTP (Real-Time Transport Protocol), WebRTC
  • Why: Platforms that support live betting prioritize ultra-low latency to ensure fair play. RTP and WebRTC are commonly used due to their ability to transmit data with minimal delay.
  • Challenges: These technologies struggle to scale beyond small, dedicated audiences, making them unsuitable for mainstream streaming.

Why Platforms Make These Choices

Streaming platforms carefully select their streaming technologies based on specific priorities and operational considerations:

Scalability vs. Real-Time Needs

Platforms like YouTube and Netflix prioritize scalability and reliability to serve a global audience. Their primary focus is on delivering high-quality content efficiently, where real-time interaction is not essential. Consequently, they utilize protocols such as HTTP Live Streaming (HLS) and Dynamic Adaptive Streaming over HTTP (DASH), which, while introducing higher latency, offer robust performance and broad compatibility across devices. (https://www.wowza.com/blog/streaming-protocols-latency)

Interactivity and Community Engagement

Platforms such as Twitch, Zoom, and real-time betting services cater to users who require immediate feedback and real-time interaction. To facilitate this, they employ low-latency streaming protocols like Web Real-Time Communication (WebRTC) and Real-Time Transport Protocol (RTP). These protocols minimize delay, enhancing user engagement and interactivity. However, achieving low latency often involves trade-offs in terms of reliability and compatibility, as these protocols may be more susceptible to network variations and may not be supported across all devices. (https://www.gumlet.com/learn/webrtc-vs-hls/)

Infrastructure Costs

Implementing low-latency protocols such as WebRTC and RTP necessitates substantial investment in infrastructure to ensure scalability and performance. The peer-to-peer architecture of WebRTC, for example, can lead to increased complexity and resource consumption when scaling to large audiences. For many platforms, especially those serving vast user bases, the costs associated with deploying and maintaining such technologies on a global scale may outweigh the benefits of reduced latency. As a result, they opt for more scalable solutions that, while introducing higher latency, offer greater reliability and cost-effectiveness. (https://cloudinary.com/guides/live-streaming-video/low-latency-hls-ll-hls-cmaf-and-webrtc-which-is-best)

Emerging Solutions: Who’s Leading the Way?

To address the challenge of balancing scalability with real-time performance, streaming platforms are investing in innovative technologies aimed at reducing latency and enhancing user experiences.

Edge Computing

Platforms such as Twitch and ESPN+ are adopting edge computing to process data closer to viewers, thereby reducing the physical distance content must travel. By decentralizing data processing and bringing it to the network's edge, these platforms can minimize latency introduced by traditional Content Delivery Network (CDN) routing. This approach not only reduces latency but also improves bandwidth efficiency and overall performance, offering a pathway toward real-time streaming at scale. (https://www.muvi.com/blogs/role-of-edge-computing-in-video-streaming/)

AI-Driven Optimization

Streaming services are increasingly leveraging artificial intelligence (AI) to optimize various aspects of content delivery. For instance, platforms like YouTube utilize AI to adjust streaming quality in real-time based on the viewer's internet connection, ensuring smooth playback even on low-bandwidth connections. Additionally, AI is being explored to predict and mitigate network congestion, helping to deliver low-latency streams without compromising quality. (https://www.forbes.com/sites/neilsahota/2024/03/18/streaming-into-the-future-how-ai-is-reshaping-entertainment/)

Proprietary Protocols

Some sports streaming platforms, such as DAZN, are experimenting with proprietary protocols that combine elements of existing streaming technologies to balance low latency with reliability. For example, DAZN has expanded its partnership with LTN to deliver low-latency live streaming channels to betting locations, utilizing LTN's managed IP network that offers sub-300ms latency. While promising, these solutions are often confined within specific ecosystems, limiting broader adoption. (https://www.forbes.com/sites/neilsahota/2024/03/18/streaming-into-the-future-how-ai-is-reshaping-entertainment/)

5G Networks

The advent of 5G technology presents significant opportunities for mobile streaming platforms. With its ultra-low latency capabilities, 5G enables real-time streaming experiences that were previously unattainable on mobile devices. Platforms like Twitch and YouTube Live are piloting 5G-powered real-time streams in select regions, aiming to enhance mobile-first applications such as augmented reality (AR) and live sports betting. This development could be a game-changer, providing the necessary infrastructure to support high-quality, low-latency streaming on the go. (https://www.verizon.com/business/resources/articles/s/how-edge-computing-can-help-improve-audio-and-video-streaming-technology/)

In summary, by investing in edge computing, AI-driven optimization, proprietary protocols, and 5G networks, streaming platforms are taking bold steps to bridge the gap between scalability and real-time performance, ultimately enhancing the viewing experience for their audiences.

Conclusion: The Latency Balancing Act

Latency remains one of the most persistent challenges in the streaming industry. While some platforms have made strides toward reducing delays, the trade-offs between scalability, reliability, and cost continue to hinder the widespread adoption of ultra-low-latency solutions.

As new technologies like edge computing, AI-driven optimization, and 5G gain traction, the gap is closing, but at a slow pace. Solving the latency problem at scale will require collaboration across industries and significant investment in infrastructure. Until then, platforms will continue to navigate the balancing act between what’s possible and what’s practical.

What do you think will drive the next big leap in low-latency streaming? Let’s discuss!

© Patrick Seaman, 2025. All Rights Reserved.

#StreamingLatency #LiveStreaming #RealTimeStreaming #LowLatency #StreamingTechnology #Twitch #YouTubeLive #WebRTC #HLS #EdgeComputing #5GStreaming #SportsStreaming #RealTimeEngagement #ContentDeliveryNetworks #CDN #FutureOfStreaming #LiveSports #InteractiveStreaming #StreamingInnovation #TechInfrastructure

References

1. https://support.google.com/youtube/answer/7444635
   Official YouTube Help page titled “Manage live stream latency,” detailing how YouTube Live handles latency with options like Normal, Low, and Ultra-low latency modes for HLS and DASH streaming. Referenced twice in the YouTube Live section.

2. https://blog.twitch.tv/en/2021/10/25/low-latency-high-reach-creating-an-unparalleled-live-video-streaming-network-at-twitch/
   Twitch blog post discussing their low-latency streaming network, emphasizing Low-Latency HLS and its impact on real-time engagement.

3. https://help.twitch.tv/s/article/low-latency-video
   Twitch Help article explaining how to enable low-latency mode, its benefits (2-3 second delays), and trade-offs like potential buffering for viewers with poor connections.

4. https://www.wowza.com/blog/what-is-webrtc
   Wowza blog post explaining WebRTC, its use in real-time communication (e.g., Zoom), and its low-latency capabilities. Referenced twice in the Zoom section.

5. https://www.liveswitch.io/blog/how-to-successfully-scale-your-webrtc-application-in-2021
   LiveSwitch blog post discussing scaling WebRTC with Selective Forwarding Units (SFUs), relevant to Zoom’s infrastructure enhancements.

6. https://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP
   Wikipedia entry on MPEG-DASH, detailing its use in Netflix for adaptive bitrate streaming over TCP, focusing on reliability over latency.

7. https://www.sportsvideo.org/2024/12/10/dazn-ltn-expand-partnership-to-deliver-11-low-latency-live-streaming-channels-to-betting-locations/
   Sports Video Group article on DAZN and LTN’s partnership for low-latency streaming (sub-300ms) using proprietary protocols and managed IP networks.

8. https://www.sportspro.com/insights/gcore-cdn-tech-streaming-ultra-low-latency/
   SportsPro article on Gcore’s CDN technology for ultra-low-latency streaming, highlighting challenges in scaling for sports platforms like DAZN and ESPN+.

9. https://blog.phenixrts.com/a-better-way-to-scale-webrtc
   Phenix blog post discussing improved methods to scale WebRTC for real-time betting platforms, addressing its limitations for large audiences.

10. https://www.wowza.com/blog/streaming-protocols-latency
    Wowza blog post comparing streaming protocols (HLS, DASH, WebRTC, RTP) and their latency implications, explaining platform choices for scalability vs. real-time needs.

11. https://www.gumlet.com/learn/webrtc-vs-hls/
    Gumlet article comparing WebRTC and HLS, highlighting trade-offs between low latency (WebRTC) and compatibility/reliability (HLS) for interactive platforms.

12. https://cloudinary.com/guides/live-streaming-video/low-latency-hls-ll-hls-cmaf-and-webrtc-which-is-best
    Cloudinary guide comparing LL-HLS, CMAF, and WebRTC, discussing infrastructure costs and why platforms opt for scalable, higher-latency solutions.

13. https://www.muvi.com/blogs/role-of-edge-computing-in-video-streaming/
    Muvi blog post on edge computing’s role in reducing latency for platforms like Twitch and ESPN+ by processing data closer to viewers.

14. https://www.forbes.com/sites/neilsahota/2024/03/18/streaming-into-the-future-how-ai-is-reshaping-entertainment/
    Forbes article by Neil Sahota on AI’s role in streaming optimization, including real-time quality adjustments and congestion prediction. Referenced twice, for AI and proprietary protocols.

15. https://www.verizon.com/business/resources/articles/s/how-edge-computing-can-help-improve-audio-and-video-streaming-technology/
    Verizon Business article on edge computing and 5G enhancing streaming latency, relevant to Twitch and YouTube Live’s 5G pilots.
