Chapter 7: Broadcast.com’s Streaming Odyssey: From CDNs to Yahoo’s Big Miss
In the early days of streaming, latency was a significant challenge, often resulting in delays and buffering that hindered the user experience. To address this, I set up servers around the USA and across the globe, effectively creating an Edge Network. This strategic distribution allowed us to route listeners and viewers to servers closer to their geographical locations, reducing latency and enhancing streaming performance. Distributing content out to this network of edge servers also reduced the number of streams coming out of our Dallas-based server farm, spreading the load. To my knowledge, this initiative marked the inception of the first Streaming Media Content Delivery Network (CDN).
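To make the routing idea concrete, here is a minimal sketch of proximity-based edge selection. The server names, coordinates, and great-circle distance check are my illustration, not the actual AudioNet routing logic, which also had to spread load across the farm as described above.

```python
import math

# Hypothetical edge-server table: names and coordinates are invented
# for illustration, not the actual AudioNet server locations.
EDGE_SERVERS = {
    "dallas":  (32.78, -96.80),
    "newyork": (40.71, -74.01),
    "london":  (51.51, -0.13),
    "tokyo":   (35.68, 139.69),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_edge(listener):
    """Pick the edge server geographically closest to the listener."""
    return min(EDGE_SERVERS, key=lambda name: haversine_km(listener, EDGE_SERVERS[name]))

print(nearest_edge((48.85, 2.35)))  # a listener in Paris -> "london"
```

In practice the same decision can be made by DNS, by a redirect from a central server, or by the player choosing from a server list; the sketch only shows the distance-based selection itself.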
Impact on Streaming Media
Implementing an Edge Network was a pivotal moment in the evolution of streaming media. By reducing latency and improving content delivery speeds, we were able to provide a more seamless and enjoyable experience for our audience. This innovation not only enhanced the quality of our streams but also set a precedent for future streaming platforms.
Approaching Live Events Like True Broadcasters: The Secret to Our Success
In the early days of AudioNet and later Broadcast.com, one of the most significant challenges we faced was the lack of robust streaming technology. The encoders and servers of the time were prone to frequent failure, often requiring manual reboots to fix issues. These glitches would cause the stream to drop, forcing users to manually reconnect to continue listening or viewing.
At this point, most of our competitors treated these issues as simple IT problems. That is, when the encoder or server glitched, they would reboot the system, hoping the issue would resolve itself. But we took a very different approach. We decided to treat our live events like professional broadcasters would. In traditional broadcasting, if an issue arose during a live event, it wasn't simply fixed with a reboot and forgotten. Broadcasters had redundant systems, failovers, and backup plans in place to ensure that the show went on, WITHOUT INTERRUPTION, no matter what.
We implemented multiple levels of redundancy at every stage of our streaming process—from the servers that hosted the streams to the encoders that handled the media and even the staff managing the events. Beyond that, for high-profile events, we engineered multiple data pathways with multiple telco connections to deal with bandwidth problems.
I should point out that it was not unusual to have bandwidth problems. The networks simply weren’t built for the kind of uptime people have grown used to in later years. This was especially true when we broadcast live events from remote locations. Whether it was the first live streaming webcast from inside China, the World Economic Forum in Davos, the funeral of Princess Diana, or product launches from Microsoft and Intel around the globe, getting redundant bandwidth in place in advance, as well as shipping and setting up the huge amount of equipment needed, was a logistical… challenge.
The late, great former Intel Chairman Andy Grove once commented on the equipment we set up for his event in China, saying, “You could launch a moon rocket from here!”
We ensured there was always someone on hand to monitor the streams, address issues proactively, and keep the audience engaged. Even if a server or encoder failed, our backup systems would kick in without interrupting the broadcast. In many cases, we had redundant event staff available to address problems immediately, just as you would see in a traditional broadcast environment.
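As a sketch of that “show must go on” idea, here is a minimal failover watchdog, assuming hypothetical health-check URLs on a primary and a backup stream host. The real setup relied on layered redundancy and people on hand rather than a single script, so treat this purely as an illustration of the cut-over logic.

```python
import time
import urllib.request

# Hypothetical endpoints; the real systems used redundant encoders, servers,
# staff, and telco paths rather than a single watchdog script.
PRIMARY = "http://primary.example.com/stream/health"
BACKUP = "http://backup.example.com/stream/health"

def is_alive(url, timeout=2):
    """True if the health endpoint answers with HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False

def watchdog(check_interval=5, max_failures=3):
    """Keep serving from PRIMARY; cut over to BACKUP after repeated failures."""
    active, failures = PRIMARY, 0
    while True:
        if is_alive(active):
            failures = 0
        else:
            failures += 1
            if active == PRIMARY and failures >= max_failures:
                active = BACKUP  # fail over so the audience never sees a drop
                failures = 0
        time.sleep(check_interval)
```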
This approach significantly set us apart from the competition. While others were treating streaming like a technical fix, we were treating it like an ongoing, live production. This commitment to excellence and to treating every event like it was broadcast-quality content paid off. Our reputation for running smooth, reliable live events grew rapidly, and for a long time, live events became our largest revenue source.
Our attention to detail and professionalism during live events created a sense of trust with our audience. They knew that when they tuned in, they would experience a high-quality broadcast, generally free from interruptions or technical hiccups. That trust was critical in helping us build a loyal user base, attract more advertisers, and eventually scale our operations to new heights.
We had long broadcast a major music festival on behalf of a large customer. One year, however, a… competitor… offered to do it for substantially less. My contact at the customer was apologetic, but I shrugged and asked him to call me afterwards and let me know how it went. Short version? My understanding was that it was a disaster. Among many issues, our… competitor didn’t even know to get releases from the artists. ‘Nuff said. The client called back, and we did it from that point forward.
The following image is an actual diagram from one of the simpler events we did, with only 56k and 28.8k encoders. Yes, remember, this was 1999:
Diagram: Simple Live Event from 1999 with two data paths and multiple encoders for redundancy.
I firmly believe that our decision to treat live-streaming events like traditional broadcasting—with full redundancy and professionalism—was a key factor in our early success. It wasn’t just about the technology; it was about delivering a consistent and professional experience to our users, which, in turn, helped establish Broadcast.com as a leader in the industry.
The Cost of Streaming in the 90s: A Key Challenge for AudioNet/Broadcast.com
At AudioNet and Broadcast.com, one of the most significant obstacles we faced was the cost of bandwidth. The technology was new, the infrastructure was limited, and the costs associated with broadcasting live events over the internet were hideously high. This made it one of the biggest challenges we had to overcome.
We broadcast a mind-numbing array of live events; it was honestly a blur. In sports alone, that included everything from college football games to the MLB World Series, the Soccer World Cup, and the NFL Super Bowl. We had to handle all aspects of content delivery, from encoding to streaming over telecommunications and satellite links, and in many cases we also had staff and equipment on site at the live event. The high cost of bandwidth, combined with the technical limitations of the period, made delivering high-quality streams a financial burden.
By 1999, it was clear that the infrastructure we were building was extensive and costly:
At the time, we were multi-homed across numerous 45 Mbps and 155 Mbps connections. As we were broadcasting major sports events, we were often looking at audiences in the hundreds of thousands. That was enormous at the time, but a far cry from today’s standards of nearly instant, unlimited-scale streaming.
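Some back-of-the-envelope math shows why those link sizes forced both multi-homing and edge distribution. Assuming each listener takes a full 28.8 kbps or 56 kbps stream and ignoring protocol overhead (my simplifying assumptions), a single connection tops out quickly:

```python
# Rough capacity of the link sizes mentioned above, assuming every listener
# takes a full 28.8 kbps or 56 kbps stream and ignoring protocol overhead.
for link_mbps in (45, 155):
    for stream_kbps in (28.8, 56):
        listeners = link_mbps * 1000 / stream_kbps
        print(f"{link_mbps} Mbps @ {stream_kbps} kbps/stream ~ {listeners:,.0f} concurrent listeners")
```

Even a 155 Mbps link tops out at a bit over 5,000 concurrent 28.8 kbps listeners, so reaching audiences in the hundreds of thousands meant aggregating many multi-homed links and pushing streams out to the edge servers described earlier.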
The costs associated with scaling these infrastructures were compounded by the fact that server uptime and bandwidth weren’t as reliable as they are today. Encoders would frequently crash, and we were forced to manage both redundancy and failover systems just to ensure the stream remained live, requiring constant technical support.
Interestingly, I was recently reminded that the RealServer client limit was just a number kept in a plain-text config file, which we could simply edit to match the network capacity of the Dell server rather than the license restriction. And, of course, our servers were Dell desktops sitting on top of shelves in racks (after the move to Taylor Street, when we had an actual ‘data center’ room and not just machines piled on top of cafeteria tables until they were bowed down in the middle).
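Purely as an illustration of how simple that change was, here is a sketch. The “MaxClients” field name, file layout, and numbers are hypothetical, since I am not reproducing the actual RealServer config syntax here; the point is only that the licensed cap was a number in an editable text file.

```python
import re

# Hypothetical sketch only: the "MaxClients" field name and file layout are
# invented; this is not the actual RealServer config syntax. The point is
# simply that the licensed client cap lived in an editable plain-text file.
def raise_client_limit(config_text: str, new_limit: int) -> str:
    """Replace the numeric client-limit value in a plain-text config."""
    return re.sub(r"(?m)^(MaxClients\s*)\d+", rf"\g<1>{new_limit}", config_text)

example = "Port 7070\nMaxClients 60\nLogLevel 2\n"
print(raise_client_limit(example, 2000))
```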
As I’ll discuss in a moment, one of the big reasons for reliability problems was instability in the RealNetworks platform.
Evolving Our Codec Standards and Platform: A Strategic Shift in Technology
In 1995, the landscape of internet streaming was still in its infancy, and the technology available to support it was often cumbersome and expensive. Competing codecs were in the market, some requiring proprietary hardware that drove up costs. The largest internet sites relied on high-end UNIX servers, and to run those servers, you needed specialized UNIX administrators—a niche skill set that was costly and hard to find.
However, at AudioNet, I made a strategic decision that would have a lasting impact on our business: I standardized us on PCs running Windows and Windows NT instead of UNIX systems. This decision was driven by the need to cut costs, given our threadbare startup funding, and to streamline operations. By choosing Intel-based PCs (later standardized on Dell OptiPlex PCs), we could run Windows NT Server, a much more affordable and user-friendly platform compared to the high-end UNIX systems that most other streaming services were using. Windows NT was point-and-click, which meant we could hire less expensive techs to manage the servers and encoders rather than relying on highly specialized UNIX admins.
This decision not only made our operations more cost-effective but also opened the door to greater scalability and ease of maintenance.
There’s a story there. We had been doing live Intel product release webcasts for a while. Eventually, word got up the food chain there, and people started asking who the heck AudioNet was. My contact said he’d like to bring some Intel executives to see our shop.
Now, you need to understand something. By that time, when we were still shoehorned into Mark’s former bachelor pad at 2929 Elm Street, we were not exactly a picture of… well… let me put it this way: we didn’t have racks. As I mentioned above, we had cafeteria tables laden with so many Dell PCs that they bowed in the middle. Ethernet cables were strewn everywhere and dangled from the ceiling like so many jungle vines. Nothing was static, either; sometimes we had to reconfigure every single PC, including the secretary’s, for use in a broadcast. It was kind of a madhouse.
There were times, later, when we were also using Windows Media, that we had to convert every machine to WM or Real overnight for some event or other, and then back again.
Needless to say, I couldn’t exactly say… “no” to Intel. On the other hand, I figured that as soon as they walked in the door, they’d probably drop us and run away screaming. So to speak.
Instead, the look of wonder on their faces was a shock. Their question? “You mean, you run EVERYTHING on Intel-based machines?!”
The result? Long story, shorter: Intel’s involvement was so significant that they became our second capital investor (after Motorola), recognizing the potential of the Intel-based Windows NT platform for internet broadcasting.
More than that, Intel recognized that AudioNet was creating a market among consumers who would want faster PCs so they could consume the content that we were streaming. PCs with Intel Inside. Intel wanted to sell more chips.
Strange Bedfellows: Microsoft & Broadcast.com
The strangest thing at the time was the fact that Microsoft became a huge supporter. Before they began helping us, I had a constant worry that they could easily throw a ton of money at their own efforts and simply crush us, that they were ultimately going to be a huge competitor. Instead, they evidently saw us as a nimble startup they could leverage at minimal investment to make their Windows Media platform highly competitive with Real Media. Apparently, after Rob Glaser left Microsoft, there was no love lost, or at least that was what folks like us thought.
Codec Evolution: From RealAudio to the Dual-Platform Strategy
When it came to selecting a codec, we initially relied on RealAudio—one of the first widely used codecs for streaming audio. RealAudio worked well for basic audio streaming, but it wasn’t without its issues. The codec had originally been written for UNIX platforms, and when it was ported to Windows NT, it suffered from consistent memory leaks that caused reliability problems. Despite these challenges, we continued to use RealAudio because, at the time, it was one of the few viable options for internet audio streaming at scale.
However, as we continued to grow our operations and attract more attention and press, RealNetworks began to compete directly with us for internet broadcasting business. Led by Rob Glaser, RealNetworks sought to capitalize on the growing demand. But despite their efforts, in the long run, RealAudio’s codec wasn’t able to deliver the stability and reliability needed for large-scale streaming.
Microsoft recognized the growing potential of internet broadcasting. They were eager to support AudioNet in ways that RealNetworks couldn’t, and they introduced Windows Media Player and Windows Media Audio (WMA), offering a competing solution for internet streaming. Unlike RealAudio, Microsoft’s Windows Media codec was built to run natively on Windows NT, providing a much smoother experience and eliminating the memory leak issues we had faced.
As a result, we began to use both RealAudio and Microsoft’s Windows Media codec for our broadcasts. Microsoft’s active involvement was critical to our success. Not only did they provide the tools we needed to broadcast reliably, but they also provided significantly better technical support for their platform. Their active support gave us leverage against Real. By the way, I don’t know when, or if, Real ever fixed their memory leak.
The Impact of Codec and Platform Evolution on Streaming
The evolution of our codec standards and platform played a central role in our ability to scale and deliver reliable content. By standardizing on Intel-based Dell PCs running Windows NT, we created a more cost-effective and scalable system than the high-end UNIX-based solutions used by most competitors. Furthermore, our decision to adopt both RealAudio and Windows Media as our streaming codecs ensured that we were able to meet the needs of our growing audience while avoiding the technical limitations of any single solution.
Ultimately, the combination of Intel’s hardware, Microsoft’s software, and RealNetworks’ codec was instrumental in propelling us to new heights. These partnerships not only helped us solve our technical issues but also positioned us as the world’s leading internet broadcaster, delivering high-quality, reliable streaming to an ever-expanding audience.
The Yahoo Acquisition & The Missed Opportunity
In 1999, Yahoo acquired Broadcast.com for $5.7 billion. It was one of the largest tech acquisitions of the dot-com boom. The vision was to turn Yahoo into a media giant, leveraging our streaming tech.
However, several factors contributed to the deal's failure. Over the years, I often heard it said that “Yahoo is where startups go to die.” It happened with Broadcast.com, GeoCities, Flickr, and even Tumblr.
Streaming’s Evolution: What’s Changed, What Hasn’t
The journey from AudioNet to Broadcast.com was about more than just technology—it was about reimagining how live content could be delivered on the internet. We pioneered real-time streaming, built one of the first CDNs, and approached live events with the redundancy and professionalism of traditional broadcasters. Our relentless focus on reliability and scale set the foundation for the streaming industry that thrives today.
Yet, despite nearly three decades of advancements—better compression, fiber networks, and cloud computing—some challenges remain. Latency still disrupts live sports, bandwidth costs remain high, and true real-time streaming remains an unsolved problem.
Streaming has transformed the way the world consumes content, but the mission isn’t over. The next frontier isn’t just about higher resolution or bigger libraries—it’s about eliminating delays and creating a seamless, real-time experience for audiences everywhere.
Why This Matters Today
Streaming has become the dominant way people consume content, but the underlying technology still hasn’t fully solved the latency problem. In most cases, it doesn’t matter—no one notices a delay when watching Netflix.
But in live sports, real-time betting, and interactive broadcasts, latency kills the experience. This is the challenge that still needs solving—one that we’re working on today.
What are your earliest memories of online streaming? Back in the day, did you listen to AudioNet.com or watch broadcast.com programs and events? Did you watch YouTube in its early days? Let’s talk about how far we’ve come—and where we’re going.
© Patrick Seaman, 2025. All Rights Reserved.
#Streaming #Broadcastcom #YouTube #Netflix #Smartphones #StreamingHistory #TechInnovation #LiveEvents #CDN #MediaEvolution