The capacity gap is here


This is a guest post by Jeff Gilbert, Vice President of Strategy and Business Development, Content Providers, Qwilt.

This week brought an extraordinary confirmation that the biggest challenge facing OTT content delivery is not speed, as is commonly thought, but rather capacity. As you have probably heard by now, EU Commissioner Thierry Breton directly asked Netflix CEO Reed Hastings to stop streaming video in HD in order to ‘secure internet access for all.’ As a direct result of the pandemic, many more people than usual are at home watching Netflix and other streaming services and, in Breton’s view, are effectively clogging the Internet. In response, Netflix agreed to lower their streaming bandwidth in the EU for 30 days, even though they believe their Open Connect service is already helping service providers, and that their streams already adapt based on available bandwidth.

What is important to note is that the whole conversation reflects an imperfect understanding of how video is delivered over the Internet. Not to make light of a serious situation, but it reminds me of an old joke where a guy walks into a doctor’s office and says, “Hey doctor, my arm hurts when I move it like this.” The doctor responds, “Well, don’t move it like that.” It solves the immediate problem, but kind of misses the point. Similarly, having Netflix lower their bandwidth solves the immediate problem, but kind of misses the point.

The reason there are bottlenecks is not that Netflix streams at too high a bitrate, although that certainly exacerbates the problem; it is that most of the content from all kinds of content publishers is cached on the wrong side of the ISPs’ core networks. CDNs typically cache OTT publishers’ content at data centers or just inside an ISP’s front door. Netflix content is not much different, even though they use Open Connect. Every single viewer of Netflix content, or anyone’s OTT content for that matter, gets their own stream, which must traverse the entire ISP network to reach the user. Even if these streams are identical, each one has to make the full trip from Netflix’s server or a CDN’s server all the way through the ISP to the end user.
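The arithmetic behind this is simple enough to sketch. The viewer count and bitrate below are illustrative assumptions, not figures from the article, but they show why unicast delivery scales so badly across a core network:

```python
# Back-of-envelope sketch of unicast delivery load.
# All numbers are hypothetical assumptions for illustration.

def core_traffic_gbps(viewers: int, bitrate_mbps: float) -> float:
    """With pure unicast delivery, every viewer's stream crosses the
    ISP core, so core load scales linearly with concurrent viewers."""
    return viewers * bitrate_mbps / 1000  # convert Mbps to Gbps

# One million concurrent viewers of the same 5 Mbps HD title still
# push 5,000 Gbps (5 Tbps) through the core, because each stream is
# delivered separately even though the bits are identical.
print(core_traffic_gbps(1_000_000, 5.0))  # 5000.0
```

The key point the sketch makes: the load is driven by the number of viewers, not the number of distinct titles, because identical streams get no credit for being identical.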

ISPs, in general, have plenty of network capacity to each customer’s home. But there are two choke points that are now becoming more obvious. The first is the peering or interconnection links between the CDNs and the ISPs. While Commissioner Breton may not have been aware of it, these links are filling up. If your Disney stream isn’t looking as nice as usual, it may be because those peering links are being clogged by all the video conferencing going on. The second choke point is the ISPs’ core networks, which were never built to support such massive simultaneous streaming. Core networks are tremendously expensive to grow and maintain, so ISPs only build what is necessary. Core network capacity expansion projects also take a great deal of time, planning and execution to complete. They can’t just grow by an order of magnitude because everyone is suddenly staying at home.

The solution is to allow the ISPs to cache the video themselves. This is a decision ultimately made by the individual content publishers. Currently, either Netflix’s servers or a CDN’s servers are caching the content and often dumping millions of identical streams (one for each viewer) on the ISPs, essentially telling them, “Now it’s your problem. You deliver it.” No wonder things are getting clogged. If the ISPs are able to cache the content on their own caches, placed deep in their networks on the end users’ side of the core (also known as the access network), their core will remain open for unique traffic, like individual video conferences. The ISPs don’t need to massively grow their core networks. They just have to store the video in a better place.
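The effect of moving caches to the access side of the core can be sketched the same way. The cache hit ratio and demand figure below are hypothetical assumptions; the point is that only cache-miss (fill) traffic still has to cross the core:

```python
# Hedged sketch of how deep (access-side) caching offloads the core.
# The hit ratio and demand figures are hypothetical assumptions.

def core_load_gbps(total_demand_gbps: float, cache_hit_ratio: float) -> float:
    """Requests served from caches on the access side of the core never
    traverse it; only the cache misses (fill traffic) do."""
    return total_demand_gbps * (1 - cache_hit_ratio)

# If deep caches absorb 90% of a 5,000 Gbps streaming peak, only about
# 500 Gbps of fill traffic crosses the core -- the rest of the core
# stays free for unique traffic such as video conferences.
print(core_load_gbps(5000, 0.9))
```

Popular on-demand catalogs tend to have highly skewed popularity, which is why a relatively small cache deep in the network can absorb such a large share of the demand.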

The technology to solve this problem, called Open Caching, was created as an open specification by the Streaming Video Alliance, an industry organization whose members include Disney, Warner Media, Verizon and many other streaming experts. Qwilt has built a commercially available service based upon this specification. We named our service Content Delivery Sharing because it takes a page out of the sharing economy; it is sometimes referred to as “the Uber of content delivery.” Just as Uber doesn’t own the cars (the drivers do), Qwilt doesn’t own the caches (the ISPs do). In the analogy, the ISPs are the drivers and the content publishers are the riders. Many Tier 1 content publishers and ISPs saw this problem coming and are already using Qwilt’s service to help alleviate the very bottlenecks described above.

Perhaps this crisis will hasten that adoption. We now live in a world in which the deluge of streaming media has exposed a capacity gap for all to see. This gap may be obvious now but, in our view, has been brewing for quite some time.

On a more sobering note: the pandemic we face is dire and, unfortunately, the pain and suffering that is already here for some and is yet to come for others is incredibly difficult. We wish everyone good health and safety during this time.

About the author

Jeff Gilbert is vice president of strategy and business development and leads Qwilt’s content delivery efforts, drawing on a long history across the video distribution ecosystem: he founded an early IPTV operator, served in executive roles at multiple top-tier CDNs, and helped third parties use the Disney Streaming Services (Bamtech) tech stack for OTT delivery. Gilbert can be reached here.

DISCLAIMER: Guest posts are submitted content. The views expressed in this blog are those of the author, and don’t necessarily reflect the views of Edge Industry Review.
