The Growth of OTT Demands a New Mindset From Streaming Providers

Streaming has grown immensely over the past 12 months. Viewing time for U.S. streaming services was 50 percent above 2019 levels in June, likely the result of services launched before and during the pandemic, including Apple TV+, Disney+, HBO Max (AT&T), and Peacock (Comcast). A new forecast projects that SVOD subscribers will nearly double, from 650 million worldwide at the end of 2020 to 1.25 billion by the end of 2024. As the number of subscribers grows, so too does the revenue opportunity: in 2020, the global video streaming market was valued at USD 50.11 billion and is expected to expand at a compound annual growth rate (CAGR) of 21.0 percent from 2021 to 2028.

So, what does this continued growth actually mean for the industry?

The growth in the number of subscribers and the resulting revenue is forcing OTT providers to take a hard look at how they provide their service. Unlike broadcast television, where the network operator had visibility all the way down to the set-top box and could guarantee a three-, four-, or five-nines level of service, OTT providers don't have that luxury. They must often cobble together monitoring systems that link dozens of distinct and separate technologies.

To Support These Growing Services, OTT Providers Must Put Data First

Variables. Countless variables to track and monitor. This is the new norm in streaming.

When OTT first began, monitoring (and the data needed for observability into the streaming experience) was a secondary thought. The primary focus was reliability: keep the service up and running by whatever means necessary. Sometimes that involved a little bubblegum and duct tape. But, as the stats above show, OTT providers are now global. The demands of providing a consistent and reliable service at global scale mean that data must become a primary focus. This is the mindset driving today's streaming platforms.

But this situation is more complicated than just elevating the importance of data in the operation of streaming platforms. There is no consistency around the data. Different vendors, providing different technologies within the video stack, each expose their own data points. Sometimes those data points, although named differently, represent the same variable. It's an issue that complicates the entire process of holistically monitoring the workflow.
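As a minimal sketch of that naming problem, the adapter below maps differently named vendor metrics onto one shared schema. The vendor field names, the unified event shape, and the metrics chosen are hypothetical, used only to illustrate the idea of standardization:

```typescript
// Hypothetical raw events from two vendors that report the same underlying
// variables under different names.
type VendorAEvent = { rebuffer_count: number; startup_ms: number; cdn: string };
type VendorBEvent = { stallEvents: number; timeToFirstFrame: number; edgeProvider: string };

// A single standardized schema the monitoring pipeline can rely on.
interface NormalizedQoEEvent {
  rebufferCount: number;   // number of playback stalls
  startupTimeMs: number;   // time from play request to first frame
  cdn: string;             // delivery partner serving the session
}

// Per-vendor adapters translate native field names into the shared schema.
const normalizeVendorA = (e: VendorAEvent): NormalizedQoEEvent => ({
  rebufferCount: e.rebuffer_count,
  startupTimeMs: e.startup_ms,
  cdn: e.cdn,
});

const normalizeVendorB = (e: VendorBEvent): NormalizedQoEEvent => ({
  rebufferCount: e.stallEvents,
  startupTimeMs: e.timeToFirstFrame,
  cdn: e.edgeProvider,
});

// Once normalized, events from every vendor can be compared and aggregated directly.
const events: NormalizedQoEEvent[] = [
  normalizeVendorA({ rebuffer_count: 2, startup_ms: 1800, cdn: "cdn-east" }),
  normalizeVendorB({ stallEvents: 0, timeToFirstFrame: 950, edgeProvider: "cdn-west" }),
];
console.log(events);
```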

Without a commitment to standardizing, enriching, and centralizing information, the industry is contending with an explosion of data and no real way to put it to use. It's like trying to capture all of the water from a leaky dam with a thimble. The result is that many operators are blocked from completing the OODA loop (Observe, Orient, Decide, Act). To put the video stack data to use, platform operators must be able to see the forest and the trees. They need to be able to identify patterns while also tracking down individual user sessions. But doing that requires consolidating the massive amount of fragmented data coming out of the workflow.
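Continuing the sketch above under the same assumptions (the session store and its metrics are illustrative, not a prescribed design), consolidating normalized events by session ID is what makes both views possible: the aggregate pattern and the individual viewer session:

```typescript
// Minimal normalized event, now carrying a session identifier so the same
// data can feed both aggregate dashboards and per-session drill-downs.
interface SessionEvent {
  sessionId: string;
  rebufferCount: number;
  startupTimeMs: number;
}

class SessionStore {
  private events = new Map<string, SessionEvent[]>();

  // Observe: append every normalized event, grouped by session.
  ingest(event: SessionEvent): void {
    const list = this.events.get(event.sessionId) ?? [];
    list.push(event);
    this.events.set(event.sessionId, list);
  }

  // The forest: an aggregate pattern across all sessions.
  averageStartupTimeMs(): number {
    const all = [...this.events.values()].flat();
    if (all.length === 0) return 0;
    return all.reduce((sum, e) => sum + e.startupTimeMs, 0) / all.length;
  }

  // The trees: every event for one viewer's session.
  sessionDetail(sessionId: string): SessionEvent[] {
    return this.events.get(sessionId) ?? [];
  }
}

// Usage: ingest fragmented events, then answer both kinds of questions.
const store = new SessionStore();
store.ingest({ sessionId: "abc", rebufferCount: 1, startupTimeMs: 1200 });
store.ingest({ sessionId: "def", rebufferCount: 0, startupTimeMs: 800 });
console.log(store.averageStartupTimeMs());   // pattern across sessions
console.log(store.sessionDetail("abc"));     // single viewer session
```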

Just consider these examples. At the device level, operators have to be conscious of device OS, screen size, memory capacity, and supported protocols, all of which can impact client-side caching algorithms, adaptive bitrate ladders, and changes to the delivery protocol. Networks can experience sudden peaks in congestion, so delivery adaptability, such as choosing a different CDN within a specific region, becomes critical to ensuring a great viewer experience. And the content itself can be transformed to accommodate different connection speeds and bitrates.
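To make the CDN-switching example concrete, here is a toy decision function. The region, the metric names, and the weights are assumptions for illustration, not a recommended policy:

```typescript
// Hypothetical rolling QoE stats per CDN within one region.
interface CdnStats {
  cdn: string;
  rebufferRatio: number;  // fraction of playback time spent rebuffering
  errorRate: number;      // fraction of sessions ending in a fatal error
}

// Pick the CDN with the best blended score; the weights are illustrative only.
function chooseCdn(candidates: CdnStats[]): string {
  if (candidates.length === 0) throw new Error("no CDN candidates");
  const score = (s: CdnStats) => s.rebufferRatio * 0.7 + s.errorRate * 0.3;
  return candidates.reduce((best, c) => (score(c) < score(best) ? c : best)).cdn;
}

// Example: congestion spikes on cdn-east, so new sessions in this region
// would be steered to cdn-west.
const usEast: CdnStats[] = [
  { cdn: "cdn-east", rebufferRatio: 0.08, errorRate: 0.02 },
  { cdn: "cdn-west", rebufferRatio: 0.01, errorRate: 0.01 },
];
console.log(chooseCdn(usEast)); // "cdn-west"
```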

Addressing any of those examples requires not only a lot of data but also variables that have been standardized and normalized. Without that, there is no way to get a clear picture of how efficiently and effectively the technologies within the stack are operating, how well third-party partners such as CDNs are performing, and how the viewer is experiencing the video.

Standardized Data + Insights = Observability

The ultimate goal of elevating the importance of data within streaming platform operations is achieving observability. Just having a lot of data is nice. Having a lot of standardized data is better. But being able to derive insights in real time gives the streaming operator the observability they need to provide the consistent, reliable, and scalable service their viewers expect. Observability also gives the business the ability to make critical decisions about advertising, marketing, and subscriber retention with more certainty and accuracy.

The Datatecture is a Data Landscape for Streaming Operators

Today, our industry needs to know how to optimize the creation, delivery, transformation, and operationalization of streaming video. Doing so requires harnessing all the data from within the video technology stack: operations, infrastructure, and the delivery workflow components. Each technology, whether vendor-supplied or open source, can throw off data critical to seeing the delivery and QoE big picture as well as individual viewer sessions. But which technology throws off which data?

That's where the Streaming Video Datatecture comes in.

After seven years in the industry, it was clear that to talk about data, there needed to be a way to map out what was actually happening. There wasn't a single resource that showed all of the components in those three main stack categories, or that kept pace with the ever-changing technologies. Having a resource that provided an up-to-date picture of the technology landscape was a critical first step toward harnessing all the data within the workflow.

But the concept of the datatecture is more than just an industry landscape. It works as a tool which streaming operators can use to build their own datatectures. Because there is no standardization within the streaming industry, most OTT platforms are different. Every operator has figured out a way to make their technology stack work for them. But the increasing need for observability isn't specific to one provider. Every provider needs to put data first, and doing that means understanding all of the technologies in the stack which can provide data to add to that observability. This industry-wide datatecture is a map providers can use to build their own and envision what their datatecture could be.

Release Video Data From Its Silos: What Senior Leaders Can Do

Although it's the engineers who will make the most use of the datatecture, executives and managers within the organization need to help with the "data first" transition. Hiring the right people, and ensuring they are focused on data, will help make sure new software treats data collection as a priority. Another strategy for senior leaders is to make sure all future hires, no matter the department, understand that the stability and growth of the business rely on observability to drive business decisions. If all teams, whether in advertising, marketing, security, or content groups, understand the importance of removing data from its silos, then informed business decisions can be made.

In many ways, data can help bring teams together. When everyone speaks the same language and has access to the same information, it naturally sparks more collaboration. That's not to say everyone needs to be aware of everything, but context is important. There's nothing like data to provide context in decision-making and ensure an organization, with all its stakeholders and moving parts, is going in the same direction.

Make Data (And the Datatecture) The Core of Your Streaming Business

The datatecture we've created (and the datatecture you will create) should be at the epicenter of your streaming operations, software development, and business objectives. Without a deep understanding of the role each component within your video stack plays in your observability, it will be far more difficult to make the business and technology decisions needed to drive your platform forward.
