It’s becoming more generally accepted in the streaming video industry that having more data can provide greater insight. With better graphs, better algorithms, and better analytics, more improvements can be made. As such, many streaming operators look for new tools or providers to make more data from their technology stack available. However, the result can be overwhelming: a dozen different interfaces, each providing detail and insight into a different aspect of the streaming workflow. And, of course, none of them are connected to one another.

Unfortunately, this is an all-too-common approach to streaming video data monitoring. Operators put analytics first, seeking a way to visualize a data source without having an overall strategy for data in general. 

What’s needed first, before meaning can be derived, before any attempt at analysis, is a video data platform.

What Are Analytics?

Analytics, simply put, is the use of visualization tools to display data in a form that can be analyzed. Although it’s possible to analyze data points within a table, the concept of analytics, especially within streaming video, usually includes some sort of dashboard or graphical interpretation of the data. 

But analytics does not necessarily map to insight. That’s because analytics, in the current state of the streaming industry, is often carried out against siloed data. Each source of data often has its own tool or dashboard that makes “sense” of the data itself. Of course, these are helpful. Just looking at data tables doesn’t reveal much. Yet because each dashboard is independent, true insight must be inferred by looking across multiple tools. This not only increases the complexity of deriving meaning from the data, such as root-cause analysis, but also significantly increases the time it takes.

Putting the Cart Before the Horse

When streaming operators focus on analysis without a video data platform strategy, they get short-term gains at the cost of long-term benefits. For example, with access to CDN log data through the CDN visualization, a network operations engineer may be able to ascertain a low cache efficiency on a particular piece of content. But the cause of that may not be the CDN at all. Rather, it may be a bad encode for a specific bitrate in the bitrate ladder. Without an overall data platform strategy, which would include a means to relate the CDN log data to the encoder data and even other sources like the player, the analysis is only partially helpful. The low cache-hit ratio reveals a problem. With some help from the CDN operations engineers, the streaming operator may be able to discover that it’s a result of the encoder. This kind of approach is repeated over and over again as new data sources from streaming stack technologies are made available. It’s an analysis-first mindset.

When You Put Strategy First

Rather than looking at how to visualize each type of data, streaming operators need to implement a video data platform. A good video data platform consists of three layers:

  • Data retrieval and transport (Level 1)
  • Data normalization and standardization (Level 2)
  • Data relating and visualization (Level 3)

Video Data Retrieval and Transport (Level 1)

The first layer of a video data platform is getting the data from the tools. In many cases, this means programmatic access to log files or the tool’s database. Once access has been achieved, the data must be transported to a common location (i.e., a data lake). Most often, this is cloud-based, such as through a provider like Amazon Web Services or Google Cloud Platform, and has programmatic access built-in. Key to this as well is the speed at which data can be transported. Some data should be provided in real-time, such as QoE data from the player, while other data can take longer.
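The transport step can be pictured with a minimal sketch. Here a local directory stands in for the cloud data lake, and the function name `collect_and_ship` and the batch-naming scheme are illustrative assumptions, not any particular vendor’s API; in practice the destination would be a bucket on a provider like AWS or GCP.

```python
import json
import pathlib
import time

def collect_and_ship(records, lake_dir, source_name):
    """Write one batch of raw tool records into a common landing zone.

    A local directory stands in for a cloud data lake bucket here;
    the idea of partitioning batches by source is the same either way.
    """
    lake = pathlib.Path(lake_dir)
    lake.mkdir(parents=True, exist_ok=True)
    # Name the batch by source and arrival time so downstream layers
    # can locate batches per source and process them in order.
    batch_file = lake / f"{source_name}-{int(time.time())}.jsonl"
    with batch_file.open("w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")
    return batch_file
```

For latency-sensitive data like player QoE events, the same pattern would simply run with much smaller, more frequent batches (or a streaming transport), while CDN logs could arrive on a slower schedule.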

Data Normalization and Standardization (Level 2)

The second layer of the video data platform normalizes and standardizes the data. Many tools collect similar data points. For example, the average video player utilizes over 15 software development kits (SDKs) from various technology vendors. These may collect data points that are duplicative and need to be scrubbed, normalized, and de-duped. The streaming operator can build a machine-learning system on top of the data lake to take care of this normalization.
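A minimal sketch of this layer, assuming hypothetical field names (`ts`, `eventTime`, and so on) rather than any real SDK’s schema: map each source’s fields onto a standard set of names, then drop duplicate events reported by more than one source.

```python
# Hypothetical per-source field mappings onto a standard schema.
# Real SDKs and log formats each define their own field names.
FIELD_MAP = {
    "cdn_logs":   {"ts": "timestamp", "obj": "content_id", "hit": "cache_hit"},
    "player_sdk": {"eventTime": "timestamp", "assetId": "content_id"},
}

def normalize(record, source):
    """Rename a raw record's fields to the platform's standard names."""
    mapping = FIELD_MAP[source]
    return {mapping.get(key, key): value for key, value in record.items()}

def dedupe(records):
    """Keep one copy per (timestamp, content_id), since multiple SDKs
    may report the same underlying event."""
    seen, out = set(), []
    for rec in records:
        key = (rec["timestamp"], rec["content_id"])
        if key not in seen:
            seen.add(key)
            out.append(rec)
    return out
```

Because every record leaves this layer with the same field names, the relating-and-visualization layer above it never has to know which vendor produced the data.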

Data Relating and Visualization (Level 3)

The final layer of the video data platform makes the connections between elements in different data sources and carries out the calculations needed to derive meaning. This usually involves mapping data elements or utilizing a master table (based on standardized data elements) that links data elements from different sources under a single master element. This can often be accomplished through third-party tools like Tableau, Datadog, or Looker. These tools also provide visualization features so streaming operators can create customized dashboards for different business groups or roles.
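The master-table idea can be sketched as follows. The identifiers here (`asset-123`, the CDN path, the encoder job ID) are invented for illustration; the point is that both sources resolve to one canonical asset, so the CDN’s cache-hit ratio and the encoder’s error count from the earlier example land in the same row.

```python
# Hypothetical master table: each source's own identifier for an asset
# maps to a single canonical content ID.
MASTER = {
    ("cdn", "/vod/abc/720p.m3u8"): "asset-123",
    ("encoder", "job-9f2"): "asset-123",
}

def relate(cdn_rows, encoder_rows):
    """Join per-source metrics under one record per canonical asset."""
    joined = {}
    for row in cdn_rows:
        cid = MASTER[("cdn", row["path"])]
        joined.setdefault(cid, {})["cache_hit_ratio"] = row["cache_hit_ratio"]
    for row in encoder_rows:
        cid = MASTER[("encoder", row["job_id"])]
        joined.setdefault(cid, {})["encode_errors"] = row["errors"]
    return joined
```

With the joined view, a single dashboard can show that an asset’s low cache-hit ratio coincides with encode errors, without an engineer manually cross-referencing two tools.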

A Streaming Video Data Platform Grows With the Business

The best part about making analytics a product of your video data platform is that you don’t have to rebuild everything when you want to incorporate new data into your visualizations. A video data platform is flexible by nature. The architecture is intended to accommodate new data sources by simply connecting them through an API (which is a function of the platform itself). New logic can be added to the normalization and standardization layer, again without “rip and replace,” enabling new relationships to be created between different data sets and exposed through enhanced visualizations. The video data platform, then, becomes the foundation for all monitoring and analytics across the organization.
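The extensibility described above can be sketched with a toy registration API. The class and method names (`VideoDataPlatform`, `register_source`) are hypothetical, not a real product interface; the point is that wiring in a new source is a registration call plus a field mapping, leaving existing sources untouched.

```python
class VideoDataPlatform:
    """Toy model: sources plug in by declaring a field mapping
    onto the platform's standard schema."""

    def __init__(self):
        self.sources = {}

    def register_source(self, name, field_map):
        # Adding a new source never requires changing existing ones.
        self.sources[name] = field_map

    def ingest(self, name, record):
        # Normalize a raw record using the source's declared mapping.
        mapping = self.sources[name]
        return {mapping.get(key, key): value for key, value in record.items()}
```

A new tool added to the stack next quarter would register its own mapping and immediately feed the same relating and visualization layers as every source before it.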

Datazoom: A Ready-to-Implement Video Data Platform

Of course, you can build all of this yourself. However, is building a video data platform your core business? As a streaming operator, probably not. Furthermore, you can’t just rely on any provider to supply something so fundamental to the health and success of your streaming business. You need a fire-tested, battle-hardened, proven platform to ensure that the data you need to provide the best possible video experience is available quickly, normalized to your business needs, and visualized for actionable business decisions.

The post A Video Data Platform First, Analytics Second appeared first on Datazoom.

