Are You Seeing the Forest For the Trees?
If you talk to streaming operators about which video data is important, they will often point to QoE metrics: rebuffer ratio, time-to-first-byte, cache-hit ratio. And, yes, those metrics are definitely important. But having access to player-level video analytics alone isn’t enough. It only scratches the surface, showing the output of the dozens of technologies involved in delivering and servicing viewer requests. It’s like knowing nothing about how a car is built on the manufacturing line and only seeing what rolls off at the end. What if it doesn’t work? How can you diagnose the problem if you don’t know how the guts were assembled inside?
End-to-End Instrumentation is Key to Visualizing Video Analytics
To understand how all of the component technologies within the workflow influence a metric like rebuffer ratio, it’s crucial to monitor everything within the stack. You need to collect data from encoders, from packagers, from DRM servers, from server-side ad insertion, from CDN delivery, and more. Every component involved in the workflow, from content ingestion to playback, is essential to getting a true picture. In keeping with the title of this post, all of those data sources are the trees and your workflow is the forest.
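To make that inventory concrete, here is a minimal sketch, in Python, of a source registry an operator might maintain. The component names, collection methods, and intervals are illustrative assumptions, not a prescribed list.

```python
# A hypothetical registry of workflow components to instrument.
# Names, methods, and intervals are illustrative only.
WORKFLOW_SOURCES = {
    "encoder":  {"method": "api_poll",   "interval_s": 30},
    "packager": {"method": "api_poll",   "interval_s": 30},
    "drm":      {"method": "log_pull",   "interval_s": 60},
    "ssai":     {"method": "log_pull",   "interval_s": 60},
    "cdn":      {"method": "log_export", "interval_s": 300},
    "player":   {"method": "beacon",     "interval_s": 10},
}

def coverage_gaps(sources: dict) -> list[str]:
    """Return any workflow stage that has no collection method configured."""
    return [name for name, cfg in sources.items() if not cfg.get("method")]
```

Even a simple registry like this makes it obvious where the blind spots are before any dashboards get built.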
So how do you see the forest? There are three key steps to any end-to-end instrumentation: data collection, data normalization, and data storage.
Data Collection
This is the most basic step in seeing the forest. If you can’t get at the data from each workflow component, you can’t build a complete picture. That may require a programmatic connection, in the case of technologies like virtualized encoders that provide API access, or it may require third-party software or hardware probes to monitor the technology. If a technology doesn’t expose its data, or a third party doesn’t allow its data to be consumed programmatically (CDN logs, for example), it might be time to look for a replacement. You can’t have a data black hole in your workflow instrumentation.
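As a rough sketch of what the programmatic case might look like, here is a small Python helper that polls a virtualized encoder’s REST API. The /v1/stats path and bearer-token authentication are assumptions; substitute whatever your encoder vendor actually exposes.

```python
import requests  # pip install requests

def poll_encoder_stats(base_url: str, api_token: str) -> dict:
    """Pull the latest stats from an encoder's REST API.

    The endpoint path and auth scheme below are hypothetical stand-ins
    for whatever the vendor's API actually provides.
    """
    resp = requests.get(
        f"{base_url}/v1/stats",
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()

# Example: stats = poll_encoder_stats("https://encoder.example.internal", token)
```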
Data Normalization
Once the data has been collected, it has to be normalized. You can probably surmise that most workflow technology vendors are not coordinating with each other on data representation. They use different fields and different values, sometimes for the same metric! So to make sense of it all, and to ensure that the encoding data about a chunk can be related to that same chunk in the cache, all of the collected data should be normalized against a standardized schema. Doing so ensures that the forest you see is made up of the same kinds of trees.
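Here is a minimal sketch of that kind of normalization in Python. The vendor names and their field names are hypothetical, standing in for the mismatched payloads you would actually see.

```python
from dataclasses import dataclass

@dataclass
class ChunkRecord:
    """A normalized record tying a chunk to a common schema across vendors."""
    chunk_id: str
    component: str      # e.g. "encoder", "cdn_edge"
    bitrate_kbps: int
    duration_ms: int

# Hypothetical per-vendor field mappings: two vendors reporting the same
# metrics under different names.
FIELD_MAPS = {
    "encoder_vendor_a": {"chunk_id": "segmentId",   "bitrate_kbps": "bitrateKbps", "duration_ms": "durMs"},
    "cdn_vendor_b":     {"chunk_id": "object_name", "bitrate_kbps": "avg_bitrate", "duration_ms": "segment_duration"},
}

def normalize(component: str, raw: dict) -> ChunkRecord:
    """Map a vendor-specific payload onto the shared schema."""
    fields = FIELD_MAPS[component]
    return ChunkRecord(
        chunk_id=str(raw[fields["chunk_id"]]),
        component=component,
        bitrate_kbps=int(raw[fields["bitrate_kbps"]]),
        duration_ms=int(raw[fields["duration_ms"]]),
    )
```

The point is simply that every collector writes into the same shape, so encoder data and cache data about the same chunk can be joined later.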
Data Storage
Of course, collecting and normalizing all this data doesn’t make much sense without somewhere to store it. You need a repository that is both flexible and programmatically accessible. That could be a cloud data warehouse, like Google BigQuery, supported by an additional, transient storage layer, like Memcached, for lightning-fast retrieval.
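As a sketch of how those two layers might fit together, assuming the google-cloud-bigquery and pymemcache client libraries, a hypothetical table name, and a local Memcached instance:

```python
import json
from google.cloud import bigquery          # pip install google-cloud-bigquery
from pymemcache.client.base import Client  # pip install pymemcache

# Hypothetical table name and cache address, for illustration only.
BQ_TABLE = "my_project.video_telemetry.chunk_records"

bq = bigquery.Client()
cache = Client(("localhost", 11211))

def store_record(record: dict) -> None:
    """Persist a normalized record to the warehouse and cache it for fast reads."""
    errors = bq.insert_rows_json(BQ_TABLE, [record])  # streaming insert
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
    # Keep the most recent record per chunk hot for dashboards (10-minute TTL).
    cache.set(f"chunk:{record['chunk_id']}",
              json.dumps(record).encode("utf-8"),
              expire=600)
```

The warehouse gives you the durable, queryable history; the cache exists purely so that dashboards and automation can read the latest values without waiting on a query.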
With The Forest in View, You Are Now a Data Ranger
With end-to-end instrumentation of all the workflow technologies, you can get down to making sense of it all. For those just getting started with this kind of approach, that will require a lot of manual connections. You will spend your time tending the forest: pruning trees, grouping them together, and relating them to one another. That work will pay dividends in the future as your video analytics visualizations and dashboards become ever smarter. But making manual connections between data sets within your storage isn’t scalable. Most streaming operators will look for ways to automate this through machine-learning or artificial-intelligence systems. These systems, once trained, could propose connections on their own, making suggestions about what a data value means. For example, if your rebuffer ratio is high and your encoder is throwing errors, a system like this could bubble up a recommendation that one of the bitrates in the bitrate ladder is corrupt. An intelligent system might even analyze each of the bitrates and identify the one causing the higher rebuffer ratio.
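A trained model is beyond the scope of a sketch, but a simple heuristic stand-in shows the shape of that recommendation. The thresholds below are illustrative, not tuned values.

```python
def suspect_corrupt_rendition(rebuffer_ratio: float,
                              encoder_errors_by_bitrate: dict[int, int],
                              rebuffer_threshold: float = 0.02,
                              error_threshold: int = 5) -> int | None:
    """Flag the rendition most likely responsible when rebuffering spikes.

    A hand-written heuristic standing in for the trained system described
    above; thresholds are placeholders, not recommended values.
    """
    if rebuffer_ratio < rebuffer_threshold:
        return None  # rebuffering looks healthy; nothing to flag
    worst_bitrate, worst_errors = max(encoder_errors_by_bitrate.items(),
                                      key=lambda kv: kv[1])
    return worst_bitrate if worst_errors >= error_threshold else None

# Example: suspect_corrupt_rendition(0.05, {800: 0, 2400: 1, 4800: 12}) -> 4800
```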
Let the Forest Tend Itself
With a continual flow of data coming from throughout the workflow, normalized and visualized for quick decisions, you are well on your way to taking the next step in streaming operations: automation. Edge-based, serverless processes, such as Lambda functions, could analyze results from different data sets in real time (leveraging the machine-learning or AI layer mentioned above) and act on them based on pre-determined thresholds. For example, if viewers in a specific geographic region were seeing high TTFB values, the system could automatically switch to an alternate CDN. If that did not fix the problem, the system could then serve a lower bitrate, overriding the player’s logic. You get the idea. A system like this not only provides granular, robust analysis to operations (through real-time, dynamic visualizations), it also participates in continuous improvement and self-healing. Automation within the streaming video workflow could even become predictive by comparing the real-time video analytics being collected against historical heuristics. What if the system knew that on Mondays, CDN A usually had a tough time in a certain geographic region? Rather than waiting on the analysis of data to make a switch, why not automatically switch to CDN B during that time frame?
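Here is a minimal sketch of that kind of threshold-driven automation, written as a Lambda-style handler. The event shape, the 800 ms threshold, and the switch_cdn/lower_bitrate_cap helpers are hypothetical placeholders for whatever control plane your workflow actually exposes.

```python
TTFB_THRESHOLD_MS = 800  # illustrative threshold, not a recommended value

def switch_cdn(region: str, target_cdn: str) -> None:
    """Placeholder for whatever traffic-steering API your workflow exposes."""
    print(f"switching {region} to {target_cdn}")

def lower_bitrate_cap(region: str) -> None:
    """Placeholder for pushing a bitrate cap down to the player config."""
    print(f"capping bitrate in {region}")

def handler(event, context):
    """Lambda-style entry point: act on a regional TTFB reading."""
    region, ttfb_ms, current_cdn = event["region"], event["p95_ttfb_ms"], event["cdn"]
    if ttfb_ms <= TTFB_THRESHOLD_MS:
        return {"action": "none"}
    if current_cdn == "cdn_a":
        switch_cdn(region, "cdn_b")
        return {"action": "switch_cdn", "to": "cdn_b"}
    lower_bitrate_cap(region)
    return {"action": "cap_bitrate"}
```

The predictive case described above would work the same way, except the trigger would come from a scheduled comparison against historical patterns rather than a live reading.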
Data Enables Decisions, It Doesn’t Make Them
Don’t be the streaming provider that just sees the numbers. That’s looking only at the trees. To truly make informed business decisions that affect QoE and QoS, which in turn affect subscriber satisfaction and churn, you need end-to-end instrumentation. With a system that collects all the data, normalizes it, and visualizes it, your operations personnel can see the forest and make better, holistic decisions rather than chasing the value of a single data point.