DevOps has been a revolution in software development. To support agile methodologies, in which developers incrementally release features that are tested by users in real time, developers needed an increasingly larger role in operationally supporting their code. Waiting for operations to make configuration changes to environments or push out new code didn't fit the new scrum worldview. So the DevOps role was born, and how software was created and released never looked the same.
But the success of this new role, part programmer and part operations, depends on data, and that is especially true in streaming video. New features or application versions can negatively impact Quality of Service (QoS) or Quality of Experience (QoE) and ultimately influence whether subscribers keep paying.
Not All Data is Created Equal
There are several kinds of data that can influence software development.
But when it comes to streaming video, there is another kind of data that is far more valuable for understanding the impact of new features or software on the end-user experience: experience data.
The Data of Experience
In streaming video, the software experience is concentrated in the player which can reside on a variety of client devices ranging from computers to smartphones to boxes that plug directly into the television. This player is full of additional software, encapsulated in Software Development Kits (SDKs), that provide both functionality and data collection. Often supplied by third-parties, these SDKs are installed directly within the player. This software architecture is well understood by DevOps. Additional functionality within the player can easily be encapsulated in SDKs and deployed quickly.
But these SDKs can also provide that crucial experience data. Many of the components loaded by the video player also send back telemetry data, which is critical for understanding overall QoS and QoE. This data can tell operations, or DevOps, how critical KPIs such as rebuffer ratio are performing, which in turn provides insight into the overall user experience. Of course, experience is subjective; each user has a different threshold for certain metrics. But the data ultimately provides a general indicator of the positive or negative impact on the user experience.
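To make the rebuffer-ratio KPI concrete, here is a minimal sketch of how it might be computed from player telemetry events. The event types and field names (`play`, `rebuffer`, `duration_ms`) are assumptions for illustration, not the schema of any particular SDK:

```python
def rebuffer_ratio(events):
    """Rebuffer ratio = time spent stalled / total viewing time.

    `events` is a list of dicts with assumed keys "type" and "duration_ms".
    """
    stall = sum(e["duration_ms"] for e in events if e["type"] == "rebuffer")
    play = sum(e["duration_ms"] for e in events if e["type"] == "play")
    total = stall + play
    return stall / total if total else 0.0

# Example session: 58s of playback interrupted by a 2s stall.
session = [
    {"type": "play", "duration_ms": 58_000},
    {"type": "rebuffer", "duration_ms": 2_000},
]
ratio = rebuffer_ratio(session)  # 2000 / 60000, about 3.3%
```

A ratio creeping upward after a release is exactly the kind of signal that tells DevOps a new feature is hurting QoE before subscribers start complaining.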
A System for Collecting and Normalizing Data
Rather than building new data-collection functionality into SDKs or other player components from scratch, DevOps needs access to technology that can be easily integrated into existing systems. When a streaming operator has already embraced the idea of a video data platform, the mechanisms by which data is collected from components within the streaming technology stack, such as a player, are well defined. DevOps needs access to such a system so that approved and proven approaches can be employed as part of the development process, ensuring user feedback, performance metrics, and experience data can be collected.
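The "proven approach" described above amounts to a shared collection interface that any player component can call, rather than each feature inventing its own plumbing. A minimal sketch, with an invented `Collector` class and a batching policy chosen purely for illustration:

```python
class Collector:
    """Hypothetical shared telemetry collector a player SDK might call into.

    Events are buffered and flushed in batches; in a real platform, flush()
    would POST the batch to the platform's ingest endpoint.
    """

    def __init__(self, flush_size=100):
        self.flush_size = flush_size
        self.buffer = []
        self.flushed_batches = []  # stands in for the network send

    def record(self, event):
        self.buffer.append(event)
        if len(self.buffer) >= self.flush_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.flushed_batches.append(list(self.buffer))
            self.buffer.clear()

collector = Collector(flush_size=2)
collector.record({"type": "play", "duration_ms": 58_000})
collector.record({"type": "rebuffer", "duration_ms": 2_000})
collector.record({"type": "play", "duration_ms": 10_000})
# First two events were flushed as one batch; the third is still buffered.
```

The design point is that new features only call `record()`; batching, retries, and delivery stay in one audited place owned by the platform.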
Collecting data isn't the only consideration, though. For the data to be useful to DevOps, and to the rest of operations, it must first be normalized. With a video data platform in place, this is easily accomplished through a standardized schema that takes data elements from a variety of SDK sources and processes them before loading the data into a storage solution such as Google BigQuery.
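Normalization through a standardized schema can be sketched as a field-mapping step: each SDK's raw payload is translated into one common shape before storage. The per-SDK field names below are invented for illustration:

```python
# Hypothetical mappings from each SDK's raw field names to a common schema.
# Real platforms define these centrally so downstream queries see one shape.
MAPPINGS = {
    "sdk_a": {"evt": "event", "ts": "timestamp_ms", "sid": "session_id"},
    "sdk_b": {"eventName": "event", "time": "timestamp_ms", "session": "session_id"},
}

def normalize(source, payload):
    """Translate a raw SDK payload into the common schema."""
    return {common: payload[raw] for raw, common in MAPPINGS[source].items()}

row_a = normalize("sdk_a", {"evt": "play", "ts": 1, "sid": "s1"})
row_b = normalize("sdk_b", {"eventName": "play", "time": 2, "session": "s2"})
# Both rows now share the keys: event, timestamp_ms, session_id.
```

Because every source lands in the same columns, a single warehouse table (BigQuery, in the example above) can answer QoE questions across all SDKs at once.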
Without the Right Tools…
DevOps is the way software is created now: programmers with operational knowledge and access to the tools and systems for deployment. But to be truly effective, their efforts must be tied into larger systems like video data platforms. In this way, the software they create, whether new versions, new applications, or new features, can also be instrumented to collect the data necessary to evaluate the impact on both quality of service and quality of experience.
The post Why Video Data Telemetry is Critical to DevOps Success (And How to Get It) appeared first on Datazoom.