What is the Datatecture?

The Data Landscape of the Streaming Video Industry

First, the datatecture is a picture of the technology categories involved in streaming. Not every category is required for every platform, depending on its purpose (e.g., education vs. direct-to-consumer), but all of them exist somewhere in the streaming architecture.

Second, the datatecture is an inventory of the companies that offer services and technologies in those categories within the streaming video technology stack. Covering three core areas, Infrastructure, Operations, and Workflow, the companies, categorized into various sub-groups, provide many of the critical elements of a streaming platform, ranging from data analytics to encoding. Equally important, though, is that each company's products, services, and technologies contribute data to the overall picture of the streaming platform's performance and efficiency.

Optimizing Streaming Video Through Better Observability

Right now, the industry has a challenge: how to optimize the creation, delivery, transformation, and operationalization of streaming video. Doing so requires harnessing all of the data from within the video technology stack: operations, infrastructure, and the delivery workflow. But it can be difficult to understand what data is coming from which provider. That’s where the datatecture comes in. This mapping of the different technology providers within streaming gives operators a way to shape their streaming ecosystem, through partnerships and relationships, so that they have clarity on the data that’s available to them.

How Datazoom Can Help

Once you have created the datatecture for your streaming platform, Datazoom can help you get the data from the individual vendors into the tools you need. By working with Datazoom, you don't have to worry about building or maintaining individual data pipes to specific components. Datazoom enables you to do that quickly and easily through an intuitive interface of Collectors and Connectors. Finally, all of the data that flows through Datazoom to your visualization tools or data lake can be normalized against Datazoom's data dictionary to provide the consistency you need for true observability.
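To make the idea of normalizing vendor data against a shared data dictionary concrete, here is a minimal, purely illustrative sketch. The field names, vendor keys, and dictionary structure are all hypothetical assumptions for the example; they do not represent Datazoom's actual data dictionary or API.

```python
# Toy "data dictionary": canonical field name -> per-vendor field aliases.
# All names here are invented for illustration.
DATA_DICTIONARY = {
    "buffer_duration_ms": {"vendor_a": "bufferTime", "vendor_b": "stall_ms"},
    "bitrate_kbps": {"vendor_a": "br", "vendor_b": "bitrateKbps"},
}

def normalize(event: dict, vendor: str) -> dict:
    """Map one vendor's event fields onto the canonical schema."""
    normalized = {}
    for canonical, aliases in DATA_DICTIONARY.items():
        alias = aliases.get(vendor)
        if alias is not None and alias in event:
            normalized[canonical] = event[alias]
    return normalized

# Two vendors report the same facts under different field names...
event_a = normalize({"bufferTime": 420, "br": 4500}, "vendor_a")
event_b = normalize({"stall_ms": 420, "bitrateKbps": 4500}, "vendor_b")

# ...and after normalization they are directly comparable.
print(event_a)  # {'buffer_duration_ms': 420, 'bitrate_kbps': 4500}
print(event_a == event_b)  # True
```

The point of the sketch is the design idea, not the code: once every vendor's events are renamed into one schema, downstream dashboards and a data lake can treat all sources uniformly.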

You need to understand who does what and the data each provider makes available to have true clarity.

Clarity is critical to streaming success
Without an understanding of the role each element of the streaming workflow plays in generating data, clarity can never be achieved.
After clarity comes planning
How will you organize your datatecture? What vendors will you use? How will you employ the data those solutions make available?
Create a picture of your workflow with your datatecture
Don't just rely on our datatecture. Use it to build your own. Once you have clarity and a plan, pick and implement the technologies that will help you provide a better viewing experience.


This part of the datatecture deals with operating and monitoring the components and infrastructure involved in streaming.


Some common operations elements include configuration management, monitoring, and analytics.


This part of the datatecture deals with the underlying network architecture and other software or equipment needed to facilitate delivery.


Some common infrastructure elements include containers, object storage, and data warehouses.


This part of the datatecture addresses the components and technologies installed in the infrastructure and monitored through operations.


Some common workflow components include encoding, security, transformation, delivery, and playback.

Are You Missing?

Should your company logo be listed in a datatecture category? Don't worry! That's an easy fix. Just click on this box to send us a message.

Are You Missing?

We want to include you, trust us. The industry datatecture is only as good as the number of included companies.

Stay Up-to-Date on The Streaming Video Datatecture!


The Streaming Video Datatecture and microsite provide a free, educational compilation of the companies and services involved in the streaming video industry. It offers a deep look into the ecosystem and its layout, and explains the technologies that power video streaming. Created by Datazoom, the enterprise video data platform technology company, its goal is to help content owners, technology and service providers, investors, and other stakeholders better understand the data, systems, and layered architecture that come together to create the data fabric of streaming video.

Contact Information

Recent Datazoom News


  • Understanding the Datatecture Part 4: Workflow Deep-Dive
    by Jason Thibeault

    In Part four of this series, we dig into some of the deeper layers of the Streaming Video Datatecture in the Workflow category, defining many of the individual sub-categories and explaining their purpose in the broader workflow. As we covered in the first post of the series, the Datatecture is governed by three main …

Copyright 2021 Datazoom, Inc. All trademarks and logos are owned by their respective companies. Inclusion of a logo in the datatecture is not an endorsement by Datazoom, Inc., nor is it representative of any relationship between that company and Datazoom, Inc.