
The Changing face of Media Archive Workflows

Blog Post created by Jay Yogeshwar on Apr 20, 2017

With exponential growth in content from petabytes to zettabytes, traditional ways of dealing with content archives are proving onerous. Old Hierarchical Storage Management (HSM) systems have now evolved into virtualized hybrid-cloud environments built on standard interfaces. The future holds still more efficient techniques for managing media workflows, techniques that take advantage of content intelligence gained from combing through metadata, indexing it, cataloging it, and searching against it. They also harness the power of data intelligence, or analytics, to gain an intimate understanding and control of the data and, more importantly, of the business of media.


I. The Tried and Tested


In the good old days of tape- and optical-based archives, HSM was king. Media repositories ranged from a fraction of a petabyte to a few petabytes. Workflow engineers came up with elaborate plans for how data would move between production, near-line and archive tiers based on week-long playlists prepared ahead of time. Bespoke software and bespoke appliances followed strict handshake and transfer protocols that understood the formats championed by the leading broadcast video vendors. I was complicit myself, and made a living creating best practices for storing, accessing and playing video files and associated streams. Standards like the Material Exchange Format (MXF) were a major breakthrough in packaging content and understanding its associated metadata.
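The pre-staging logic behind those weekly plans was conceptually simple. Below is a minimal Python sketch of the idea, assuming hypothetical tier paths and a playlist given as a list of file names; real HSM products did this with vendor-specific policies rather than anything this bare.

```python
# Hypothetical sketch of HSM-style pre-staging: given next week's playlist,
# recall assets from the archive tier to near-line ahead of air time.
# Tier paths and the playlist format are assumptions, not a real vendor API.
import shutil
from pathlib import Path

ARCHIVE = Path("/tiers/archive")    # tape/optical-backed archive tier
NEARLINE = Path("/tiers/nearline")  # disk tier feeding playout

def prestage(playlist: list[str]) -> None:
    """Copy every asset referenced by the playlist up to near-line."""
    for asset in playlist:
        src = ARCHIVE / asset
        dst = NEARLINE / asset
        if src.exists() and not dst.exists():
            shutil.copy2(src, dst)  # recall from archive ahead of playout

if __name__ == "__main__":
    prestage(["news_open.mxf", "promo_0420.mxf", "feature_block_a.mxf"])
```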


II. We have come far


Several developments conspired to change archive workflows into loosely coupled, services-based systems. Time to air was critical, and this meant just-in-time playlists and content packaging. File-based processing gave way to stream-based processing, and the 'while' scenario (processing content while it is still ingesting) became the de facto way workflows operated. One did not wait for the entire video to ingest before subjecting it to a slew of post-processing elements. One could begin working with video segments for color correction, audio sweetening or non-linear editing. Graphic overlays did not have to be burnt in.
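To make the 'while' scenario concrete, here is a small, illustrative Python sketch of handing segments of a still-growing ingest file to downstream processing. The segment size and the sentinel-file convention for detecting end of ingest are assumptions for the example, not any particular vendor's mechanism.

```python
# Minimal sketch of the 'while' scenario: hand segments of a still-growing
# ingest file to downstream processing instead of waiting for the full file.
import time
from pathlib import Path

SEGMENT = 64 * 1024 * 1024  # 64 MB chunks; an arbitrary illustrative size

def segments_while_ingesting(path: Path, done_flag: Path):
    """Yield complete segments as the ingest writer appends to `path`.

    `done_flag` is a hypothetical sentinel file the ingest process
    creates when the transfer is finished.
    """
    offset = 0
    with path.open("rb") as f:
        while True:
            f.seek(offset)
            chunk = f.read(SEGMENT)
            if len(chunk) == SEGMENT:
                offset += SEGMENT
                yield chunk          # e.g. hand to color correction or editing
            elif done_flag.exists():
                if chunk:
                    yield chunk      # final partial segment
                return
            else:
                time.sleep(1)        # wait for the ingest writer to catch up
```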


New standards began to surface that abstracted the various processing tasks. Capture, transform, transfer and repository tasks that were once strictly controlled through vendor-specific Application Programming Interfaces (APIs) are now accessed via Web Services API calls, which makes them machine- and vendor-independent. This has freed many broadcasters to create workflow orchestration that is agile and able to use plug-and-play components. Increasingly, Media Asset Management (MAM) vendors are incorporating these techniques into their tool sets and broadening their scope to become Enterprise Content Management systems.
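As a hedged illustration of what such a Web Services call can look like, the sketch below POSTs a transform (transcode) job as JSON over HTTP. The endpoint, job schema, auth scheme and jobId response field are all hypothetical; every MAM or transcode service defines its own REST contract, but the shape is broadly similar.

```python
# Illustrative sketch of a vendor-independent transform request over a
# Web Services API. The endpoint, job schema and auth header are
# hypothetical; real services each define their own REST contract.
import json
import urllib.request

def submit_transform(endpoint: str, token: str, source: str, profile: str) -> str:
    """POST a transcode job and return the job id the service assigns."""
    body = json.dumps({
        "task": "transform",
        "source": source,      # e.g. an MXF asset in the repository
        "profile": profile,    # e.g. "h264_proxy"
    }).encode("utf-8")
    req = urllib.request.Request(
        endpoint,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["jobId"]  # assumed response field
```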


III. Where are we headed?


The holy grail for content management and archives is location and application independence. Teams of collaborators need to work together with the same repository view and access to the same content. File sync and share with global access topologies are must-have features of archive workflows. Mobility and native cloud capabilities are the new normal. The secret sauce behind such capabilities is pervasive content intelligence: the ability to index content ingested anywhere and then to create a global catalog that allows for efficient search and access. When combined with data intelligence, the ability to analyze and correlate disparate data sources, content intelligence takes on new meaning. This, of course, is the starting point for a new wave of creativity that stems from unfettered content and archive workflows.
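As a toy illustration of that global catalog idea, the sketch below indexes metadata from assets ingested at any location into one inverted index and searches against it. A production system would use a proper search engine; the Catalog class and its metadata schema here are purely illustrative.

```python
# Toy sketch of a global content catalog: index metadata from any ingest
# site into one inverted index, then search it from anywhere.
from collections import defaultdict

class Catalog:
    def __init__(self):
        self.index = defaultdict(set)   # keyword -> set of asset ids
        self.assets = {}                # asset id -> metadata

    def ingest(self, asset_id: str, metadata: dict) -> None:
        """Register an asset and index every metadata value as keywords."""
        self.assets[asset_id] = metadata
        for value in metadata.values():
            for word in str(value).lower().split():
                self.index[word].add(asset_id)

    def search(self, query: str) -> list[dict]:
        """Return metadata for assets matching all query terms."""
        terms = query.lower().split()
        if not terms:
            return []
        hits = set.intersection(*(self.index[t] for t in terms))
        return [self.assets[a] for a in hits]

catalog = Catalog()
catalog.ingest("clip-001", {"title": "Evening News Open", "site": "NYC"})
catalog.ingest("clip-002", {"title": "Evening Promo", "site": "LAX"})
print(catalog.search("evening nyc"))  # -> metadata for clip-001 only
```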
