2020 has been a crazy year. COVID has thrown so much into doubt. We’ve even heard from some in the industry that the MovieLabs 2030 Vision “is already here” and that companies are now operating “in the cloud” because of COVID. Apparently, 2020 is the new 2030? Well, I hate to be the bearer of bad news, but … not so fast.
The 2030 Vision requires a strong cloud foundation, but that is only part of the picture – the goal of a cloud foundation is to provide an efficient base for new and innovative technologies, improved workflows and seamless security. And even with COVID driving acceptance of remote work by creatives, we’re a long way from deployment of all the new technologies and automated workflows that will deliver the full benefits of the 2030 Vision.
Just how ready are cloud infrastructures for the 2030 Vision?
Despite the industry’s robust and innovative response to COVID, we’re still at the beginning of our journey. Establishing where we are now and how far we have to go is an important early step in reaching the goals of the 2030 Vision. Cloud infrastructure components and core technologies are at different maturity levels with respect to cloud readiness. To make real progress as an industry, we need to understand where we are and identify technology gaps so that the industry can focus on innovation to fill those gaps.
How to objectively assess production workflows?
I’d like to share how we think about cloud “readiness” and how we propose to apply objective criteria to assessments of different workflow elements. Future blogs on readiness will focus on key takeaways from some of our 2020 assessments, as well as the opportunities identified to close some of the gaps. Ultimately, we hope to repeat these readiness assessments over time to track our industry’s progress towards full implementation of the 2030 Vision.
What are the important dimensions of cloud readiness?
Media production workflows are too complex and far too nuanced to assess with a one-dimensional, monolithic approach. It would be impossible and unwise to describe production readiness as a whole with simple labels such as ‘ready’ or ‘not ready’. Given the diverse range of production activities and requirements, our cloud readiness reviews divide production workflows into a set of common tasks (e.g., editing, color grading, sound mixing, etc.) and then assess each task across several key dimensions, including:
1. Storage – Is the task ready to take advantage of multiple types of storage available in private, public and hybrid clouds?
2. Network – What are the network requirements for completing the task in the cloud? Can the requirements be met for this task?
3. Compute – What are the compute requirements? Can the requirements be delivered easily across private, public and hybrid clouds?
4. Software – Are the key software applications used to perform the task optimally designed for cloud use cases and integration with other applications in cloud-ready workflows?
By breaking cloud readiness into these dimensions, we can surface the nuances that apply to different components of a task. For example, color grading via the cloud may be technically ready from a storage and compute performance point of view, while network performance may still be insufficient to perform the task in real time.
Real-world limits of objective assessments
Before discussing specific assessments for these dimensions, let me expand briefly on the notion of subjectivity versus objectivity. Our goal is to provide objective, repeatable and measurable results for our cloud assessments. But we also recognize that the cloud viability of a workflow always involves some element of subjectivity, especially the subjective experience of creatives and other production workers themselves. Take, for example, round-trip latency for remote desktops. We can measure objectively that a particular system over an identified connection provides latency of 53 milliseconds, but whether that system is usable for a particular artist is a subjective determination based on the experience and disposition of the artist. Sensitivity to latency, general or prior experience with remote desktops, the task at hand, and even the artist’s mood at the time could all affect whether 53ms is good enough for that artist.
The rating scheme
With that qualifier in mind, here are the readiness ratings which we use for each of the four technical dimensions above to assess cloud readiness for each workflow task (as objectively as we can):
- Not Ready – The task is beyond the current capabilities of the cloud because it can’t be achieved at the quality or efficiency levels of an on-prem equivalent. Alternatively, the task contains too many ‘Not Ready’ sub-components, or one critical ‘Not Ready’ sub-component that affects its overall readiness (e.g., the infrastructure is cloud-ready, but no software is commercially available, so the entire task slips to Not Ready).
- Ready with Caveats – The task or most of its components can be achieved in the cloud today, albeit with tweaks for tools or workflows that are noted in the assessment.
- Ready – The task is ready for mass deployment in the cloud and will allow performance that matches or exceeds the equivalent performance with on-prem infrastructure or processes.
It’s worth noting that a ‘Not Ready’ or ‘Ready with Caveats’ rating should not be taken as a negative assessment. Our goal is to identify elements of a task where there are gaps in 2030 Vision readiness. We don’t expect all elements to be green. In fact, we would be surprised if very many elements of the technology landscape in 2020 were truly ready for the 2030 Vision. When an element is listed as Not Ready, or Ready with Caveats, that rating simply becomes a guide for identifying missing components or workflow shortcomings and for assessing the work that we, as an industry, must do to address the gaps.
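To make the rating scheme concrete, here is a minimal sketch of how the per-dimension ratings and the roll-up rule described above could be modeled. The names and the example values are hypothetical; the assessments themselves involve judgment calls that no simple function captures.

```python
from enum import IntEnum

class Readiness(IntEnum):
    NOT_READY = 0
    READY_WITH_CAVEATS = 1
    READY = 2

# The four technical dimensions assessed for each workflow task.
DIMENSIONS = ("storage", "network", "compute", "software")

def task_readiness(ratings: dict) -> Readiness:
    """Roll up per-dimension ratings into an overall task rating.

    Per the scheme above: one critical Not Ready dimension makes the
    whole task Not Ready; any remaining caveats make it Ready with
    Caveats; otherwise the task is Ready.
    """
    scores = [ratings[d] for d in DIMENSIONS]
    if Readiness.NOT_READY in scores:
        return Readiness.NOT_READY
    if Readiness.READY_WITH_CAVEATS in scores:
        return Readiness.READY_WITH_CAVEATS
    return Readiness.READY

# Illustrative values only: color grading might be ready on storage
# and compute but limited by real-time network performance.
color_grading = {
    "storage": Readiness.READY,
    "compute": Readiness.READY,
    "network": Readiness.READY_WITH_CAVEATS,
    "software": Readiness.READY,
}
print(task_readiness(color_grading).name)  # READY_WITH_CAVEATS
```

The point of the sketch is the roll-up behavior: the overall rating is only as strong as the weakest dimension, which is exactly why a single ‘Not Ready’ sub-component pulls an entire task down.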
Expect more readiness blogs over the next months
In the next few months, expect to see more blog posts on cloud readiness as we break down workflow tasks and share the gaps we have identified – gaps which might be better described as ‘development opportunities’. In essence, that is what they are: opportunities for innovation and for companies to focus resources on the solutions necessary to achieve the full benefits of the 2030 Vision.
Be sure to check the 2030 Blog for updates on this workstream—or follow MovieLabs on LinkedIn and Twitter for new readiness assessment alerts.