Streamlining the Production-Distribution Pipeline


Mon, 11/02/2020 - 10:46 -- Nick Dager

Technologist Jack Watts began his career in post-production in the Republic of Ireland. In 2013 he joined Technicolor in London, where he was head of technology and development. He moved to Deluxe in London in 2015 as part of the joint venture between the two companies’ digital cinema business units. There he focused primarily on mastering solutions for digital cinema and, eventually, the Interoperable Master Format, acting as an internal consultant to the technical operations, client services and business development teams. Externally, he also consulted for the business’s premium studio clients.

Day to day, he looked after the DevOps of globally deployed mastering tools across all Deluxe Technicolor digital cinema facilities from London to Sydney, ensuring a consistent and uniform operational process. The role also required active involvement in the industry, including speaking at events and participating in key industry communities and standards development organizations such as the Society of Motion Picture and Television Engineers. Today he runs a London-based consultancy and software development business called Trench Digital. Our conversation started there.

Digital Cinema Report: Tell me about Trench Digital’s clients and the services you provide for them.

Jack Watts: My common client base is either service providers or servicing vendors. [I work] to deliver or process the plethora of titles from content owners for theatrical distribution or over-the-top platforms. Often, it involves a methodically structured training curriculum covering advanced formats such as digital cinema packages and/or IMF implementation and mastering, right down to the underlying computer science fundamentals. The output of this can then be mapped to the business’s ongoing strategic initiatives by working with the internal development teams and evaluating third-party products. The aim is to ensure that the most efficient pipeline encompassing all of the business’s immediate requirements is in place, guaranteeing either a viable return on their investment or substantial growth to their operation.

Lengthier projects are generally software-development oriented, where I will design and oversee the development and deployment of a product, which could be a media-based processing toolchain or a service-based platform. Design can be completed independently or as part of a team.

DCR: According to your website, your specialty is the various forms of the DCP and the IMF. What are some of the common misconceptions regarding those two bedrocks of the motion picture workflow?

JW: The main misconception that I see is the crossover of the two, where people assume that constraints in digital cinema can be applied in IMF and vice versa. There are common terms and data structures between the two but ultimately, although the philosophy of the framework they each employ is quite similar, they are both separate and unique and should be treated as such. It does not help that people still refer to IMF as DCP+. IMF is mature enough now that it does not need a reference to digital cinema as an example to explain it. In fact, there are initiatives in play within the community right now to mitigate such misconceptions. Another grievance I have is with how IMF has been, and by some outfits still is, marketed as the solution to all your problems. IMF will aid you in streamlining your versioning workflows and making the downstream processing chain more efficient. It also has a lot of other interesting and conceptual applications such as archive, editorial and even acquisition. In other words, IMF sits within your pipeline; your pipeline does not sit within IMF.

True IMF adoption can be sluggish. And by true I mean utilizing the componentization aspect of the framework. It can be frustrating to wrap your head around the concept of a non-linear, componentized workflow, especially when you come from a linear file- or tape-based background. Naturally, people are averse to change, simply because you do not know what you don’t know, and people can be afraid of what they don’t know. It is for this reason that I structure my training at three different tiers, so that people do not get overwhelmed and are able to apply what they learn to their respective responsibilities, be it business development, implementation or operations.
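[Editor’s Note: To make the componentized, non-linear idea concrete, here is a minimal, hypothetical Python sketch. The class names, identifiers and track labels are invented for illustration and do not come from any IMF specification or toolkit; the point is that a composition is a playlist of references to shared track-file components, so a new version only requires its new components to be delivered, rather than a second complete master.]

from dataclasses import dataclass, field

@dataclass(frozen=True)
class TrackFile:
    # A single essence component, e.g. an image, audio or subtitle track file.
    uuid: str
    kind: str
    label: str

@dataclass
class Composition:
    # A playlist of references to track files -- not the essence itself.
    title: str
    resources: list = field(default_factory=list)

# Original version: picture plus English audio, delivered once.
picture = TrackFile("urn:uuid:aaaa", "image", "feature picture")
en_audio = TrackFile("urn:uuid:bbbb", "audio", "EN 5.1")
ov = Composition("Feature (OV)", [picture, en_audio])

# French version: only the new component (French audio) needs delivering;
# the picture track is referenced, not duplicated.
fr_audio = TrackFile("urn:uuid:cccc", "audio", "FR 5.1")
fr = Composition("Feature (FR)", [picture, fr_audio])

new_components = set(fr.resources) - set(ov.resources)
print([t.label for t in new_components])  # -> ['FR 5.1']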

DCR: As I understand it, you were involved in the development of the SMPTE DCP standard. Is that correct and, if so, what was your role in that process?

JW: Currently, I serve as the co-chair of 21DC, the technology committee where the SMPTE DCP standards are managed. In addition to the usual standards development contributions made by participating in 21DC, there is also extensive testing done at events called plugfests, where representatives of key manufacturers assess the creation, exchange and playback of content, ensuring that the correct behavior is observed by all participants. This guarantees interoperability in the field when content is distributed. Recently, I served as a proponent and co-edited RDD52 with an industry colleague of mine. RDD52 defines the application profile Bv2.1 for the SMPTE DCP and contains constraints based on real-world practical applications of the DCP.

[Editor’s Note: Here is a link to RDD52 https://ieeexplore.ieee.org/document/9161348.]

In addition to 21DC participation and the RDD publication, I have been heavily involved in the international SMPTE DCP rollout, which is the transition from Interop to SMPTE. That presented numerous challenges along the way and resulted in the web portal www.smptedcp.com being created to act as an information resource for all things SMPTE DCP. I also participate extensively in industry communities such as ISDCF and EDCF. It is in these communities where a lot of work gets completed before being brought to SMPTE.

DCR: As long as people shoot, post, distribute and exhibit motion pictures, there will be workflow challenges across that entire process. What are some of the issues that still need to be addressed to improve workflow?

JW: Every facility, even if they are offering identical services and communicating common data, operates and communicates differently. Initially, one round of development is completed to set up the relevant negotiations between systems, departments, and primary clients. These negotiations are often facilitated using APIs. This process can sometimes take weeks, if not months, and often results in a hefty bill to reflect the investment. The problem here is that such work satisfies a single environment and is in no way scalable or customizable to allow it to be repurposed. This is often down to the limited scope of the APIs one is working with, and the fallout results in businesses reverting to manual and ambiguous forms of communication such as email, instant messaging, and telephone conversations.

There needs to be a uniform, extensible and open solution for the exchange of media data between all stages of the production pipeline. This could be achieved by defining a common industry vernacular where global collaboration is actively encouraged, and the terminology used across the board is normalized. It should be built on top of a standardized backbone of extensive documentation derived from proven industry use cases, where common toolchains are used to mitigate any implementation uncertainty.
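[Editor’s Note: The uniform exchange layer Watts describes does not yet exist; the Python sketch below is a hypothetical illustration of what a normalized, machine-readable delivery request between facilities might look like. All field names are invented for the example and stand in for the common, documented vocabulary he argues for.]

import json
from dataclasses import dataclass, asdict

# Hypothetical, normalized "delivery request" exchanged between facilities.
# The aim is a shared, documented vocabulary rather than per-client email
# threads and bespoke API payloads.
@dataclass
class DeliveryRequest:
    title: str
    content_kind: str     # e.g. "feature", "trailer"
    package_format: str   # e.g. "SMPTE DCP", "IMF"
    territory: str        # normalized locale code, e.g. "fr-FR"
    audio_config: str     # e.g. "5.1"
    due_date: str         # ISO 8601 date

request = DeliveryRequest(
    title="Example Feature",
    content_kind="feature",
    package_format="IMF",
    territory="fr-FR",
    audio_config="5.1",
    due_date="2020-12-01",
)

# Serialized form that any facility's tooling could validate against a
# published schema.
print(json.dumps(asdict(request), indent=2))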

DCR: How long have you been a SMPTE member?

JW: I have been a general SMPTE member since 2009; I then graduated to standards participation when I joined Technicolor in 2013.

DCR: You’ve given presentations at SMPTE conferences. When was that and what topics and issues did you address?

JW: I have given presentations at SMPTE conferences and at trade shows that SMPTE is affiliated with or has a presence at. Generally, the topic and the depth of that topic depend on the audience that you are looking to target. When speaking about the SMPTE DCP, for example, it’s more of a report on the global status of where things currently are in the distribution space. There is very little traction with wider audiences when talking about the technology itself, as the engineering component of such subjects is commonly limited to a very specific niche group of people. As such, if I were to focus on the deeper aspects of the technology, I would lose the majority of the audience.

That said, when talking about how one can use such technology, I intentionally make a point of addressing real-world issues and bottlenecks in the content production supply chain by bringing achievable solutions to the table. In other words, I make it relatable. Case in point: my co-authored paper Automated VFX Pipeline for the Purpose of Servicing the Creation of Downstream Masters, which I presented at SMPTE’s 2019 technical conference. This paper addressed some key bottlenecks and repetitive, laborious tasks that versioning facilities need to perform on each title in the realm of graphical localization. This tapped into an editorial application of IMF, which I alluded to earlier on. As with all my engineering-based presentations, I make a point of acknowledging the fact that everything costs money, so being able to attribute all aspects of an engineering proposal to a base pound (£) value is key for any financially focused or C-level management that may be consuming the presentation content.

DCR: For you, what is the value of a SMPTE membership?

JW: In a single phrase: the community. The abundance of knowledge and understanding one can gain by simply listening to the combined experience and knowledge of the SMPTE community is astounding. The global cultural differences and use cases really open your mind to the magnitude and varying perspectives of technology use and design. The off-the-wall conversations with community members can spur a lightbulb moment for industry innovation. It is a consistently eye-opening experience.

Trench Digital http://trenchdigital.net