Test Pilots



By James Mathers

[Photo: DP James Mathers, left, on the set of the Tribe of the Wild pilot.]

The industry doesn’t do as many pilots as in years past, which is a shame, because they can be a valuable format for testing not just story ideas and actors but new technology. In my role covering new cinema technology for the Digital Cinema Society, I am often offered gear to test and evaluate, everything from new lighting units and cameras to various digital cinema recorders. Since I need to keep busy earning my living as a director of photography, I honestly don’t have time to run scientific tests, so I prefer to use the gear on real-world productions I’m hired to work on. You could call me a technology test pilot. I get a better sense of how these tools perform where the rubber hits the road, and it’s a good way to help out the productions I’m working on, pulling in items they might not otherwise be able to afford. The perfect opportunity to put many of these items through their paces came up recently when I was asked to shoot a feature-length pilot for a new action/adventure show aimed at young adults entitled Tribe of the Wild.

Tribe is the story of five high school friends who are transported to a mysterious and unfamiliar world, a parallel reality where they are drawn into an ongoing conflict between competing bands of strange alien creatures. The show’s creator/director, Shuki Levy, and producer, Ronnie Hadar, were the same team I worked with way back in 1985 on the concept piece for what would eventually become the extremely successful Power Rangers franchise.

Although my friends left the series many years ago, Power Rangers is still in production and continues to generate huge profits, so expectations for this new endeavor were quite high. Based on Levy and Hadar’s track record creating hit programs, Relativity Media offered to distribute the new series, but first, my friends would need to produce an impressive pilot. Like most productions these days, the budget was extremely thin for what they were hoping to achieve.

The entire project came together only a few months ago but needed to be completed by the end of the year. The fantasy world where much of the story takes place is set in a rain forest, not the easiest of environments for a relatively low-budget production. It got me thinking that maybe we should shoot these jungle “exteriors” as virtual sets. The script also called for a plethora of strange creatures that could more easily be achieved with CGI, so a virtual approach seemed to make the most sense. I was able to recommend another old friend, Ron Thornton, whom I had worked for on the ABC series Hypernauts, a sci-fi space adventure that also used a lot of virtual sets and CGI. Ron had also been responsible for the VFX of such hit shows as Babylon 5 and several of the Star Trek TV series sequels, among many other prominent feature and TV credits.

[Photo: Tribe of the Wild is from the creators of the Power Rangers.]

I make it my business to keep on top of new technology, and right about the time this project landed in my lap, I was attending the SIGGRAPH convention, which features the very latest in computer graphics. I had been following a little company called Lightcraft Technology for several years; their system was originally designed to preview green and blue screen composites live on the set without the need to lock the camera. They were recently awarded an Emmy for Outstanding Achievement in Engineering Development for the Previzion Virtual Studio System.

Previzion is the brainchild of longtime DCS member and MIT graduate Eliot Mack. It works by tying the camera’s metadata, which includes precise lens, geometric orientation, and spatial location details, into a computer that moves the virtual background in relation to the live action. Much of this data is collected by way of a sensor (a small black-and-white camera) positioned on top of the principal camera, which reads a series of markers placed equidistant on the ceiling across the set and pre-mapped with very precise surveying tools.

All of this data is fed to the Lightcraft computer to tell it, frame by frame, where the camera is positioned in space, including tilt, pan, and height, together with lens and camera metadata. Having the virtual background change in relation to the camera is extremely freeing compared to standard green screen shooting, where the camera either has to be locked off (with the data manually recorded for later compositing) or motion controlled, or tracking markers have to be placed in the shot and painstakingly removed in post.
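
To make that relationship concrete, here is a toy sketch of the underlying idea: per-frame pan/tilt/position tracking and lens metadata drive a virtual pinhole camera, so the CGI background is always rendered from the live camera’s viewpoint. This is purely illustrative; the sensor width, resolution, and all values are assumptions for the sketch, not Lightcraft’s actual implementation.

```python
# Toy illustration of camera-tracked virtual sets: tracked pan/tilt/position
# and lens metadata position a virtual pinhole camera so the CGI background
# moves in lockstep with the physical camera. All numbers are assumptions.
import numpy as np

def rotation(pan_deg, tilt_deg):
    """Rotation matrix built from tracked pan (yaw) and tilt (pitch)."""
    p, t = np.radians(pan_deg), np.radians(tilt_deg)
    yaw = np.array([[ np.cos(p), 0.0, np.sin(p)],
                    [ 0.0,       1.0, 0.0      ],
                    [-np.sin(p), 0.0, np.cos(p)]])
    pitch = np.array([[1.0, 0.0,        0.0       ],
                      [0.0, np.cos(t), -np.sin(t)],
                      [0.0, np.sin(t),  np.cos(t)]])
    return pitch @ yaw

def project(point, cam_pos, pan_deg, tilt_deg, focal_mm,
            sensor_w_mm=27.7, width_px=1920, height_px=1080):
    """Project a 3D point in the virtual set into pixel coordinates."""
    cam = rotation(pan_deg, tilt_deg) @ (np.asarray(point) - np.asarray(cam_pos))
    f_px = focal_mm / sensor_w_mm * width_px        # focal length in pixels
    x = f_px * cam[0] / cam[2] + width_px / 2.0     # simple pinhole model
    y = f_px * cam[1] / cam[2] + height_px / 2.0
    return x, y

# As the tracked camera pans 2 degrees, the background point slides in lockstep.
print(project([0.0, 1.5, 20.0], [0.0, 1.5, 0.0], pan_deg=2.0, tilt_deg=0.0, focal_mm=35.0))
```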

With Lightcraft, I was able to zoom, focus, and do any manner of live, physical camera movement, even handheld, and the background followed in lockstep. Since I like to keep the camera mobile and frequently use a jib arm, this approach was very appealing.

[Photo: The decision to shoot on a set was a function of the low budget.]

My Fujinon Cabrio 19-90mm zoom turned out to be the perfect lensing choice for the job. It’s electronic and able to transmit focal length, f-stop, and focus in real time directly to the computer. It also afforded a broad focal range, and since each lens change would entail significant recalibration of the system, it was really good to have one lens that could achieve both wide shots and close-ups.

Heretofore, Lightcraft has mostly been sold as a previz system, but with recent advancements in GPUs, it now seemed possible to accomplish at least some of our compositing live on the set. Armed with a super-fast Nvidia Quadro K6000 GPU powering the computer, we would simply record the composite image (live camera and virtual background) as it came out of the Lightcraft system. We knew it wouldn’t work for all our shots; wire removal and other tweaks would still need special post-production TLC. But even if it knocked out only a portion of the shots on such a VFX-laden show, it would be a huge savings of post manpower. And even for shots that did need to be tweaked and rendered in post, having the frame-by-frame referenced metadata would make relatively simple work of it.
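
As a back-of-the-napkin picture of what “recording the composite live” means per frame: pull a matte from the blue screen, then lay the camera foreground over the rendered background. Real keyers (and the Previzion pipeline) are far more sophisticated, with spill suppression and edge blending; this NumPy sketch just shows the shape of the math, with made-up threshold values.

```python
# Crude per-frame blue-screen composite, roughly what the live pipeline does
# on the GPU (shown in NumPy for clarity; illustrative only).
import numpy as np

def blue_key_matte(fg, threshold=0.15, softness=0.1):
    """Pixels where blue clearly dominates red and green are treated as
    screen (alpha 0); everything else keeps the foreground."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    screen = np.clip((b - np.maximum(r, g) - threshold) / softness, 0.0, 1.0)
    return 1.0 - screen

def composite(fg, bg):
    """Alpha-over: live-action foreground over the virtual background."""
    a = blue_key_matte(fg)[..., None]
    return a * fg + (1.0 - a) * bg

# fg comes from the camera; bg is the virtual set rendered for the same
# tracked camera pose. The result is what gets recorded as the live composite.
fg = np.random.rand(1080, 1920, 3).astype(np.float32)
bg = np.random.rand(1080, 1920, 3).astype(np.float32)
out = composite(fg, bg)
```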

While there are many cameras I could have chosen, and some may have been better, I own Red Epics and there is a comfort factor there. With so much new technology to deal with, having a familiar camera kept the stress level down; and together with cinematographer Conrad Hunziker, who would serve as my DIT and second-unit DP on this project, we’ve probably shot more Red footage than just about anyone.

As creator of the R3D Data Manager and Double Data software programs, Conrad practically wrote the book on Red data management. Red also offered the option to record at higher resolutions, allowing me to capture the background plates at 5K, which I had the pleasure of shooting while visiting the most beautiful spots in Hawaii (even low budget sometimes has its perks).

The main drawback of using the Red was that we were trying to use at least some of the live output for final compositing on the set. With Red, the live output is a lesser HD signal that is meant for monitoring only. However, it is in line with the signal quality through the Lightcraft system, and even though the show would be finished in HD, shooting at a higher resolution would allow some measure of future-proofing in order to be better prepared for the coming of 4K distribution. It wouldn’t be cheap, but this would give the producers the option of someday going back and refinishing at a higher resolution. Of course, locking the composited images together live on the set greatly limits the ability for final color correction in post, allowing only overall adjustments to both sides of the composite. To give us a fighting chance, Conrad used Blackmagic Resolve to apply a look between the camera and the Previzion system, as well as to the R3D transcodes for post compositing, and worked with Lightcraft’s internal correction tools to get the best match possible.
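
The essence of that matching problem is that one and the same look has to be applied both to the live feed ahead of the Previzion composite and to the R3D transcodes used for post compositing, so the two paths grade alike. A minimal sketch follows; the lift/gamma/gain numbers are placeholders standing in for the show’s actual Resolve grade.

```python
# One identical "look" applied to both delivery paths so the live composite
# and the post-composited transcodes match. Values are placeholders.
import numpy as np

def apply_look(img, lift=0.02, gamma=0.95, gain=1.05):
    """Simple lift/gamma/gain transform standing in for the Resolve look."""
    img = np.clip(img * gain + lift, 0.0, 1.0)
    return img ** (1.0 / gamma)

frame = np.random.rand(1080, 1920, 3)
live_path = apply_look(frame)             # fed to Previzion on the set
post_path = apply_look(frame)             # applied to the R3D transcodes
assert np.allclose(live_path, post_path)  # identical look on both paths
```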

[Photo: Deciding how to handle the various feeds was a technological challenge.]

Another challenge was figuring out how to record the various discrete feeds: the output straight out of the camera, the composited image with metadata out of the Lightcraft computer, and the alpha channel with the background as a separate element. I turned to my friends at Sound Devices and AJA, who were happy to have me test their latest recorders in this way.

I’m a big fan of the Sound Devices PIX240i, which we used for two of the channels. It records directly to ProRes 4444, which is easy for post to work with and a great choice for chroma compositing. It also has a built-in Ambient timecode and sync generator to help keep the whole system locked, not to mention eight separate channels of audio.
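
With every recorder locked to the same timecode, lining up the discrete feeds in post reduces to frame arithmetic. A minimal sketch, assuming 24 fps non-drop-frame timecode (illustrative only, not a production tool):

```python
# Align separately recorded feeds by converting SMPTE timecode to frames.
def tc_to_frames(tc, fps=24):
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

camera_start = tc_to_frames("01:02:03:12")   # first frame of the camera file
pix_start    = tc_to_frames("01:02:03:00")   # first frame of the PIX240i file
print(camera_start - pix_start)  # 12: trim 12 frames to line the feeds up
```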

We were also graciously loaned the use of the AJA Ki Pro Quad, which was specifically requested by our Lightcraft VFX advisor, Ron Fischer. Ron explained the value of the Ki Pro Quad this way:

"The relevant feature is faithful preservation of SMPTE 315M camera and lens tracking metadata, which nearly no other recorder at its level does. This feature allows us to replay the camera feed with embedded tracking, and treat it as live, effecting an online recomp of virtual scenes. This keeps a large percentage of simple effects shots from needing post treatment, leaving time and effort for the biggest, most difficult storytelling shots."

With the rain forest setting of the virtual background, and the need to blend in green foliage as set dressing, we decided to go with a blue screen instead of the more popular green. This created some additional challenges for the production; the first was simply finding a large enough blue screen stage. There are dozens of freestanding green screen stages of various sizes in the Los Angeles area, but we had trouble finding anything in blue. We decided to create our own blue screen stage at our facility of choice, Santa Clarita Studios. As an unsold pilot, this was a one-off production, and it didn’t make economic sense to build and paint a large three-wall coved cyc. Instead, we rented cloth, but were limited to a 40-foot-by-50-foot background (50 feet wide, with 25 feet of height and 15 feet running forward along the floor).

[Photo: Working with the blue cyc posed several creative and technical challenges.]

There was a practical dirt floor in front of the background, which served us well, because anywhere actors make physical contact with the background is where the VFX blending starts to reveal itself and where compositing becomes tricky. However, having only a 50-foot-wide background without a real cove was limiting for some of our bigger shots. I should mention here that the Lightcraft Previzion system does allow you to see the intended background in the “garbage matte” area. In other words, you can actually move the camera to shoot off of the blue screen and still see the intended CGI background; you’re golden as long as you don’t need to see any actors or set pieces separated in front of it. This helped, but with some of our bigger crowd shots it was tricky to keep blue behind all the subjects, and such shots will also require post tweaking.
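
A sketch of that garbage-matte behavior: where the blue cloth actually exists, trust the chroma key; everywhere the camera sees past the screen, cut straight to the CGI background, which is exactly why actors and set pieces must stay in front of the blue. This is illustrative only, not the Previzion internals.

```python
# Combine a chroma-key matte with a "garbage" region marking where the
# physical blue screen exists; off-screen pixels show pure CGI background.
import numpy as np

def composite_with_garbage_matte(fg, bg, key_matte, screen_region):
    """key_matte: 1.0 = keep foreground pixel (from the chroma keyer).
    screen_region: 1.0 where the physical blue screen covers the frame."""
    a = (key_matte * screen_region)[..., None]  # off-screen pixels -> alpha 0
    return a * fg + (1.0 - a) * bg              # alpha 0 shows the CGI bg

fg = np.random.rand(1080, 1920, 3)
bg = np.random.rand(1080, 1920, 3)
key = np.random.rand(1080, 1920)                     # stand-in keyer matte
screen = np.zeros((1080, 1920)); screen[:, 300:1600] = 1.0  # cloth extent
out = composite_with_garbage_matte(fg, bg, key, screen)
```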

Another issue with blue was the fact that the Lightcraft system is more effective with green. Although blue screen is possible, it taxes the system for real-time compositing, so a darker shade of blue is recommended. They also recommend lighting this dark blue background up to your key exposure level (as opposed to green, where my normal rule of thumb is to expose the background one stop under key). Of course, such blue requires significantly more light than the very reflective digital green.

With cameras becoming more sensitive these days, this might seem a non-issue, but with compositing you really need to maintain solid exposure to avoid noise, and it also helps to have a healthy amount of depth of field. Since I was aiming to set the lens stop at least at f/4 and rating the camera at ISO 640, this would mean maintaining upwards of 35 foot-candles hitting the background. Since I have found the MX sensor in Red cameras to be partial to blue light, I have become accustomed to balancing for daylight, which is what I decided to do here. (I should note that the shade of blue we ended up with, which you see in the stills, was actually lighter than intended, a result of varying naming conventions, such as chroma blue versus digital blue, between different manufacturers.)
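
As a rough sanity check on that light level, the standard incident-meter relation E = C x N^2 / (t x S) can be worked through in a few lines. The calibration constant C = 250 and the 180-degree shutter at 24 fps are assumptions for illustration, not production measurements.

```python
# Rough check of the exposure target above using the incident-meter relation
# E = C * N^2 / (t * S), with assumed C = 250 (a common meter calibration),
# a 180-degree shutter at 24 fps (t = 1/48 s), f/4, and ISO 640.
C, N, t, S = 250, 4.0, 1 / 48, 640
lux = C * N ** 2 / (t * S)        # illuminance needed for key exposure
footcandles = lux / 10.764        # 1 foot-candle = 10.764 lux
print(round(lux), round(footcandles, 1))  # 300 lux, ~27.9 fc
```

That lands in the same ballpark as the 35 foot-candles cited above; the gap comes down to meter calibration and the headroom you build in.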

For my money, no matter the color of the light or the color and size of the background, Kino Flos are the best solution for achieving the soft, even illumination necessary for compositing. In this case, it would just require more of them than the budget allowed. The line producer’s screams when he first saw my lighting order still haunt me; it included several dozen Kino Image 87s and Blanket Lites just to light the background.

(For those not familiar with the term, a Blanket Lite consists of 16 six-foot Kino tubes rigged into a 6x6 speed-rail frame assembly, with options to shoot through a variety of diffusion cloths. They are easy to hang for overhead fill, or to put on a stand as a source of soft, even light.)

Luckily, Santa Clarita Studios has a vast store of Kino Flos, and its director of operations, Ron Dahlquist, took a special interest in the project, even offering to become my rigging gaffer. With a 44-year track record, Ron is one of the most experienced set lighting authorities in the business; for many years he ran the lighting department at Paramount, and he was a partner in Keylite, which in its day (the ’70s and ’80s) was one of the biggest G&E rental facilities in the business. In fact, Ron was instrumental in the development of the HMI, and along with Wally Mills, another lighting technology pioneer, he currently manufactures and sells HMIs through their company, Dadco. Ron’s help supervising the lighting of the blue screen was a great blessing, since our production schedule dictated that the stage be lit while the main unit was out on location.

[Photo: The pilot was delivered successfully, and the producers await word on whether the series will move forward.]

Ron was also eager for me to try some of his new Dadco HMI units, which turned out to be quite impressive. The big gun they provided for our location work was a 24K called The Challenger.

Their new Starburst line is particularly useful when you don’t have a big lighting package, because each unit is so versatile: switch globes and use one ballast-and-head combo as either a 2.5K or a 4K, and another set as a 1.2K or a 1.8K. Of course, the 1.8K heads were particularly handy on location days because they can run off a single 20-amp household circuit. Knowing that my lighting needs would greatly exceed my budget, I approached some companies I know and was also offered several LED lights to try out and evaluate.

These included new models from Litepanels, Nila, Zylight, and a company I had never heard of before called AAdynTech; we even had use of a new Kino Flo LED product known as the Celeb.

AAdynTech provided several of their Eco Punch lights, which have as much brute light output as a 2.5K HMI yet use only 5 amps of power, run cool to the touch, and are fully dimmable.

We also had one of their Jab Hurricane units, which is a little smaller but still has impressive output, and draws so little power it can run off a couple of camera batteries for several hours at a time. The Jab Hurricane had the bonus of being somewhat weatherproof, which came in handy for a scene we had in the rain; it also offered a great feature for simulating lightning, which we used to great effect.

Litepanels was kind enough to provide one of their new Sola 12 Fresnels, which I fell in love with. Finally, we have a light with all the advantages LEDs have to offer combined with the beautiful focusing abilities of a large Fresnel lens. The output of the Sola 12 is similar to a studio 2K, yet daylight balanced (they also have an Inca version, which is tungsten). The huge spot-to-flood ratio really helps in shaping the light. I can’t show you an image since it is not released yet, but we also had use of the latest version of the Litepanels Hilio, which has seen significant refinements since I first used it a couple of years ago. Drawing just 115W, it now packs more punch than a 1.2K HMI PAR, yet like all their lights, it stays cool and is completely dimmable.

My friends at Nila provided the most powerful LED I have yet seen; it’s called the Nila SL, and while it draws only 500 watts, I would say it serves as a suitable replacement for up to a 4K HMI PAR. We also had the SL’s little brother, known as the Boxer; it’s much smaller and lighter, and drawing only about 200 watts, it can replace a 1.2K HMI. Unlike HMI lights, these LED units emit no UV or IR, have a long life (20,000+ hours on a set of LEDs), are DMX controllable, and are completely dimmable.

A unit known as the F8 from Zylight helped round out my extensive LED package. The F8 is small, lightweight (about 10 pounds), and features an 8-inch Fresnel lens. It’s very portable, water-resistant, and rugged, and comes with an attached Anton Bauer Gold Mount for ease of running it off DC power. I even had one of the new line of LED lights from Kino Flo, the Celeb, which allows you to incrementally dial in color temperature from 2700K to 5500K.

You can probably tell that I really like LED lighting, and the technology is improving all the time, but I must caution about the challenge of maintaining consistent color temperature when mixing units from so many manufacturers. The more established manufacturers (Litepanels, Nila, etc.) have a much better handle on these issues; I don’t know whether it is because they have been working at it longer or because they have the ability to cherry-pick the LEDs they choose for their units. In any case, although units are pretty well matched within a manufacturer’s lineup, they vary widely between brands. Gels can help, and some manufacturers are equipping their kits with the required fixes, but that is not the ideal solution.

These problems arise to varying degrees from spikes in green or magenta that can be very hard to see with the eye (or even a standard color meter) but that show up on camera. A source might not even read the same from one actor, wardrobe, or piece of set dressing to the next. Add to that the fact that I was trying to lock my foreground and background images together on the set, and you can see that I will have some tricky color timing issues ahead in post.

Much work is currently being done in this area, not only by the manufacturers but also by the Academy of Motion Picture Arts and Sciences, which has established a committee to study the subject. With the promise of so many benefits offered by LED technology, and the ever-increasing speed of innovation, I’m sure any such issues will be short-lived. In the meantime, HMI, tungsten, and fluorescent instruments still have their place, and I would also advise against buying cheap knockoffs. Any of the minor issues I encountered using the highest quality products available would surely be magnified with less professional instruments. With LEDs, the old adage seems to apply: you get what you pay for.

As I said, although the industry doesn’t do as many pilots anymore, they can be a valuable format for testing. I don’t know if I’ll end up working on it should the pilot go to series, but from lighting to the latest in VFX and digital recorders, this project has given me a chance to try a lot of new technology, which will inform my choices going forward. The state of the art is so far advanced from what it was when I first worked for these producers almost three decades ago, and it just keeps getting better. Who knows, perhaps another kids’ TV dynasty will have been created; I’m still waiting for them to make a DP/test pilot action figure.

The behind-the-scenes photos were all taken by cinematographer Martin Kobylarz.

James Mathers is a working cinematographer and is the co-founder and president of the Digital Cinema Society.