The workflows and software pipelines that have come to define animated and photoreal visual effects (VFX) feature production are traditionally divergent, but as virtual production sweeps the film industry, the distinctions between them are disappearing.
Whether working on a CG animated feature or on the visual effects (VFX) for a photoreal feature, the design methodologies, development workflow, and software tools are almost identical during pre-production. Artists create 2D artwork using tools like Photoshop to flesh out the characters, environments, and events of the film. Sometimes 3D artwork is created using tools like ZBrush, Dpaint, Modo, and 3D-Coat. 2D storyboards or beatboards are drawn to develop the action and are often edited into a boardomatic to help work out timing issues and tune the preliminary running length of the film.
About the author
Lindy De Quattro is VFX Supervisor at MPC Film
Usually, a 2D animatic or 3D previsualization is used to conceptualize complex action sequences, choreography, unique environments, and other challenging post-production enhancements for specific scenes in the film. 3D previz work is often done in animation packages like Maya, Blender, or Cinema 4D, or, more likely these days, via a game engine like Unity or UE4. Research and development begins on any new technology that needs to be created to execute the work on the film. Hero shots or short proof-of-concept tests are put forward by potential vendors to assure the studio that the planned work can be achieved at the desired quality level. All of this prep work is approached in essentially the same way for both animated and VFX feature film production.
Historically, with pre-production complete, it was at this point that VFX feature production would deviate from the CG work done for an animated feature. A VFX feature begins with location and stage plate photography, while the workflow for an animated feature is much more fluid. The animated feature would move on to actual production beginning with asset building, look development, and animation, but they would not be locked into their earlier creative decisions by an expensive plate shoot (a plate in our industry is the live action footage you apply your VFX to).
Once principal photography begins for a VFX feature, you can’t easily revisit pre-production decisions without a big hit to both time and budget. Sets have been built, locations selected, costumes sewn, and equipment secured. Many of the creative decisions for a feature film are already made by the end of plate photography, and those decisions are not made in the CG department.
Several department heads, all working under the umbrella of the film’s director, make creative decisions that determine the overall look and feel of the film. The Director of Photography (DP) controls and determines the overall lighting, lensing, and photography of the film. The Production Designer oversees the appearance and construction of the sets and the set dressing. The actors obviously author their own performances. The other department heads including costume, hair, makeup, special effects, and stunts all have a major creative voice and contribution to the overall look and feel of the film, and none of those decisions involve much CG.
In fact, most of those department heads are making creative decisions with no idea exactly where the CG components will be placed or how they will eventually look. The VFX supervisor, who is responsible for overseeing all the digital work in a feature film, collaborates with all of these people during filming, but has no direct control over any of those people or departments. Instead, their task is to help guide the rest of the filmmakers to account for the CG elements that will come later, and to carefully document every decision made during plate photography so that the CG can be photo-realistically integrated into those plates during post-production.
In a film with extensive VFX work, the filmmakers are often working with characters, events, and environments that they cannot see. The DP is lighting and framing for things that are not there. Actors perform in front of greenscreens, reacting to situations, and sometimes to other characters, that they cannot see. The director is shooting events that exist only in their imagination. To aid this process, there are many tools, like SimulCam, Ncam, and iPano, that can help the filmmakers visualize CG components and locations in real time. While these tools help the filmmakers visualize the final shot, they do not allow for an iterative process where the CG components can easily be modified while filming. Furthermore, once plate photography is over, it is very difficult to change the plates or the decisions that were made on set, decisions made without the benefit of the CG elements to be added in post-production. In contrast, the development of an animated feature is more of an ongoing process because the same people handle the work all the way through.
On an animated feature, all of those other departments mentioned earlier are part of the CG world. There is no concept of 'plate photography', and production blends more fluidly from prep through final delivery.
For both VFX features and animated features, post-production means shot production and final delivery. While many of the tools used in both cases are the same (3D lighting and animation packages like Maya and Modo, compositing packages like Nuke, and effects software like Houdini), there are significant differences in workflow. In the case of an animated feature, shot production has already been going on for months and has been intertwined with many of the earlier phases of production.
In a VFX-heavy feature, post-production is when the look of all the CG elements is finalized, the CG performances are authored, and shot production both begins and ends. Incorporating CG elements into live-action plates is a significant challenge, as the plates can only be modified to a limited extent before they no longer hold up. Creative compromises often need to be made to make a plate work and thus avoid a costly re-shoot. It is here that the required skill set of feature film VFX artists diverges from that of CG artists working on animated features. While there is still significant overlap, the peculiarities of understanding and replicating plate photography are unique to the VFX side of the industry. And while the VFX artists are constrained by the decisions already made by the on-set department heads, the team supervising the CG work on an animated feature sets the look for the film. They have full control over the lighting, performances, assets, and environments, and can modify any or all of those components as needed while work on the film progresses. Rather than a project split into three disparate phases (pre-production, plate photography, and post-production), a CG animated feature can be worked on organically and, more importantly, holistically, as adjustments can be made to any component throughout the development of the film.
As Virtual Production becomes more prevalent in the film industry, the lines between animated features and live-action features with extensive VFX are blurring, both in terms of tools and approach. Films like Disney's The Jungle Book and The Lion King, with VFX and animation by MPC, and episodics like The Mandalorian, all of which used Virtual Production extensively, no longer fall neatly into one category or the other.
Virtual Production is basically defined as any methodology that allows you to blend live action footage and CG in real time. CG elements are rendered with real-time camera tracking using game engines like Unity and UE4 and then live composited with other elements: plates, actor performances, etc. This blending can be visualized in different ways, sometimes through VR headsets or by displaying it on giant walls of high-res LED screens. Virtual Production allows the filmmakers to actually see what their final shots are going to look like without waiting until post-production. Because the rendering engines work in real time, the filmmakers can receive immediate feedback and adjust lighting, performances, or any of the CG components on the day and thereby get a more seamless final result.
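The live compositing step at the heart of this pipeline is conceptually the standard 'over' operation: each frame, the real-time-rendered CG element, together with its alpha (coverage) channel, is layered onto the live-action plate. A minimal per-pixel sketch in Python follows; the function name and toy pixel values are illustrative and not the API of any particular engine or compositor:

```python
def over(fg, alpha, bg):
    """Standard 'over' operation: blend one CG foreground pixel onto
    one live-action plate pixel. fg and bg are (r, g, b) tuples with
    channels in [0, 1]; alpha is the CG element's coverage in [0, 1]."""
    return tuple(f * alpha + b * (1.0 - alpha) for f, b in zip(fg, bg))

# A 50%-transparent red CG element over a mid-grey plate pixel:
print(over((1.0, 0.0, 0.0), 0.5, (0.5, 0.5, 0.5)))  # (0.75, 0.25, 0.25)
```

A real-time system runs this blend (on the GPU, over full frames) with the CG rendered from a camera pose supplied by live tracking, which is what lets the filmmakers judge the combined image on the day rather than waiting for post-production.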
With sophisticated pre-visualization elements and aggressive implementation, a production team can wrap months of principal and second-unit location/stage photography with a film that is ready for screening. While there is still more 'finish' CG work to be done through a standard post-production workflow, the previz digital assets provide significant visual value to the developing film. Prior to the development of Virtual Production tools, it could take months of postviz and temp work to fill in enough of the VFX to be able to screen the film. There is a common misconception that Virtual Production cuts down on the overall amount of CG work needed for a film, but in fact it just moves much of that work from post-production to pre-production, so that by the time you get to shooting the film, a lot of the CG prep work has already been completed. This allows live-action filmmakers to approach the CG work more like an animated feature. The DP can put on a VR headset and walk around the CG environment to place lights and lens the shots. Multiple department heads can meet in the VR environment to brainstorm ideas and try different setups. Actors can see the environment they are supposed to be in, and the other characters they are supposed to perform against. The filmmaking experience becomes more fluid and is less segmented into individual phases of production.
As the tools continue to improve and more films employ Virtual Production techniques, we will see more crossover between animated and photoreal VFX features: crossover of artists, tools, and working methodologies. We're at the very beginning of this evolution in the industry, and it will be exciting to see what filmmakers are able to achieve with this technology.