AMT Lab @ CMU

An Introduction to Virtual Production and Its Use in the Entertainment Industry

This is Part I of a two-part series about virtual production in the film industry. The research in this post comes from a capstone project by Matthew Bernstein, Yingtong Lu, Feng Qiu, and Eesha Rashid, Master of Entertainment Industry Management students at Carnegie Mellon University.

Introduction

Virtual production (VP) is “a broad term referring to a spectrum of computer-aided production and visualization filmmaking methods” (Spectre Studios). It is not just the technology itself but also the methods through which that technology is used. In Parts I and II of this series, we investigate both the technology and the methodologies of VP in its broadest definition. The types of VP we will discuss in this post are visualization, performance capture, and LED walls. While VP has benefited the film industry specifically, as innovation continues, other fields of entertainment (such as live entertainment, fashion, and news reporting) are also leveraging its capabilities. The arts and entertainment sector could benefit from learning more about these methods and considering how to apply them to create engaging virtual experiences.

What is Virtual Production?

Virtual production (VP) is the unique intersection of physical and digital filmmaking, blending video game technology with filmmaking techniques across the pre-production and production process. Its earliest iterations can be traced to advances in filmmaking technology. Peter Jackson’s The Lord of the Rings: The Fellowship of the Ring (2001) used virtual reality (VR) goggles and virtual cameras to plan camera moves. James Cameron took it a step further with Avatar (2009), as seen in Image 1, creating a bioluminescent species and exotic environments with a motion capture stage and simulcam. Simulcam is a VP tool used to “superimpose virtual characters over the live-action in real-time and aid in framing and timing for the crew” (Kadner 2019). Jon Favreau continues to lead the charge with groundbreaking film and television projects such as The Jungle Book (2016), The Lion King (2019), and The Mandalorian (2019), designing and altering photorealistic environments in real time. As the technology matures, other fields of entertainment, such as live entertainment, fashion, and news reporting, are also leveraging its capabilities (Image 2).

Image 1: Facial and motion capture in Avatar. Source: ComingSoon.net.

Image 2: Virtual production used in live reporting. Source: KnowTechie.

Virtual Production Changing Workflows

In the traditional filmmaking pipeline, the process is strictly divided into development, pre-production, production, and post-production. While development is ongoing, each phase follows an assembly-line process driven by the strict deliverables required to complete a filmed project. There is little fluidity or iteration between phases without significant opportunity and financial costs, and each phase carries uncertainty and inconsistency for filmmakers. The Virtual Production Field Guide states, “The iteration process in traditional production is wasted on finding a common starting point instead of refining an agreed-upon shared vision.” For movies that rely heavily on visual effects and computer-generated imagery (CGI), directors are forced to direct without knowing what the characters will ultimately look like, and there are often color inconsistencies between physical production and post-production when using a green screen. Furthermore, much of the visual effects and animation finalization is deferred to post-production. By using VP tools, however, cinematographers can gauge color and lighting with greater accuracy than with a green screen. Introducing digital assets via VP technology during physical production reduces the reliance on post-production and permits creative evolution and adjustment on set.

VP enables freedom for iteration and experimentation earlier in the pipeline than previous technology allowed. When filmmakers are equipped with VP technology, much of the refining that typically occurs during post-production can happen as early as the development stage. Digital and visual assets created during any stage are cross-compatible across every phase of a filmed project; they can be modified in real time and repurposed to any degree while maintaining visual consistency. Editors can work with complex visual effects sequences much as they would with traditional non-effects scenes. The uncertainty that comes with deferring aesthetic choices to post-production is reduced, because the imagery created on set is close, if not identical, to the final output. Figure 1 illustrates how VP affords filmmakers a more collaborative filmmaking process. During principal photography, adjustments can be made while shooting instead of problems being discovered in post-production, reducing costly reshoots. The result is a more collaborative and cohesive production process than earlier technology permitted. Filmmakers are now equipped to visualize details accurately and use them to inform creative choices and experimentation. Ultimately, VP makes it easier for filmmakers to create a cohesive final product.

Figure 1: Virtual production changing the pipeline. Source: The Virtual Production Field Guide.

Types of Virtual Production

As noted above, virtual production “combines virtual and augmented reality with CGI and game-engine technologies to enable production crews to see their scenes unfold as they are composed and captured on set” (Spectre Studios). Epic Games, a video game software developer and publisher, has applied its real-time engine, Unreal, to help filmmakers blend video game capabilities with filmmaking techniques; its main competitor is the Unity engine. When technologies from these companies are combined with traditional filmmaking techniques, the filmmakers are employing VP. Consequently, VP is not just the technology itself but also the methods through which it is used. The VP tools employed today fall broadly into three categories: visualization, performance capture, and the full, live LED wall. Each has its own subset of tools and methodologies, but the common thread connecting them is the game engines used to operate them and their ability to render in real time. Each of these technologies provides distinct benefits at each stage of production.
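To make that “real-time” thread concrete, the sketch below shows the kind of frame loop a game engine runs at the heart of any VP tool: every frame, it samples the physical camera’s tracked pose and redraws the virtual scene from that pose within a fixed time budget. This is a minimal conceptual sketch in Python, not actual engine code; the function names, pose values, and 24 fps target are illustrative assumptions.

```python
import time

TARGET_FPS = 24  # a common cinema frame rate; purely illustrative

def read_tracked_camera():
    """Stand-in for a camera-tracking feed. A real stage streams the
    physical camera's position and rotation from an optical or
    encoder-based tracking system many times per second."""
    return {"position": (0.0, 1.8, -4.0), "rotation": (0.0, 0.0, 0.0)}

def render_frame(camera):
    """Stand-in for the engine's renderer. In practice, an engine such as
    Unreal or Unity redraws the virtual scene from this camera pose in
    a few milliseconds."""
    pass

def run(num_frames=240):
    frame_budget = 1.0 / TARGET_FPS
    for _ in range(num_frames):
        start = time.monotonic()
        camera = read_tracked_camera()  # 1. sample the tracked camera pose
        render_frame(camera)            # 2. redraw the scene from that pose
        # 3. sleep off any remaining budget so output stays locked to rate
        time.sleep(max(0.0, frame_budget - (time.monotonic() - start)))

run()
```

Because the whole loop completes faster than a frame of film, changes to the scene appear on set effectively instantly, which is what distinguishes VP from traditional render-and-wait visual effects work.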

Visualization

Most filmmakers are familiar with visualization, which employs prototype imagery “to convey the creative intent of a shot or sequence” (Kadner 2019). Visualization takes a variety of forms, including pitchvis, previs, virtual scouting, techvis, stuntvis, and postvis. (“Vis” in each of these terms stands for “visualization.”) Pitchvis is used for a project in development to garner interest from studios and investors; filmmakers now use real-time engines to visualize their creative intent through imagery and trailers. “Some examples of movies which were greenlit in part due to a trailer created as pitchvis include Godzilla (2014), Men in Black 3 (2012), World War Z (2013), Jack the Giant Slayer (2013), and Oz: The Great and Powerful (2013)” (Kadner 2019). Previs, by contrast, is employed by every major studio to experiment with details such as staging and art direction. It is often used to illustrate a scene and inform choices in stage direction, camera movement, and editing. The Third Floor, a previs, postvis, and VR company, used Unreal to render select scenes in Game of Thrones during pre-production (Image 3). According to Unreal Engine, “With multiple departments, VFX [visual effects] vendors, and units working on the show, the studio’s mockups helped define and communicate the show makers’ creative and technical vision.”

Image 3: Previs scene rendered in Unreal Engine for Game of Thrones. Source: Unreal Engine.

Virtual scouting is a digitized method of identifying locations and shooting within virtual sets. Location scouts can browse Unreal’s database to compare various regions and landscapes without physical travel, and crew members can interact with digital elements in pre-production through a head-mounted display (HMD), such as virtual reality goggles, or a computer screen. In some cases, crew members use VR and virtual cameras to build sets, shoot sequences, experiment with lenses, and plan out shots. This allows filmmakers to filter out redundant scenes and assets before they are actually created (Image 4).

Image 4: Location scouting with virtual reality in Game of Thrones. Source: Unreal Engine.

Techvis combines digital assets with captured footage to map out camera moves, placement, and lenses, gauging whether virtual choices are physically achievable. Stuntvis is an extension of techvis designed to ensure that stunts integrate accurately with live-action footage; Unreal has been used to test the physics and choreography of various stunts. Finally, postvis integrates live action with temporary visual effects as placeholders for computer-generated (CG) shots in post-production, as sketched below. For example, “Halon Entertainment provided postvis shots on War for the Planet of the Apes by adding temporary versions of CG apes as well as military vehicles and artillery into many live-action plates, along with fully CG shots for shots intended to be completely CG” (Kadner 2019). The progression and final output can be seen in Image 5. This ultimately helps guide directors and editors with sequencing during production, since they can mark their scenes with more developed shots rather than working with partial sets.
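At its core, dropping a temporary CG element into a live-action plate is an alpha “over” composite: the placeholder’s pixels are blended onto the footage according to their transparency. The Python sketch below illustrates just that one operation under stated assumptions; the plate and the flat-shaded placeholder are synthetic stand-ins, and real postvis tools layer tracking, lighting, and editorial context on top of this.

```python
import numpy as np

def composite_over(cg_rgb, cg_alpha, plate_rgb):
    """Standard 'over' blend with straight (non-premultiplied) alpha.
    All arrays are float32 in [0, 1]; cg_alpha has shape (H, W, 1)."""
    return cg_rgb * cg_alpha + plate_rgb * (1.0 - cg_alpha)

# Hypothetical 1080p live-action plate (random noise stands in for footage)
h, w = 1080, 1920
plate = np.random.rand(h, w, 3).astype(np.float32)

# A flat-shaded temporary CG element occupying one region of frame, 80% opaque
cg = np.zeros((h, w, 3), dtype=np.float32)
alpha = np.zeros((h, w, 1), dtype=np.float32)
cg[400:700, 800:1200] = (0.2, 0.6, 0.2)
alpha[400:700, 800:1200] = 0.8

postvis_frame = composite_over(cg, alpha, plate)  # ready for editorial review
```

Even this crude blend gives editors a frame with the CG element in roughly the right place and scale, which is what makes postvis shots usable for timing and sequencing decisions long before final effects arrive.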

Performance capture

Performance capture, as pioneered by films such as Avatar, records the movements of objects and/or actors to animate digital models. It involves the use of markers to help capture subtle facial expressions and body movements in a live-action environment; when those captured performances need to be viewed against live action in real time, the simulcam technique described above comes into play. The main subsets of performance capture are motion capture, facial capture, and full-body animation. A simplified sketch of how captured joint data can drive a digital skeleton follows Image 5.

Image 5: Halon postvis with real-time rendering and final output on War for the Planet of the Apes. Source: IAMAG Inspiration.
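Once marker positions have been solved into joint angles, animating the digital model is a forward-kinematics problem: each captured rotation is applied to a bone relative to its parent, and the chain of transforms places every joint of the character. The Python sketch below shows that idea for a single planar arm; the bone lengths and angles are hypothetical one-frame values, and production rigs use full 3D rotations across dozens of joints.

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z-axis; a planar simplification of the full
    3D joint rotations a real motion-capture solver produces."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def pose_skeleton(bone_lengths, joint_angles):
    """Forward kinematics for a simple chain (e.g., shoulder-elbow-wrist):
    each captured joint angle rotates the next bone relative to its parent,
    and accumulating the transforms yields world-space joint positions."""
    positions = [np.zeros(3)]
    world_rot = np.eye(3)
    for length, angle in zip(bone_lengths, joint_angles):
        world_rot = world_rot @ rot_z(angle)
        positions.append(positions[-1] + world_rot @ np.array([length, 0.0, 0.0]))
    return positions

# One frame of (hypothetical) solved capture data: angles in radians
arm_joints = pose_skeleton(bone_lengths=[0.30, 0.28], joint_angles=[0.6, -0.4])
```

Running this per frame of captured data is, in miniature, how an actor’s recorded performance moves a digital character’s limbs.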

Live LED wall

A live LED wall is VP in its purest form. Imagery is generated with real-time rendering engines and displayed on an LED screen behind a physical set (Image 6). In contrast with a green screen, talent can interact with and witness the scene exactly as the crew does as the sequence of events unfolds on the wall. Lighting and imagery can be adjusted on a whim, the final-pixel imagery achieves a level of photorealism that can be captured in camera, and expensive on-location shoots are not required. As the camera moves, the perspective projected onto the LED wall shifts in relation to the camera’s position (see the sketch following Image 6). According to The Virtual Production Field Guide, “All of the natural reflections and lighting from the screen provide important artistic cues and enhance the realism of the imagery, compared to the typical struggle to avoid contamination from the green screen’s color spilling onto the subject as well as creating unwanted reflections.” These features make the live LED wall an extremely versatile tool that enables filmmakers to realize their creative vision in real time. More and more productions are adopting these walls; Industrial Light & Magic’s (ILM) LED wall system, StageCraft, was used to create the sets for over half the scenes in The Mandalorian.

Image 6: Live LED wall on the set of The Mandalorian. Source: Nerdist.
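The parallax effect described above comes from re-deriving the camera’s viewing frustum against the fixed wall every frame. The Python sketch below computes the asymmetric (off-axis) frustum bounds for a flat, axis-aligned wall from a tracked camera position; this is a simplified textbook formulation, not ILM’s StageCraft pipeline, and the wall dimensions and camera pose are hypothetical.

```python
import numpy as np

def off_axis_bounds(eye, wall_min, wall_max, wall_z, near=0.1):
    """Per-frame frustum bounds for an axis-aligned LED wall at z = wall_z.
    eye is the tracked camera position; wall_min/wall_max are the wall's
    lower-left and upper-right corners in (x, y). As the camera moves,
    these bounds shift, which is what produces correct parallax on the wall."""
    d = wall_z - eye[2]  # camera-to-wall distance along the view axis
    left   = (wall_min[0] - eye[0]) * near / d
    right  = (wall_max[0] - eye[0]) * near / d
    bottom = (wall_min[1] - eye[1]) * near / d
    top    = (wall_max[1] - eye[1]) * near / d
    return left, right, bottom, top

# Hypothetical 6 m x 3 m wall placed 4 m away; camera tracked at eye level
l, r, b, t = off_axis_bounds(eye=np.array([0.5, 1.7, 0.0]),
                             wall_min=(-3.0, 0.0), wall_max=(3.0, 3.0),
                             wall_z=4.0)
```

Feeding these bounds into a standard off-axis projection each frame makes the wall read as a window onto the virtual environment from the camera’s exact point of view, rather than a flat backdrop.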

Conclusion

As we discuss the evolution of VP technologies and their impact on the pipeline, we must recognize that their utility and benefits vary by type. Much of the preexisting research explores the application of individual subset technologies on specific films and sets. Despite the nuances each subset of VP (visualization, performance capture, and LED walls) exhibits, the underlying technology that shapes these subsets and their output is the same. Arts and entertainment leaders can learn from these techniques as many live performances pivot to a digital space.

Resources

Coldewey, Devin. “How 'The Mandalorian' and ILM invisibly reinvented film and TV production.” TechCrunch, February 20, 2020. https://techcrunch.com/2020/02/20/how-the-mandalorian-and-ilm-invisibly-reinvented-film-and-tv-production/.

Desowitz, Bill. “'Avatar': Weta Begins Innovative VFX on James Cameron's Four Planned Sequels.” IndieWire, July 31, 2017. https://www.indiewire.com/2017/07/avatar-weta-james-cameron-sequels-1201861720/.

Kadner, Noah. “The Virtual Production Field Guide.” Epic Games, July 25, 2019. https://cdn2.unrealengine.com/Unreal+Engine%2Fvpfieldguide%2FVP-Field-Guide-V1.2.02-5d28ccec9909ff626e42c619bcbe8ed2bf83138d.pdf.

Robertson, Barbara. “The Fellowship of the Ring.” CGW 24, no. 12 (December 2001). http://www.cgw.com/Publications/CGW/2001/Volume-24-Issue-12-December-2001-/The-Fellowship-of-the-Ring.aspx.

Spectre Studios. “Virtual Production.” n.d. Accessed April 12, 2020. https://www.spectrestudios.com.au/our-tech.

Unreal Engine. “Virtual production on the battlegrounds of ‘Game of Thrones.’” March 18, 2020. https://www.unrealengine.com/en-US/spotlights/virtual-production-on-the-battlegrounds-of-game-of-thrones.

Weta Digital. “Virtual Production.” n.d. https://www.wetafx.co.nz/research-and-tech/technology/virtual-production/.