Virtual Production in Television and Beyond

This is Part II of a two-part series about virtual production in the film industry. The research in this post comes from a capstone project by Matthew Bernstein, Yingtong Lu, Feng Qiu, and Eesha Rashid, Master of Entertainment Industry Management students at Carnegie Mellon University.

Introduction

As discussed in Part I of this series, virtual production (VP) is the intersection of physical and digital filmmaking, blending video game technology and filmmaking techniques into the pre-production and production process. VP is not just the technology that allows digital and traditional techniques to be integrated, but also the methods through which that technology is used. If implemented in the TV industry, VP could reduce costs because it makes the production process more efficient. Many other industries could also learn from this streamlined process, including live entertainment and the arts.

Why Television?

There is a higher demand for content than ever before, and television studios are scrambling to create must-see TV. Figure 1 shows the most watched digital original series in the United States in the fourth quarter of 2019.

Figure 1. The most watched digital originals in the United States across all streaming platforms. Source: Media Play News.

The shift in consumer taste toward prestige television has sparked a wave of content that has grown in both creative scope and cost. In 2019, Netflix alone increased its number of TV originals by 54% over 2018, adding 371 new television shows to its library, and is estimated to have spent nearly $15 billion on content that year. The cost of individual shows is also rising, as seen with Disney+, which spent $15 million per episode on The Mandalorian and is expected to spend even more on its upcoming Marvel shows. Marvel entries WandaVision, The Falcon and the Winter Soldier, and Hawkeye are pegged to cost closer to $25 million per episode, if not more.

The vast majority of our interview candidates, as well as our survey data, indicate that the increase in cost, the greater creative demand for innovative storytelling, and the immediacy and high churn of television production make television the medium with the most to gain from VP technology. Producers can do the world-building for a virtually produced series in advance, then amortize those upfront costs across all episodes and reuse the same characters and locations, reducing costs by as much as 30% per episode. In general, a television production crew has approximately one week to complete one episode from start to finish, so producers would greatly benefit from cameras and LED walls driven by real-time rendering engines. Furthermore, because of how television budgets are structured, production teams often shoot an entire episode at only two or three locations. With access to LED walls and virtual cameras, they could have as many sets as they need. The Mandalorian leveraged this technology and significantly reduced costs because effects and backgrounds could be rendered ahead of time. But while television has the most to gain from adopting VP, it must pair the technology with an engaging story and the scale and scope that justify its use.
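To make the amortization arithmetic concrete, the sketch below works through a purely hypothetical example in Python. The dollar figures are invented for illustration only; the point is simply that a one-time world-building investment spread across a season can plausibly yield per-episode savings on the order of the 30% figure cited above.

    # Hypothetical illustration of amortizing virtual production world-building
    # across a season. All dollar figures are invented for demonstration.
    EPISODES = 10

    # Traditional approach: each episode pays for its own sets, locations,
    # and post-production effects.
    traditional_per_episode = 10_000_000  # USD, hypothetical

    # VP approach: a one-time investment builds a reusable library of digital
    # environments and characters, plus a lower recurring shooting cost.
    world_building_upfront = 20_000_000   # USD, hypothetical one-time cost
    vp_shooting_per_episode = 5_000_000   # USD, hypothetical recurring cost

    vp_per_episode = vp_shooting_per_episode + world_building_upfront / EPISODES
    savings = 1 - vp_per_episode / traditional_per_episode

    print(f"Traditional: ${traditional_per_episode:,.0f} per episode")
    print(f"Virtual production: ${vp_per_episode:,.0f} per episode")
    print(f"Per-episode savings: {savings:.0%}")  # ~30% with these inputs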

Impact of Virtual Production on Development

The development process encompasses “the creation, writing, organizing and planning stage of a project,” and its deliverables are more narrative and logistical, and therefore harder to pin down.[1] While VP has a considerable, identifiable influence across multiple sectors of production and post-production, its impact on the development stage of a filmed production is the most equivocal and intangible. The introduction of VP technology expands what can be illustrated in the screenplay and story, but whether the story was informed by VP or vice versa cannot be quantified. The one tangible influence VP has on the development phase is when pitchvis (“pitch visualization”) is used to help producers and writers visualize their ideas more creatively. This ultimately helps the studio determine whether it wants to green-light a project and commit to financing the final product.

Impact of Virtual Production on Production

The most significant benefit VP provides to filmmakers is that it gives everyone on set a clear idea of what the final product will look like. On most sets that incorporate computer graphics, actors perform in front of a green screen and react to imaginary scene elements that are added to the film during post-production. As mentioned earlier, VP instead allows actors to react to fully rendered assets displayed on an LED wall behind them rather than having to imagine them against a green screen. The Mandalorian also utilized a moving camera whose images were processed in real time using the Unreal video game engine. A cinematic director who worked on Halo at Microsoft Games stated that because everyone sees “… what the face is going to look like, what the bodies look like, what the environment looks like, everyone has a better sense of… what the final product is going to look like.” This does not just pertain to cinematographers and actors, but to everyone supporting those roles as well. All parties involved see the same thing and can react accordingly.

Some of these tools also enable more creative freedom on set. A manager at a previsualization, postvisualization, and virtual reality company spoke about how having access to simulcam on the set of Real Steel (2011) enabled her team to incorporate a dramatic shot of Hugh Jackman diving and punching: the shot “that made it into the movie… would [not] have had he not been able to see [the robot] in simulcam form and look at that playback.” In most cases, scenes shot using VP match the visual quality of a final render. One post-production company has developed a toolset that uses VP to create visual effects in real time with accurate reflections and lighting. Its system uses photographic plates and 8K cameras to create synthetic, photorealistic CG elements, and combined with the Unreal Engine it renders at 60 frames per second. As the president of the company explains:

That is a lot of data that has to get through and be processed in real-time…what the Unreal Engine allows us to do is put everything in one basket and shake it up, and when it comes out, it looks real…3D, 2D, real-time color, tracking, off-axis rendering, all these things ... happen in real time... we can be adjusting color on the fly, we can be adjusting the lighting on the fly, we can be doing ray tracing on the fly.

This facilitates a more streamlined filmmaking process for film and television productions that have a clear idea of what they want, because it takes the guesswork out of shooting against a green screen and adding set elements during post-production.
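To give a rough sense of the data volume the quote above describes, the back-of-the-envelope calculation below uses our own assumptions about resolution, color sampling, and frame rate (not the company's figures) to estimate the throughput of a single uncompressed 8K camera stream at 60 frames per second.

    # Rough estimate of uncompressed 8K video throughput. The resolution,
    # color-sampling, and frame-rate assumptions are ours, not the company's.
    width, height = 7680, 4320   # 8K UHD pixel dimensions
    bits_per_pixel = 20          # 10-bit 4:2:2 sampling averages ~20 bits/pixel
    fps = 60

    bits_per_second = width * height * bits_per_pixel * fps
    gigabytes_per_second = bits_per_second / 8 / 1e9

    print(f"~{gigabytes_per_second:.1f} GB/s of uncompressed image data")
    # Roughly 5 GB/s per camera stream, before tracking data, color
    # adjustments, and real-time rendering are layered on top.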

While the advent of VP allows for a streamlined physical production process, it also means that on-set roles will need to evolve alongside the technology. For example, productions will not need as many gaffers when lighting comes only from an LED screen and can be adjusted on a virtual set from a handheld tablet. The consensus seems to be that roles will change and begin to overlap with one another rather than disappear entirely. An executive within a virtual production and emerging technologies department cited foley artists as an example of a role that has not disappeared despite the evolution of audio mixing technology. Cinematographers, likewise, will have access to a more robust set of tools than ever before, but this means they must learn to use those tools if they are to continue working. Another executive in the same department explained, “those people still have very similar jobs, they are just going to have to get good at using different toolsets. The skill is still valid, but it’ll morph into something else… the role will evolve.”

It is important to keep in mind that the guilds that represent these positions will need to adjust and grow with the technology. An executive at a video game and software developer told us that the role of the guilds has been, and will continue to be, to help their members “get on with the times. Most of the time, guilds see it as part of their vision to help their members adjust to new toolsets.” As real-time rendering gets closer to the visual fidelity audiences expect from a feature film, we will not be removing on-set positions but rather making those people’s jobs “more like what they were taught in film school,” according to an executive at a previsualization, postvisualization, and virtual reality company. Rather than shooting something and glossing it up with special effects afterward, raw footage rendered in the Unreal Engine matches the visual quality of footage shot with traditional equipment and edited after the cameras stop rolling. Assuming the filmmakers are familiar with VP technology, this cuts down on how much time a given project needs to spend in post-production and results in a streamlined filmmaking process that resembles the fluidity of making a student film.

How Will Studios Adapt to Virtual Production?

While VP reduces the overall cost of film and television production, it is not because the technology itself is less expensive than existing technology; rather, the savings come from the improved efficiency that VP enables. If used effectively, VP technology can lead to long-term cost savings because it allows greater creative control upfront and generates frames with a high level of accuracy. A virtual production engineer says that while costs go down, VP is not seen as a cost-saver but rather as something that “allows frequent iteration, higher value product, for roughly equal-cost.” Because of this, the financial implications of VP stem less from whether a studio uses VP and more from how it is leveraged. According to an executive at a video game software development company, “One of the hurdles virtual production must get over is this misconception that it costs less—what it does is enable more creative freedom early on in the filmmaking process when overall costs are less, but it’s not actually cheaper.”

Another shift for studios is the ability to repurpose assets for various campaigns and endeavors. According to a head of film development, “Eventually, an asset that audiences see on-screen at a movie theater will be the same asset used in marketing materials, in video games, in the TV series adaptation,” so companies can use the same asset in a marketing campaign as in the feature film. VP will also produce more accurate visualization footage, allowing studios to test early versions of a story with target audiences. Studios can gauge how a film plays with test audiences before principal photography begins because “… you’re developing the story in a viewable form earlier on, you can market test your content… in a manner which is digestible for an audience,” according to an executive at a video game software development company. Furthermore, addressing conflicts earlier in the development process limits the need for costly reshoots once on set.

Applications to Other Industries

Beyond investing in education, studios should look to other industries to learn how to further advance this technology. Cloud gaming is one such example. The same technology that Google uses for its Stadia cloud gaming service could give filmmakers cost and creative advantages over traditional production models. Companies need far less physical infrastructure, such as on-premises workstations and the render farms that usually require thousands of computer servers to operate. Adopting cloud computing also improves efficiency because everyone works on the same file in real time instead of saving files onsite and assembling them later. On the creative side, artists can respond to feedback earlier in the process of creating an asset. This also grants smaller companies the ability to render files of the same size and complexity as their larger competitors.

Additionally, in the wake of global disruptions such as Covid-19, studios and productions are trying to adopt safer practices. VP can support social distancing for already reduced crew sizes, although remote production technology such as robotic cameras is likely to have a greater impact than VP; that said, innovations in both could converge. Studios have used the quarantine period to explore what VP has to offer and “are taking the opportunity…[to] look at incorporating virtual production especially [in] an LED workflow,” according to an entertainment production consulting firm. These factors suggest that VP will eventually reach a point where it consistently delivers high quality and becomes commonplace.

Furthermore, VP technology is being implemented across different parts of the entertainment industry and evolving alongside fields such as music, live television, and live events. In-camera, real-time visual effects (VFX) technology and game engines such as Unreal Engine are used not only in filmmaking but also in fields such as architecture, the military, and the auto industry. For example, archviz (architectural visualization, an adaptation of techviz, or tech visualization) allows architects to use the Unreal Engine to visualize their design and decoration plans in real time. Car manufacturers can also use the Unreal Engine and in-camera, real-time VFX to conduct virtual test drives. The impact of other game-related technological advancements, and of VP itself, on industries beyond film and television will continue to surface as people outside the film industry learn more about the technology. And while many arts organizations are presenting performances digitally because of the Covid-19 pandemic, this could be an opportunity to experiment with real-time visual effects.

Conclusion

Garnering interest in the technology is key to building its use. Advancing and further implementing VP hinges on building an educated workforce, so those who are passionate about it should invest in teaching filmmakers how to use it successfully and to its fullest potential. VP technology has the potential to dramatically improve the speed and efficiency of the film and television-making process. Real-time rendering engines and LED walls have already drastically decreased the amount of time required to deliver a finished product to a studio. But there are several hurdles that VP must overcome. Its tools are difficult to learn, and there are still concerns about how much it costs, how long it takes to implement effectively, and whether every production needs to use it. However, there is substantial evidence that, when used effectively, VP reduces the cost and time required to make a filmed project and enables greater creative freedom. It is only a matter of time before VP becomes a central facet of the film and television-making process, and experts predict that these technologies will be adopted by nearly every major studio within the decade.

[1] Webb, M. (2019, December 17). Understanding the 5 Stages of Indie Film Production. Indie Film Hustle. Retrieved from https://indiefilmhustle.com/5-stages-indie-film-production/