The Four Technologies Used in Virtual Production

Written by Sandra Cota Martinez

Cover photo source: VFX Voice, courtesy of Lionsgate, Epic Games and Alex Nice.

The visual effects (VFX) industry has come a long way in the past 30 years: from the first fully computer-generated character in “Young Sherlock Holmes,” released in 1985, to full-environment productions such as the photorealistic “Lion King” remake of 2019. The evolution of VFX technology has mainly been driven by large studios and the creative minds behind the stories who, seeing the potential of powerful real-time engines and the exponential growth in demand, have even created in-house labs to develop these tools to their full potential.

Figure 1: Video explaining virtual production and its impact on storytelling. Source: Unreal Engine’s YouTube channel.

Studios have discovered the importance of better production pipelines that create a more collaborative environment between directors, gaffers, and artists. This way, they can bring all of their ideas together to create a shot that is much closer to the final result without having to run it through post-production multiple times.

This article will analyze four technologies that have been created to tackle these needs, as well as how they work, who is implementing them in their productions, and how they will disrupt the industry.

Virtual Production Technologies

This research will focus on four main technologies and how they work together to bring virtual production to life: LED display walls, Unreal Engine, motion-tracking cameras, and virtual set scouting.

LED display walls

LED display wall technology came into public knowledge after the massive success of “The Mandalorian.” When ILM released the behind-the-scenes footage for the series, it focused mainly on showcasing its LED display technology, StageCraft, also known as “The Volume.” Because the walls display virtual sets rendered in real time, actors and directors could interact directly with the environment, resulting in a better finished product.

Figure 2: Video discussing virtual production as used in “The Mandalorian.” Source: ILMVFX’s YouTube channel.

This technology is replacing the traditionally used blue and green screens. In the conventional pipeline, shots would need to be sent to the VFX department to be transformed into the intended environment, and the lighting of the plates would have to be fixed to match this new world. Because everything changes directly on the LED screen, the reflections, background, and props pick up the same lighting and match perfectly, without the extra steps of the original VFX pipeline.

Unreal Engine

This LED technology would never be possible without the power of Epic Games’ Unreal Engine, the “most powerful real-time 3D creation platform.” Not only are photorealistic rendering and real-time ray tracing possible, but there is also the added bonus of accurate parallax between the camera and the objects while filming. Another extraordinary feature is that because Unreal is a video game engine, production studios can place a CGI character in the scene and play its animation live, giving the actor a proper action with which to interact.
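
To make the parallax idea concrete, here is a minimal sketch of the “off-axis projection” math that in-camera VFX systems are built on, written in Python with NumPy rather than Unreal’s actual API (the function names and the flat, rectangular wall are my simplifying assumptions; real systems such as Unreal’s nDisplay also handle curved volumes, color calibration, and frame synchronization). Every frame, the background is re-rendered through a frustum defined by the tracked camera’s position and the physical corners of the LED wall, which is what makes the perspective shift correctly as the camera moves.

    import numpy as np

    def normalize(v):
        return v / np.linalg.norm(v)

    def off_axis_projection(pa, pb, pc, eye, near, far):
        # Perspective matrix for a flat screen with corners pa (lower-left),
        # pb (lower-right), pc (upper-left), seen from the tracked camera
        # position `eye`. Based on Kooima's generalized perspective projection.
        vr = normalize(pb - pa)            # screen "right" axis
        vu = normalize(pc - pa)            # screen "up" axis
        vn = normalize(np.cross(vr, vu))   # screen normal, pointing at the eye

        va, vb, vc = pa - eye, pb - eye, pc - eye
        d = -np.dot(va, vn)                # eye-to-screen distance
        l = np.dot(vr, va) * near / d      # frustum extents at the near plane
        r = np.dot(vr, vb) * near / d
        b = np.dot(vu, va) * near / d
        t = np.dot(vu, vc) * near / d

        # Standard asymmetric (off-axis) frustum; the full method also rotates
        # world space into the screen's basis, omitted here for brevity.
        return np.array([
            [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
            [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
            [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
            [0.0,          0.0,          -1.0,                   0.0]])

    # A hypothetical 6 m x 3 m wall: as the tracked camera moves, the frustum
    # (and so the rendered background) shifts, producing correct parallax.
    pa = np.array([-3.0, 0.0, 0.0])   # lower-left corner, meters
    pb = np.array([ 3.0, 0.0, 0.0])   # lower-right corner
    pc = np.array([-3.0, 3.0, 0.0])   # upper-left corner
    eye = np.array([0.5, 1.7, 4.0])   # tracked camera position
    P = off_axis_projection(pa, pb, pc, eye, near=0.1, far=100.0)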

Motion-tracking cameras

Figure 3: An exploration of real-time, in-camera VFX. Source: Unreal Engine’s YouTube channel.

Next, we have motion-tracking cameras. As with Unreal Engine, they are crucial to making LED screens as powerful as they are. They are the other half of the technology needed to achieve proper parallax in the scene, reading the physical point at which the camera is located and translating this information into a 3D camera that feeds into Unreal.

That way, the engine knows what to project and how the light should be reflected for that position. The tracking data is also recorded, so if a shot needs to go into post-production, the camera’s information can be sent along directly.
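
As a rough illustration of that handoff, the sketch below (again plain Python with NumPy and a made-up sample pose, not a real tracking SDK) shows the essence of what happens each frame: the tracker reports the physical camera’s position and orientation, and that pose is converted into the view transform the engine’s virtual camera renders through.

    import numpy as np

    def quat_to_matrix(w, x, y, z):
        # Convert a unit quaternion (w, x, y, z) into a 3x3 rotation matrix.
        return np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

    def view_matrix(position, quaternion):
        # Build the 4x4 world-to-camera (view) matrix from a tracked pose.
        # The engine renders the virtual set through this matrix, so the
        # background on the LED wall always matches the physical camera.
        R = quat_to_matrix(*quaternion)            # camera-to-world rotation
        view = np.eye(4)
        view[:3, :3] = R.T                         # inverse rotation
        view[:3, 3] = -R.T @ np.asarray(position)  # inverse translation
        return view

    # Hypothetical sample from a tracking system: position in meters on the
    # stage, orientation as an (approximately) unit quaternion (w, x, y, z).
    pose = {"pos": (1.2, 1.6, 3.5), "quat": (0.981, 0.0, 0.195, 0.0)}
    V = view_matrix(pose["pos"], pose["quat"])
    # Recording this stream of poses per frame is what lets the camera move
    # be handed straight to post-production when a shot still needs VFX work.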

Virtual set scouting

Figure 4: Video explaining virtual set scouting. Source: The Third Floor’s YouTube channel.

The last technological component of virtual production that I will address is virtual set scouting. The main purpose of this technology is to create a 3D environment of a set that hasn’t yet been built in real life and to make sure every element is ideal for the production. Using virtual reality, the director and the crew can stand in this computer-generated environment and pre-visualize the shots they are planning to create. That way, any needed change can be discovered before the actual set is built, saving the production time and money.

Technology implementation

The adoption of these technologies has been on the rise, especially since the Covid-19 pandemic. The possibility of having virtual environments that can be controlled remotely, while letting directors, directors of photography, and artists work together, has been incredibly useful during these times.

These technologies have been explored mainly in bigger productions. Shows like “The Mandalorian,” discussed above, and “Game of Thrones” pioneered some of these technologies. Now, with their spread, multiple new productions have taken advantage of them. There are already breakdowns and articles for movies like “The Midnight Sky,” “Thor: Love and Thunder,” and “The Batman.”

However, smaller studios are also starting to implement virtual production technologies in their pipelines. An example of this is El Ranchito VFX, which rebuilt Valencia’s City of Arts and Sciences for the third season of HBO’s “Westworld.” The studio used laser scans and photogrammetry to recreate the complex in Unreal Engine. Having the environment in 3D overcame the physical set’s limitations and helped carry out the director’s vision.

Disrupting the industry

There are multiple reasons to believe that these virtual production technologies are going to disrupt the entertainment industry as it exists today. For starters, they bring a much more convenient and advanced method of pre-visualizing shots and a more precise way to carry out the director’s vision. The success of shows like “The Mandalorian,” on which between 40% and 50% of the shots captured on set were usable as final shots, will push other big production companies to follow its path. Decreasing production time and cost and creating an environment that can be modified on the go are also great incentives for this to happen.

However, although these techniques are here to revolutionize the VFX industry and strengthen virtual production, this does not mean that the current pipeline will disappear any time soon. The extra value added by these advancements is not needed for every kind of shot, so those in charge of a new production need to be knowledgeable enough to select the right approach. Another reason adoption may be slow is that these technologies are currently very expensive; however tempting it is to jump to the newest technology, the cost of using it might not be worth the investment for every project.

Conclusion

The use of virtual production is going to keep rising as time passes. Given all of the benefits it entails, more and more shows, movies, and even advertising productions are going to start relying on it. This means that, with time, the demand for these technologies will increase, and the arrival of new developers will help bring the cost of production down. The technology will become common practice, and visual effects artists will be trained to perform their duties in real time on set.

Artists' jobs will not be directly impacted. What will happen is that, instead of being focused on post-production, their skills will be needed during pre-production and production. They will need to be more decisive and quicker in implementing changes, but that is a skill that can be developed during their training.
