Cast & Crew Blog

On the Horizon: Virtual Production and Entertainment Part One

Written by Michael Consiglio | Sep 21, 2022 5:55:00 PM

Traditional projects begin during preproduction with sights set on the light of postproduction at the end of a creative tunnel. But what if there were a better way? Or, at least, a different way, one that allowed for increased versatility and creative review? That’s the idea behind virtual production, a process that folds the timeline onto itself, starting with VFX work and making it possible to keep refining every manufactured element during the full production lifecycle.

At first, the term “virtual production” seems like it could imply any number of things. Computers? Animation? Newfangled technological wizardry? Well, yes to all three. But in truth, virtual production (we’ll call it “VP” for short) is a complex idea with far-reaching implications. The process itself speaks to the industry’s growing interconnectedness with technology. By integrating digital processes and tools earlier (often at the conceptual stage) and then leveraging them throughout the full production lifecycle, today’s filmmakers can enhance the ways they create their magic.

Think of the standard production timeline, which dates to the industry’s earliest days: it has always been (mostly) linear, with creative decisions locked in stage by stage. The look of the thing. The feel of the thing. VP gives creatives the ability to refine every element of their work at every step of the process.

Imagine you’re making a science fiction film in 1960. Wonderful set builders have erected a spectacular spaceship for your cast to inhabit. The windows are likely either still matte paintings of the galaxy or blue screens to be replaced in post, so everyone on set has to imagine what the finished film might look like. Fast forward to today, and everything you see out those portholes is already available on set. The stars twinkle, other spaceships fly by, and your cast looks outside with wonder as if they are truly spacefarers.

Any virtual production process combines physical and digital elements using a variety of hardware and software, such as giant LED walls that are photographed directly in camera. The technology takes advantage of powerful processing tools to create a real-time virtual scene for filmmakers to capture, whether it’s something as simple as stars in the sky or as complex as the open ocean for a show set on the water—which was the case with Netflix’s production of 1899. The series, which also features a massive steamship that doesn’t exist in the real world, was set to film all over Europe before the Covid pandemic hit. When plans had to change, virtual sets were the answer, bringing those European locations directly to set. Of course, that wouldn’t have been possible without software powerful enough to dynamically process these environments—software like the Unity Engine, Helios, and the very popular Unreal Engine.

Epic Games’ ubiquitous Unreal Engine is currently the most popular software in the virtual production arena. The system got its start as a game development platform, but it is extremely versatile and has been used for everything from visual effects to architectural design work. Like Unity and other engines, it is the means by which effects artists can build a complete world that renders in real time, including elements like light, shadow, and organic matter (such as people and plants). On a virtual production stage, it essentially runs everything on set, creating a real-time feedback loop that lets filmmakers participate creatively from moment to moment.
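
To make that “real-time feedback loop” a little more concrete, here is a minimal, purely illustrative sketch in Python. It is not Unreal Engine code; the Scene, Tracker, and Wall classes and their methods are hypothetical stand-ins for whatever a given stage actually uses. The idea is simply that, every frame, the system reads where the physical camera is, re-renders the virtual world from that viewpoint, and pushes the result to the LED wall before the next frame is captured.

```python
# Purely illustrative sketch of the per-frame "feedback loop" on a virtual
# production stage. All class and method names are hypothetical placeholders,
# not a real engine or camera-tracking API.
import time

TARGET_FPS = 24  # wall content is refreshed at (a multiple of) the camera's frame rate

def run_stage_loop(scene, camera_tracker, led_wall, frames=3):
    frame_time = 1.0 / TARGET_FPS
    for _ in range(frames):
        start = time.monotonic()
        pose = camera_tracker.current_pose()   # where the physical camera is right now
        image = scene.render(view_pose=pose)   # re-render the virtual world from that viewpoint
        led_wall.display(image)                # update the wall before the next frame is shot
        # stay locked to the target frame rate
        time.sleep(max(0.0, frame_time - (time.monotonic() - start)))

# Minimal stand-ins so the sketch runs end to end:
class Tracker:
    def current_pose(self):
        return (0.0, 1.5, -4.0)  # x, y, z of the camera in meters

class Scene:
    def render(self, view_pose):
        return f"frame rendered from {view_pose}"

class Wall:
    def display(self, image):
        print(image)

run_stage_loop(Scene(), Tracker(), Wall())
```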

When we talk about the current state of virtual production, we’re generally describing one of four use cases. First up is visualization, which means using 3D VFX assets to envision and plan the work ahead. The most common application here is previsualization: planning the full scene in multidimensional detail before filming begins. Despite what you might hear from experienced VFX professionals, these services have yet to be fully adopted and are not yet widely available. To date, only a handful of VFX providers and studios have earned a reliable reputation for providing these services to Hollywood’s major players. If you’re making a giant blockbuster or tentpole, there are only so many VFX companies that can previs your work.

Then, there’s performance capture, which describes capturing an actor’s movements for use in animating VFX assets. As we mentioned in our piece on volumetric capture, versions of this technique date back to the early motion capture days of the 1980s. Over the years, the technology involved has improved considerably, with smaller, less cumbersome hardware and increasingly automated software. While still incredibly challenging, performance capture has been put to great use by skilled artists and technicians on projects like 2017’s War for the Planet of the Apes and the ever-popular Game of Thrones.

Next up is hybrid camera, which allows filmmakers to composite digital VFX with live-action camera footage in real time. As commonplace as composite work has been in mainstream cinema (whether it’s blue screen, green screen, or even the modest superimpositions of 1903’s The Great Train Robbery), certain unavoidable compromises must be made when shooting things piecemeal. Both performers and filmmakers are forced to work in the moment while doing their best to imagine the final product. Some studios prefer this vagueness, as it helps preserve a sense of surprise for everyone—including the cast—but most performers would likely speak to the frustrations of acting against a scene partner who won’t be added until post. Weta’s Simulcam—a fusion of a real and virtual camera developed for use on James Cameron’s Avatar—gave creators the ability to instantly superimpose actors onto a virtual simulation, providing a better understanding of the scene as a whole. Every department had eyes on some semblance of the final product, and actors benefited from a preliminary view of their fully formed character performances.
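
For readers curious what “superimposing actors onto a virtual simulation” boils down to computationally, here is a heavily simplified, hypothetical sketch. It is not Weta’s actual Simulcam pipeline; the green_matte and superimpose helpers are made up for illustration. Each live frame shot against green is keyed to produce a matte, then blended over the matching frame rendered from the virtual scene, so a monitor can show actor and digital environment together while the take is still happening.

```python
# Heavily simplified, hypothetical sketch of real-time superimposition:
# chroma-key a live frame shot against green, then blend it over a frame
# rendered from the virtual scene. Real systems are far more sophisticated.
import numpy as np

def green_matte(frame, threshold=60):
    """Rough matte: pixels whose green channel dominates are treated as background."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    background = (g - np.maximum(r, b)) > threshold
    return (~background).astype(np.float32)  # 1.0 where the actor is, 0.0 where green is

def superimpose(live_frame, virtual_frame):
    alpha = green_matte(live_frame)[..., None]
    return (alpha * live_frame + (1.0 - alpha) * virtual_frame).astype(np.uint8)

# Tiny stand-in frames: a "green screen" live plate and a star-field render.
live = np.zeros((4, 4, 3), dtype=np.uint8)
live[...] = (0, 255, 0)                                         # all green
live[1:3, 1:3] = (200, 150, 120)                                # "actor" pixels
virtual = np.random.randint(0, 255, (4, 4, 3), dtype=np.uint8)  # rendered background
print(superimpose(live, virtual))
```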

The most popular use case is currently LED live action, which brings us back to our 1960s space flick example from earlier. In this increasingly common scenario, LED walls (made up of hundreds of panels) displaying finished, high-quality effects are used in place of green screens. Performers are then shot against these walls, and everything is captured at once in camera with no need for additional compositing. This method is a natural extension of the 2D video screen projection techniques the industry has used for decades, though LED now allows filmmakers to cast light for accurate reflections. And since the images on the LED wall can be continuously refined and tweaked, the setup allows for more versatility on set. In fact, when 2022's The Batman was forced to shut down due to the Covid-19 pandemic, filmmakers opted to move indoors for certain scenes, bringing the brilliant Gotham City skyline to life with the help of an LED wall.

On the surface, these advancements may come across as the newest, fanciest way to use high-tech toys to make movies. However, the minutiae tell a larger story. If you’re a filmmaker trying to breathe life into a scene, you need control over every element. Virtual production provides a dynamic parallax effect, ensuring that the images on a virtual wall shift perspective as the camera does, creating the illusion of a true location on a controlled stage. With live camera tracking, a director and cinematographer can adjust their shots however they choose, and software like Unreal Engine will track the physical camera and adjust the LED imagery in a way that translates flawlessly through the lens. In a somewhat ironic turn, all this technological advancement has given filmmakers back a more traditional approach: capturing a natural image within the camera itself. Shoot up, shoot down, shoot closeups or wide coverage … the LED wall and its software systems react in a way that makes older green screen workflows feel woefully inadequate.
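
As a rough illustration of the parallax idea, the sketch below uses simplified, hypothetical math rather than any engine’s actual implementation (the point_on_wall helper and the coordinates are invented for this example, and the wall is treated as a flat plane). The question it answers: given the tracked position of the physical camera and a virtual object that is supposed to sit some distance behind the wall, where on the wall must that object be drawn so it lines up correctly through the lens? The answer is where the camera-to-object sight line crosses the wall plane, which is exactly why the wall imagery has to shift whenever the camera moves.

```python
# Simplified, hypothetical parallax calculation for a flat LED wall.
# The wall is the plane z = 0 (units in meters); the camera sits on the stage
# at z < 0 and virtual objects live "behind" the wall at z > 0.

def point_on_wall(camera_pos, virtual_point):
    """Where the sight line from the camera to a virtual point crosses the wall plane (z = 0)."""
    cx, cy, cz = camera_pos
    vx, vy, vz = virtual_point
    t = -cz / (vz - cz)  # fraction of the way along the sight line where z reaches 0
    return (cx + t * (vx - cx), cy + t * (vy - cy))

camera_a = (0.0, 1.5, -4.0)  # camera 4 m in front of the wall
camera_b = (1.0, 1.5, -4.0)  # same camera dollied 1 m to the right

distant_star = (0.0, 1.5, 50.0)  # virtual object 50 m behind the wall
nearby_prop = (0.0, 1.5, 0.5)    # virtual object just 0.5 m behind the wall

# The distant star must be redrawn almost a full meter to the right (~0.93 m)
# so that, through the lens, it appears nearly fixed, the way a far-off star should:
print(point_on_wall(camera_a, distant_star))  # (0.0, 1.5)
print(point_on_wall(camera_b, distant_star))  # (~0.93, 1.5)

# The nearby prop barely shifts on the wall (~0.11 m), so relative to the moving
# camera it appears to slide past, the way a close object should:
print(point_on_wall(camera_a, nearby_prop))   # (0.0, 1.5)
print(point_on_wall(camera_b, nearby_prop))   # (~0.11, 1.5)
```

Real LED stages generalize this point-by-point idea into a full projection of the virtual scene for the tracked camera’s view, but the principle is the same: what the wall displays is always a function of where the camera is.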

Using an LED wall is also a much better way to get a clean final image. LED walls handle reflections far better than their green screen counterparts, which often cast accidental “green spill” onto objects in frame that can be quite difficult to remove. The light emitted by these LED walls is rendered in real time, so it interacts with objects on set as if the virtual environment were truly there. If you were to hold a shiny silver ball on your theoretical spaceship set, you would instantly see the virtual star field created by the LED wall reflected on the ball’s surface. It would move as you move, and it would be captured by the camera as a fluid, natural image. No post editing work needed.

Picture this sophisticated setup on a large scale and you get immersive LED soundstages like the one at Manhattan Beach Studios in California, which earned the nickname “The Volume” by hosting shows like The Mandalorian and Obi-Wan Kenobi (not to be confused with the motion capture stage we discussed in our piece on volumetric capture). This large, enclosed stage, made up of LED walls and ceilings, provides the most advanced modern setup for performance capture, along with a virtual camera rig that allows filmmakers to see their digitized actors in complex fabricated environments. Since 2009’s Avatar (a prominent adopter of early versions of this setup) was finished, VP technology has made huge advancements. Now, flat ceilings have been swapped for giant domed ones that help avoid the disorientation older setups caused for crews.

In our blog, The Future of Work in Entertainment, we mention that Disney+’s The Mandalorian was one of the first major shows to embrace many elements of virtual production, using a spectrum of computer-aided techniques and visualization methods to reinvent its shooting set and the way the project was put together. Like others of its ilk, the Star Wars spinoff used these giant LED screens and Unreal’s advanced game engine platform to bring higher production value (and more control!) at a smaller cost to the project. 2019’s “live action” The Lion King used this same kind of virtual environment to maintain complete autonomy over its constructed world, saving time and money through things like complete previsualization and simultaneous editing. Of course, the practicality of these improvements means very little unless production has the workflow to provide all necessary support. So, what do these new set environments require from a technological and support perspective? What does one need to make virtual movie magic? Find out in part two of our deep dive...