Virtual production is revolutionizing the way we produce films and television, but how does this complex technological process feel to the crew members doing their daily jobs? If you’re curious what the technical wizardry looks like on set, Unreal Engine’s demands for processing power and flexibility mean multiple local computers are required. You’ll also find a tracking machine that processes the live movements of all cameras and tracked elements. The stage is generally managed by a control team, playfully dubbed “The Brain Bar,” who oversee volume activity and communicate specifics to the production at hand. Acting as a sort of nerve center, a primary Unreal Engine workstation makes live modifications to the digitally generated environment. This hub works in conjunction with render machines (which drive the pixels displayed across the LED walls and other surfaces), live compositing machines (to composite green-screen elements when used), and any additional computers needed for virtual scouting in the generated world. It’s an incredibly complex workflow that requires every element not only to work well but to work well together. Fortunately, virtual production is a holistic and efficient enterprise.
Hoo-boy, that’s a lot of extension cords and power strips. To produce a single virtual environment, these systems must stay in sync. They do so through tools like the Multi-User Editor (which allows multiple artists to work on the same virtual environment in real time) and LiveLink (an Unreal plugin that minimizes rendering lag by streaming real-time data to the machines that need it). The two are often used together on set, as each brings distinct strengths. Multi-User Editing lets teams collaborate in the moment, building virtual worlds through shared creative sessions. LiveLink’s low latency, meanwhile, allows for a much speedier workflow, letting filmmakers make decisions more quickly. Of course, none of this works without powerful software like the aforementioned Unreal Engine (or one of its peers) running the show. Fueled by the adaptive power of these engines, VP expands the scope of limited productions and pushes the boundaries of previs, rendering final pixels and simulating photorealistic images in camera with little to no postproduction augmentation. With its adoption, production crews can solve complex technical and creative problems in preproduction and on set rather than relying on the old adage: “we’ll fix it in post!”
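To get a feel for why low-latency streaming matters on the volume, here is a minimal, hypothetical sketch of the idea behind LiveLink-style data flow. None of these class or function names come from Unreal's actual API; the point is the "latest value wins" pattern, where the render node always consumes the newest camera sample and silently drops stale ones rather than queueing every update:

```python
from collections import deque
from dataclasses import dataclass
from typing import Optional

@dataclass
class CameraSample:
    timestamp: float   # seconds since the start of the take
    position: tuple    # (x, y, z) in stage space
    rotation: tuple    # (pitch, yaw, roll) in degrees

class TrackingStream:
    """Latest-value-wins buffer: a maxlen-1 deque evicts stale samples."""
    def __init__(self) -> None:
        self._buffer: deque = deque(maxlen=1)

    def publish(self, sample: CameraSample) -> None:
        self._buffer.append(sample)   # silently drops any older sample

    def latest(self) -> Optional[CameraSample]:
        return self._buffer[-1] if self._buffer else None

# The tracker publishes faster than the wall can render; the render node
# simply polls latest() each frame and therefore never falls behind.
stream = TrackingStream()
for i in range(240):   # e.g. one second of 240 Hz tracking data
    stream.publish(CameraSample(i / 240.0, (float(i), 0.0, 1.8), (0.0, i * 0.5, 0.0)))

frame = stream.latest()
print(frame.position)
```

This is why a streaming link feels faster than a shared editing session: the renderer never waits on a backlog, at the cost of discarding intermediate samples that a collaborative editor would have to preserve.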
How prevalent is virtual production in the modern entertainment industry? Let’s look at the numbers. In 2021, the global virtual production market was valued at $1.6 billion, and it’s expected to expand at a compound annual growth rate of 17.8% from 2022 to 2030. To date, there are roughly 100 LED stages around the world, with at least 150 new ones currently being planned. In the U.S., virtual production has traveled coast to coast. In San Marcos, Texas, the Hill Country Group announced their plan to build a $267 million virtual production, TV, and film studio—the largest of its kind. Courtesy of Vu Technologies Corp., Florida’s new 32,000-square-foot studio is nearly ready for business in Orlando. As of today, Vancouver, Canada is home to the largest VP stage in the world, with 2,500 LED wall panels and 760 LED ceiling panels. Across the pond, Garden Studios in London is set to create their second VP stage by the start of 2023—this after their first stage hosted 40 productions in under 15 months. The studio very much sees this as the future of production, going so far as to plan a VP training center for student filmmakers. The French are going virtual as well; Paris-based studio Plateau Virtuel recently upgraded their extended reality (XR) facilities with SMODE, a real-time compositing/media server platform. Germany is already using LEDcave in Berlin for car commercials. In South Korea, a new virtual production stage at the CJ ENM Studio Center cost 200 billion won (about $153.7 million) and sits on about 5,000 square meters. Clearly, this is a trend with global reach.
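As a quick sanity check on those market figures, compounding the 2021 valuation at the reported growth rate gives a rough 2030 projection. This is a back-of-envelope calculation based only on the numbers above, not an independent forecast:

```python
def project_value(base: float, cagr: float, years: int) -> float:
    """Compound a starting value at a constant annual growth rate."""
    return base * (1 + cagr) ** years

# $1.6B valuation in 2021, compounding at a 17.8% CAGR through 2030
projected_2030 = project_value(1.6, 0.178, 2030 - 2021)
print(f"${projected_2030:.1f}B")   # prints "$7.0B"
```

In other words, if the projected growth rate holds, the market would more than quadruple by the end of the decade.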
Despite the complexity of its systems, VP is much more efficient and easier to manage than traditional production practices that demand endless postproduction fixes. While the technology isn’t yet widely available, its benefits show up in the smallest details as well as the largest financial figures. From a filmmaking perspective, VP can improve storytelling, allowing for a broader scope of production and near-infinite storytelling resources (Want to set your flick in the middle of the ocean? You can! How about on Saturn? Go for it!). It provides complete control over everything that’s put on the screen. It can help resolve ambiguities about what a final product “might” look like and unlock fresh possibilities, helping filmmakers bring audiences to new planets, realities, and time periods. It also allows for a global network of collaboration and remote work, with increasing numbers of people playing a part in decisions that might once have been made entirely in postproduction. Now, these decisions are made on set in real time. A director can supervise every shot in its mostly completed form as it is captured by the camera. LED stages reduce the time it takes to set up shots, prepare for scenes, and select lighting and camera angles; much of this can now be done through a switchboard. From an actor’s perspective, performers can now visualize the scene as it will be presented. By cutting unnecessary guesswork, actors charged with bringing truth to a scene can find themselves on the same page as their peers, all seeing the same thing in the moment rather than imagining what something will look like two years down the road. For studios, production companies, and other stakeholders, these volumes help reduce costs. A better system of planning a shoot can reduce the need for expensive (and extremely common) reshoots, which traditionally account for 5 to 20% of final production costs.
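To put that reshoot figure in perspective, here is a toy calculation of what a 5 to 20% reshoot share means in dollars. The $100M budget is purely illustrative, not a number from the article:

```python
def reshoot_cost_range(budget: float, low: float = 0.05, high: float = 0.20) -> tuple:
    """Min/max reshoot spend given a share-of-budget range."""
    return budget * low, budget * high

# Hypothetical $100M feature: the traditional reshoot share in dollars
lo, hi = reshoot_cost_range(100_000_000)
print(f"${lo:,.0f} to ${hi:,.0f}")   # prints "$5,000,000 to $20,000,000"
```

Even trimming a few points off that range by locking decisions on set would represent millions saved on a single tentpole production.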
VP artists are already building asset libraries that will allow projects to recycle elements (like a desert, jungle, or spaceship) from one virtual environment to the next. This all leads to increased efficiency, with an accelerated “go-to-market” timeline that aids a much faster release schedule.
In terms of sustainability, the benefits are quite obvious as well. In 2019, albert (BAFTA’s industry-backed sustainability project) reported that producing one hour of television generated 13 tonnes of carbon dioxide. Computer-generated sets, props, and characters would significantly reduce material waste. And while 24% of a feature film’s carbon emissions are a direct result of air travel (according to the Sustainable Production Alliance), that figure could become almost nonexistent with VP’s LED walls removing the need for crew moves. Moreover, VP adds an element of democratization to production overall. With high-end digital effects more readily available to younger filmmakers (who could use projectors rather than LED screens), storytelling scope will no longer be limited by one’s purse. Good stories can be told by good storytellers with exceptional production value, regardless of the money at their disposal.
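Combining the two figures above gives a rough upper bound on the travel-related saving per finished hour. Note the simplification: the 24% air-travel share was reported for feature films, and it is applied here to albert's television figure purely for illustration:

```python
TONNES_CO2_PER_TV_HOUR = 13.0   # albert, 2019
AIR_TRAVEL_SHARE = 0.24         # Sustainable Production Alliance (feature films)

# Upper bound on the saving if LED volumes removed nearly all
# production travel for one hour of finished television.
max_saving = TONNES_CO2_PER_TV_HOUR * AIR_TRAVEL_SHARE
print(f"{max_saving:.2f} tonnes of CO2 per finished hour")   # prints "3.12 tonnes of CO2 per finished hour"
```

Roughly three tonnes per broadcast hour is a best-case ceiling; real savings depend on how much travel a given production actually replaces with volume work.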
So, VP is all upside, right? Not exactly. Any technology of this magnitude will certainly come with some limitations. That’s where we’ll kick it off next time in part three of our virtual production deep dive...