Ever since the Disney+ series The Mandalorian began making news for its inventive use of an LED stage and a virtual production pipeline that incorporates real-time effects during filming, these stages have become arguably the fastest-growing area of visual effects and production technology. The show went on to win Emmys in VFX and cinematography, and Industrial Light & Magic received a 2022 Engineering Emmy for the system’s development. But amid the explosive surge in LED stages, observers warn that the business, technology and creative models for these pricey installations require more understanding before their potential can be fully realized.
“We are tracking roughly 300 stages, up from only three in 2019,” reports Miles Perkins, industry manager of film and TV for Epic Games, maker of the Unreal Engine used in virtual production pipelines. He attributes the rise in capacity to investments from studios, stage complexes and VFX companies.
These installations include ILM’s StageCraft volumes at Manhattan Beach Studios, used on The Mandalorian and Lucasfilm’s upcoming Ahsoka. Marvel employed a bespoke StageCraft build in Sydney for Thor: Love and Thunder and relied heavily on ILM’s system at Pinewood Studios in London for Ant-Man and the Wasp: Quantumania. VFX company Pixomondo’s Toronto-based LED stage has a long-term lease with CBS and has been used for series including seasons four and five of Star Trek: Discovery. More recently, Prysm Stages, an NEP Virtual Studios company, opened its inaugural stage at Trilith Studios in Atlanta, with Francis Ford Coppola’s Megalopolis as its first production. Meanwhile, NEP’s Lux Machina reports virtual production credits ranging from last summer’s Sony actioner Bullet Train to Warner Bros.’ Shazam! Fury of the Gods, due in March.
Still, as many stakeholders tout the promise of these virtual production stages, others worry that there are too many in the market right now — especially when demand for traditional soundstages is rising. “It was a big learning curve, and I think it worked really well for some moments and less well for others,” HBO’s House of the Dragon director Clare Kilner says. “It’s especially good for never-ending sunrises and sunsets, for example.” Yet the virtual production stage at Warner Bros. Studios Leavesden in London, which opened in 2021 and was used for Dragon, is already shutting down, The Hollywood Reporter has learned.
“Due to the high demand for studio production space, the virtual stage at Warner Bros. Studios Leavesden will revert to a traditional soundstage in order to provide more flexibility to our clients,” a Warners rep says. “Productions will still have the option to bring in virtual production technology as and when required.”
Oscar-winning VFX pro Ben Grossmann, whose company Magnopus was instrumental in developing the virtual production workflows for Jon Favreau’s The Jungle Book and The Lion King, estimates that building an LED stage for virtual production can cost anywhere from $3 million to $30 million, depending on the size of the LED wall and the structural engineering (AC and power, support for the wall’s weight, additional lighting and camera-tracking technology) needed to retrofit a stage for this use. On top of that, the content projected onto the LED walls for filming can be the most expensive element in the process: these complex, fully CG environments can require four to six months of work from VFX artists. “You could start spending a lot, very quickly, on the content,” Grossmann notes.
As with any new technique, there are early adopters and others who proceed with caution. “It’s not the be-all and end-all,” says director-producer Jay Holben, who recently helmed an experimental test short (known as Standard Evaluation Material 2) for the American Society of Cinematographers that incorporated both location shooting and virtual production on an LED stage. He sees great potential in the process but points out that various pieces, including color and lighting, still need work. “Current LEDs don’t have the color spectrum,” he offers as one example. “That’s changing, but that’s not in the market yet.”
“A lot of people are walking in thinking they can just turn on the camera and shoot,” Holben says. “But if the light is not correct and the color balance isn’t set up properly … these things can look bad.”
“I think there’s an overcapacity of LED stages now because everyone wants them, but they haven’t become comfortable enough to use them,” Grossmann says. “Longer term, I think the industry is going to bridge the gap between understanding and budget considerations so that there’s a higher percentage of utilization.”
One source, requesting anonymity, puts it bluntly: “Some [shows] have been super successful; others were a bloodbath because people were unprepared.”
For early adopters, LED stages can help control the costs, schedules and complexities of production — compared with traveling to locations — while providing a sandbox for creative experimentation. But overuse of the technology can have drawbacks, and understanding how and when to apply it is critical.
The Lucasfilm series The Book of Boba Fett and Obi-Wan Kenobi, both of which made extensive use of volume technology, drew criticism in some quarters of the fan community for overreliance on the technique; the recent Star Wars show Andor, by contrast, shot mostly on location.
Holben points out, “There’s still a lot of value in being on location and being in the environment. There’s nothing like being in the real canyon or going to Ireland and being in a real castle. When you are on practical location, there’s a lot of discovery.”
Cinematographer Greig Fraser — who won an Oscar this year for Denis Villeneuve’s Dune and used LED stages for season one of The Mandalorian and a portion of The Batman — says virtual stages “do not do midday or daytime sunlight very well” but cites dawn and dusk scenes in The Batman as examples of when these stages are advantageous.
From a creative standpoint, he says, “If you’ve got something happening sort of in a dawn or dusk environment — what we did in The Batman in the construction site overlooking Gotham — then it works really well because you’re dealing with soft light. Particularly for The Batman, it was very good because they were long scenes and normally, if you want to shoot something at dawn or dusk, you’ve really only got that short window of time.”
“There’s a tendency to think that [an LED] volume solves all the logistical issues that come with shooting on location,” Fraser adds. “The danger when people don’t quite understand what it’s good for and what it’s not good for is that they can tend to put things on the volume that shouldn’t be on the volume. And when you watch it, it’s not quite right, which can give virtual shooting a bit of a bad name.”
Sources emphasize that virtual production has to start with early planning, weighing factors such as schedule, budget and creative approach, and must involve the various departments, including the virtual art department. “We provide cost analysis,” says Janet Lewin, senior vp, Lucasfilm Visual Effects, and general manager, ILM. “That has been a challenging metric for people to arrive at on their own.”
To help advance the understanding and use of these techniques, Epic Games created what it calls the Unreal Fellowship, a 30-day course in virtual production that has trained an estimated 2,000 professionals since its launch roughly two years ago. Participants apply for a seat, and those accepted are paid $10,000 by Epic to complete the course. Epic’s training efforts have also included partnerships with the American Society of Cinematographers and the Art Directors Guild.
While many use the term “virtual production” synonymously with an LED stage, it has a broader meaning that encompasses techniques such as previsualization and performance capture. High-profile milestones include the production of Favreau’s The Lion King, which allowed the filmmakers to explore and experiment in the CG African locations by wearing virtual reality goggles. Many trace the start of what is today considered virtual production to the making of James Cameron’s 2009 Avatar.
Aiming to help filmmakers use a common vocabulary, earlier this year the Visual Effects Society introduced an online virtual production glossary. Here, virtual production is defined as a technique that “uses technology to join the digital world with the physical world in real-time. It enables filmmakers to interact with the digital process in the same ways they interact with live-action production.”
That definition offers clues as to what the future of entertainment might look like. “We see virtual production as a bridge to help film and TV content get produced in a way that makes it more amenable to the metaverse or immersive experiences, and putting audiences inside of those experiences,” says Magnopus’ Grossmann, noting: “If we can put filmmakers and a film crew on a set surrounded by an LED wall, then we can take the content and give it [to] the audience in their homes through a VR headset.”
Adds Epic Games’ Perkins, “Once a team has iterated and finalized an asset, they can use it across mediums — linear content, experiential content, games, live events and beyond. With a real-time game engine like Unreal, everything is so easily transportable that there’s no longer a distinction between the needs of a linear deliverable versus an experiential deliverable. This means that virtual production is inherently preparing us for a new era of entertainment.”
James Hibberd and Alex Ritman contributed reporting.
A version of this story first appeared in the Oct. 19 issue of The Hollywood Reporter magazine.