In greenlighting The Mandalorian, Disney+ posed a question of near-existential proportions:
How do you fit a galaxy far, far away onto TV screens just a few feet wide?
Tasked with producing the first-ever live-action television series in the history of Star Wars, The Mandalorian creator Jon Favreau and his team set out to find a way to fit one of the biggest theatrical franchises of all time into the tight constraints of a TV production. In the process, they discovered a blend of technologies that are, this very second, quietly revolutionizing how movies everywhere are made.
In this post, we’re breaking down The Mandalorian’s twin magics of virtual production and volume technology, as we investigate how one space western is pushing the entire film industry towards a new frontier.
^ Allow the team from The Mandalorian to introduce their revolutionary system.
The Uncanny Valley exists on no map.
You can’t physically go there, but you have probably been there.
You can’t see it, but your body almost certainly knows it well.
That’s because the Uncanny Valley is not a place, but an experience. Coined in 1970 by Japanese roboticist Masahiro Mori, the valley metaphor describes how an artificial (or digital) figure with human behavior grows more familiar to viewers as it becomes more lifelike, but only up to a point. The instant the creation fails to fully imitate a real human, the viewer’s sense of familiarity plunges into the “uncanny valley.”
This phenomenon describes a mental uneasiness, a psychic skirmish between human perception and digital representation. It’s the queasy tension between real and unreal that flares up each time we see a human that isn’t quite human, a robot that isn’t quite a robot, or a pack of animated alley cats that look… umm… uncanny.
Up until now, the walls of the Uncanny Valley have been set by the boundaries of technology itself. We simply haven’t had the power or know-how to render the digital with the same fidelity as the physical, and the resulting discord leaves many people feeling literally ill.
But what if I told you that was all changing?
What if I told you that the minds behind The Mandalorian have charted a path out of the Uncanny Valley and into a territory that’s brand spankin’ new?
Welcome, reader, to the Valley of Is.
In the Valley of Is, the real and unreal come together to form a perfect, seamless union. Digital effects are directly integrated into practical filmmaking to craft illusions so convincing that even the filmmakers, those professional sorcerers who have themselves conjured the Valley’s magic, may be fooled by their own spell.
In the Valley of Is, actors no longer play pretend on endless green screen voids whose invisible contents are flatly lit and filled only by description. Instead, the void is replaced by a visceral embodiment of pure imagination, where directors can summon entire worlds from thin air, production designers can craft sets with their minds, and directors of photography can warp light from day to night with little more than a keystroke.
In the Valley of Is, rumors stir of actors being so immersed in their story’s universe that they’ve walked straight into the high-resolution screens supporting it, literally losing track of the thin line between fantasy and reality.
However, as with any good enchantment, the secret of the Valley of Is can be deciphered with just a little understanding. Well, that…
…And maybe a few lifetimes of scientific research.
The world of The Mandalorian was crafted through the convergence of technologies that have each been evolving for more than half a century. We can divide them into two broad categories.
One category opens a gate between the digital world and the physical world, allowing the two to mingle like never before.
And the other, whose roots can be traced all the way back to the earliest days of cinema, transforms the contours of reality itself.
Let’s start with the former.
Hyperreality is any state in which a human consciousness can no longer differentiate the real from a simulation of the real. It’s the place where fact and fiction are so seamlessly fused that it’s impossible to locate the line between the two.
In the big picture of postmodern philosophy, the hyperreal is a contentious subject, perhaps understandably so with its implied consequence of humans, you know, losing contact with all real experience and whatnot.
But what about when it comes to movies?
Hyperreality is basically the goal.
From the brothers Lumière to the sisters Wachowski, filmmakers have been convincing audiences to lose their grip on reality since the birth of the moving picture. The entire history of the silver screen could arguably be charted by the evolution of techniques to draw viewers in with greater speed, efficiency, surprise, and intensity, which raises an interesting question for all you futurists out there:
What’s the next step?
I’m glad you asked. They’re calling it virtual production.
^ The Unreal Engine is an industry leader in virtual production.
As described in the first volume of the Epic Games Virtual Production Field Guide, virtual production is “a broad term referring to a spectrum of computer-aided production and visualization filmmaking methods.”
In other words, virtual production is not one tool; it’s a conglomeration of potentially thousands of tools, each of which has been designed to bring the worlds of physical reality and digital representation closer together.
And while that may sound like science fiction, the fact of the matter is that you’re probably familiar with several of these tools already. Virtual production is the culmination of technologies that have been developing organically for decades.
Just think of the Super Bowl.
No, not the commercials (for once), but the visual experience of the game itself.
Back on October 22, 1939, when NBC broadcast a professional football game for the very first time, the technology in use wasn’t even sophisticated enough to maintain a consistent camera exposure, much less track first-down lines or break down instant replays. Indeed, the early days of televised sports offered an experience whose newfound convenience was matched only by an old-fashioned sense of confusion.
High price tags, audience traditions, and technical inadequacy kept that confusion unaddressed for over fifty years, until an advertising crisis led ABC and ESPN to premiere an on-screen score box at the 1994 World Cup.
This simple score box was itself the key to a digital Pandora’s box. It unleashed a wave of on-screen graphics that would forever change the experience and culture of sports everywhere.
Now, watching an NFL game is about as hyperreal as it gets.
Behind the scenes, the visual experience of a modern NFL game is a complex ballet of 3D modeling, specialty cameras, customized chroma-keying, live compositing, and the rendering power of a central computer. For the viewers at home, the result is a digital playing field where score boxes, yellow down lines, blue scrimmage lines, stats, and other graphics are seamlessly integrated into the real-world elements of the game.
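To make that ballet a little more concrete, here’s a minimal sketch of the chroma-key logic behind graphics like the yellow first-down line, written in Python with NumPy. Everything here is a toy assumption: real broadcast systems layer camera tracking, 3D field models, and per-stadium color calibration on top of this core matte idea.

```python
import numpy as np

def chroma_key_composite(frame, graphic, key_color, tolerance=60.0):
    """Toy chroma-key compositor.

    Pixels in `frame` whose color is close to `key_color` (field green)
    are replaced with the matching pixels of `graphic` (the yellow line).
    Pixels that don't match (players, referees) are left untouched,
    which is why the line appears to run *under* the players.
    """
    frame_f = frame.astype(np.float32)
    # Distance of every pixel from the key color.
    dist = np.linalg.norm(frame_f - np.asarray(key_color, np.float32), axis=-1)
    # Matte: 1.0 where unoccluded field shows, 0.0 where something covers it.
    matte = (dist < tolerance).astype(np.float32)[..., None]
    out = matte * graphic.astype(np.float32) + (1.0 - matte) * frame_f
    return out.astype(np.uint8)

# Hypothetical usage with a fake 720p "field" and a yellow line overlay.
frame = np.full((720, 1280, 3), (40, 140, 60), dtype=np.uint8)
graphic = frame.copy()
graphic[:, 630:650] = (255, 220, 0)
composite = chroma_key_composite(frame, graphic, key_color=(40, 140, 60))
```

The design point is the matte: because the graphic only replaces pixels that look like grass, the illusion holds even as twenty-two bodies run back and forth across it.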
And the prevalence of this reality-bending software and hardware extends well beyond fields of play. From Google Maps to Pokémon GO, the digital has never had more presence in the physical world than it does at this very moment.
And that brings us back to virtual production.
Virtual production collects, develops, modifies, and redirects these innovations for use in the film and entertainment industries. Many of the individual tools in the virtual production kit, from motion capture and green screen compositing to virtual reality and real-time game engines, have become popular in their own right.
Today, with increases in computing power, these tools and others are being combined in new and exciting ways. Virtual production is bringing more precision to pre-production, saving more time in production, and better integrating post-production practices into the overall filmmaking process.
Jon Favreau, for example, made innovative use of virtual production techniques on his 2019 reimagining of The Lion King. The director and his team utilized game engine technology to create an immersive virtual reality space within which they could utilize live-action filmmaking techniques. The resulting versions of Simba, Scar, and Pride Rock were all imbued with a unique, near-photorealistic presence.
In producing The Mandalorian, Favreau wanted to use and improve upon his previous virtual production experience with a slightly different goal in mind. This time, rather than bringing live-action techniques into a virtual reality space, he wanted to bring virtual reality techniques directly into a live-action shooting environment.
To accomplish that goal, however, The Mandalorian would need to add an additional, highly ambitious secret ingredient…
The Mandalorian opens on the ominous tundra of an ice planet, snow whipping through the show’s anamorphic aspect ratio as a certain bounty hunter strides toward the sci-fi saloon awaiting him in the distance. It’s a visceral, immersive scene.
…But it wasn’t shot on location.
I mean, of course not, right? You don’t have to be George Lucas to figure out why a TV show wouldn’t want to shoot in the Arctic Circle. Hearing the phrase “ice planet,” most filmmakers will instead think of green screens, mo-cap rigs, and the process lovingly known as “fixing it all in post.”
But you know what they probably don’t think of?
Walls.
Massive LED walls, to be exact.
The final key to unlocking The Mandalorian’s aesthetic identity is an array of LED walls known as The Volume.
^ Check out this introduction to working with the Volume.
Developed by the wizards at Industrial Light & Magic, the Volume replaces the on-set presence of traditional green screen with live, photorealistic backgrounds, enabling filmmakers to experience the tactile control of live-action production and the imaginative freedom of virtual production simultaneously.
Despite its state-of-the-art status, the Volume is arguably a return to old-school effects. In fact, its basic concept is perhaps most easily explained through the example of its predecessor: rear projection.
Rear projection is a technique in which a pre-recorded image is projected into a shot’s background while an actor or object performs live in the shot’s foreground. The result is an in-camera effect that marries the two disparate elements into a single image as if the background and foreground were actually shot in the same place and time.
Rear projection in driving sequences was a common sight for fans of mid-century film, but the technique could just as easily be put to more imaginative use. Hitchcock employed it for the iconic crop duster stunt in North by Northwest. Kubrick drew on a version of it to expand the scope of 2001. Ray Harryhausen even used rear projection to defy the scale of both size and time with stop-motion hybrid films like Clash of the Titans.
Though it was a technical marvel in its time, rear projection came with plenty of drawbacks. Using the effect limited shot selection, made dramatic lighting difficult, and rarely achieved the perfect illusion for which most filmmakers strive. As superior chroma keying methods came of age, rear projection began to look antiquated.
…Until recently.
Volume technology operates on the same basic principle as rear projection. It displays a pre-recorded image or video on a screen in the background, while live-action elements are filmed in the foreground to achieve an in-camera composition of physical and digital components.
The difference between rear projection and the Volume, however, is that the Volume turns this whole concept up to eleven.
The Volume’s effect is achieved not with a single projection screen but with a wraparound array of photorealistic LED screens that provides a nearly 360° range of vision. This affords filmmakers not only increased freedom in shot selection but also a fully immersive working experience for cast and crew alike.
In case you were wondering, the Volume improves upon the green screen too.
Because the LED screens found in the Volume emit light, they can be used as light sources themselves. That may sound like a no-brainer, but consider the consequences.
With green screen, reflections and the spill of bounced green light can create major headaches in post-production. With the Volume, green spill vanishes entirely, and any reflections of the background are realistic by default.
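For a sense of why spill is such a headache, here’s a minimal sketch of one classic “despill” fix that compositors apply in post, again in Python with NumPy. Real compositing tools offer far finer controls, so treat this as an illustration of the idea rather than a production recipe.

```python
import numpy as np

def suppress_green_spill(rgb):
    """Classic despill: wherever green dominates both red and blue,
    clamp it to the brighter of the two, taming the green cast that
    bounces off a chroma screen onto hair, skin, and wardrobe."""
    out = rgb.astype(np.float32)
    r, g, b = out[..., 0], out[..., 1], out[..., 2]
    out[..., 1] = np.minimum(g, np.maximum(r, b))
    return out.astype(np.uint8)
```

The Volume sidesteps this cleanup entirely: with no key color on set, there’s nothing to contaminate the shot in the first place.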
Plus, the Volume is fully customizable.
The Volume’s imagery and settings can be swapped or altered with just a few computer commands. Inside the Volume, directors of photography are no longer bound by the flat lighting necessitated by traditional green screen compositing, and production designers are directly involved in the VFX process. With a keystroke, DPs can warp the Volume from day to night. With a click, PDs can teleport the Volume’s occupants anywhere in the universe.
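To give a flavor of what “day to night with a keystroke” might look like from the operator’s chair, here’s a hypothetical preset-swapping console. To be clear, every name below is invented for illustration; this is not ILM’s actual interface or Unreal Engine’s API, just the shape of the idea.

```python
from dataclasses import dataclass

@dataclass
class EnvironmentPreset:
    """One saved look for the LED walls: imagery plus lighting values."""
    skybox: str             # environment map displayed on the walls
    sun_intensity: float    # brightness of the virtual key light
    color_temp_kelvin: int  # warmth of the scene's light

# Invented presets for a desert-planet set.
PRESETS = {
    "desert_day":   EnvironmentPreset("skies/desert_noon.exr", 10.0, 5600),
    "desert_dusk":  EnvironmentPreset("skies/desert_dusk.exr", 2.5, 3200),
    "desert_night": EnvironmentPreset("skies/desert_night.exr", 0.2, 4100),
}

class VolumeConsole:
    """Stand-in for the control surface driving the walls' renderer."""
    def __init__(self):
        self.active = None

    def load_preset(self, name: str) -> None:
        preset = PRESETS[name]
        # A real system would retexture the walls and relight the scene
        # in the game engine; this sketch just records the change.
        self.active = preset
        print(f"Walls showing {preset.skybox} "
              f"(sun {preset.sun_intensity}, {preset.color_temp_kelvin}K)")

console = VolumeConsole()
console.load_preset("desert_day")    # morning coverage
console.load_preset("desert_night")  # same set, relit for night
```

Because the walls themselves are the set’s light source, swapping a preset relights the physical actors and props in the same instant it changes the backdrop, which is exactly why that single keystroke carries so much power.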
On The Mandalorian, Jon Favreau and his team combined Volume LED technology with their virtual production toolkit to craft a galaxy far, far away on a TV budget and schedule. They were able to leverage the Volume’s ease of use to shoot faster than previously possible and its depth to create a level of immersion worthy of the Star Wars name.
^ Using volume technology on The Mandalorian was a calculated decision.
With its blockbuster debut in 2019, The Mandalorian is now widely known as an unmitigated success.
The technology behind it, however, remains largely proprietary. Its mysteries are little-known outside of a small group of practitioners.
…For now.
Beyond the thrill of technological virtuosity, the primary attraction of both virtual production and LED volume technologies currently lies in their uncertain futures. While a great deal is known about the capabilities of these tools, it’s everything we don’t know that makes them truly compelling.
Like synchronized sound and Technicolor before them, virtual production and LED volume technology are today reshaping the borders of filmmaking as a process. Their very existence is likely to change how movies are shot, designed, and organized.
There’s only one question: How?
Let’s talk about five conceptual areas where virtual production and LED volume technology might make a surprising impact in the not-so-distant future.
Between instant lighting changes and rapid set transitions, it’s clear that the Volume’s circle of LED walls has the potential to save time.
But how much? And where?
First-hand experience is essential in designing any shoot’s schedule, but first-hand experience with the Volume is relatively hard to come by. Best practices and reasonable expectations are still a long way from being set in stone.
We know that both volume technology and virtual production tools are built to ease the post-production workload and will almost certainly do so, meaning that shorter post- and VFX timelines may now be on the table.
We also know that The Mandalorian was able to shoot much faster than initially scheduled, a fact that stands as strong but anecdotal evidence of the Volume’s efficacy during production.
What we don’t know, however, is how any of this might change as virtual production and volume technology reach widespread, mainstream use. Is it possible, for instance, that customization functionality might actually bog some directors down?
And even if we assume that significant time will indeed be saved, where exactly will it impact the schedule? With IATSE’s recent push for more reasonable work hours, could volume technology help facilitate the film industry’s transition away from 12-, 14-, and 16-hour workdays? Or will budget-conscious producers simply slim down an average production’s overall shoot duration?
Pardon me for this one, but…
Only time will tell.
The schedule is the X factor within a shoot’s budget. If time is money and volume technology requires less time, then utilizing volume technology could trim mega-bucks from a production’s budget.
…Right?
Maybe.
At the moment, the impact virtual production and volume technology will have on production budgets is difficult to pin down. Beyond the schedule, these tools carry a host of potential costs (and cost savings) that may vary from project to project.
LED volume technology, for example, requires a huge initial investment just to get the equipment up and running, not to mention the significant costs of overhead and maintenance associated with such a facility’s upkeep.
In the long run, budgets will likely reap the benefits of virtual production and LED volume technology. The cost of volume stages should decrease as the hardware matures, while our collective understanding of cost-effective virtual production practices increases, leading to a better bottom line.
For now, however, these technologies may not make dollars (or sense) for every production.
A modern film crew is a finely tuned unit in which each member has a clearly defined role.
But what happens when you introduce revolutionary technology into the mix?
Shooting on the Volume doesn’t just add a position or two to your roster; instead, it creates a wave that ripples through your crew’s entire structure.
The exact composition of an ideal volume crew is still up for debate, but there are a few observations worth making in the meantime.
Shockingly, shooting with volume technology does not seem to be reducing overall crew sizes. In fact, crew sizes may be growing, with extra camera and art personnel filling new roles.
However, shooting with volume technology and virtual production tools does shift the crew’s composition to meet the demand for a wider range of expertise. The art department on a volume shoot needs personnel comfortable working in a digital space, while the lighting department may need to add Unreal Engine artists or even coders to their mix.
This added reliance on digital manipulation also opens the door to remote collaboration in a way that’s never been achieved before, a small detail that could send a second wave of transformation through how crew members experience their work.
Virtual production and LED volume technology were both conceived to meet the needs of filmmakers in specific situations. Now that they’ve made their successful debut, what new tools will be birthed to meet the new needs of these kinds of productions?
On a practical level, the proliferation of LED volume technology will likely impact the design of stage spaces themselves. LED volume stages carry different requirements for height, width, and power supply compared to other sound stages. Therefore, studios constructing facilities of this type have an opportunity to innovate upon the physical footprint of stage spaces in general.
Of course, there’s also the technology used within the volume stage to consider.
What new tools will emerge to perfect the experience of the Volume? Can machine vision lenses be adapted to push live compositing even further? Will LED screens be tweaked to provide a full range of light intensity to a camera?
It’s difficult to predict exactly what’s coming next, but we can examine the current technology’s weaknesses to make an educated guess.
Finally, perhaps the most intriguing unknown about virtual production and LED volume technology is the way they seem to be bringing the three phases of production out of their individual silos and much, much closer together.
Because working with the Volume combines digital and physical elements in-camera, the VFX work traditionally left to post-production has to begin development at a far earlier point in the production process. This simple change of procedure essentially dissolves the walls between each of the phases. Post-production concerns are now integrated directly into the pre-production period as central issues for the whole production.
The result is a more balanced project lifecycle whose benefits and consequences are yet to be fully understood. In theory, crews will enjoy more creative freedom during principal photography because the restrictive elements of VFX are now rendered live, in a form that creatives can react to in person on the shoot day.
And this spirit of enhanced collaboration may extend to the crew as well. Because everything is coming together in a single moment, working on the Volume calls for nearly every crew member to be more deeply involved in the technical process as a storyteller.
Ultimately, that’s what it’s all about.
When we asked Evan Pesses about his thoughts on virtual production and LED volume technology, the veteran DP and Head of Advanced Imaging at The Astronauts Guild homed in on two critical concepts:
1. “We’re really early on this.”
2. “The tools won’t tell you how to do it.”
With two sentences, Evan cut to the core of these technologies in their current iterations and pointed out a course of action for filmmakers everywhere.
Virtual production and volume technology are mind-blowing, but their survival will take more than technical prowess alone. To reach their full potential, filmmakers, not engineers, must be the ones to push these tools beyond their identities as spectacles and novelty items. They must find a way to make virtual production and LED volume technology disappear into the realm of pure storytelling.
The Mandalorian mapped the territory of this new frontier. Now, however, it’s up to creators to forge their own paths forward, into the Valley of Is.
The current state of virtual production and LED volume technology is summed up well by an oft-repeated quote attributed to William Gibson:
“The future is already here. It’s just not evenly distributed yet.”
However, given the particulars, I think the advice of a very different sage might be more appropriate here. So, in the immortal words of Lucille Bluth:
“Go see a Star War.”
If you’re interested in learning more about virtual production or LED volume technology, be sure to check out both volume 1 and volume 2 of The Virtual Production Field Guide from Epic Games.
At Wrapbook, we pride ourselves on providing outstanding free resources to producers and their crews, but this post is for informational purposes only as of the date above. The content on our website is not intended to provide and should not be relied on for legal, accounting, or tax advice. You should consult with your own legal, accounting, or tax advisors to determine how this general information may apply to your specific circumstances.