But, like your typical screen hero, the metaverse might just ride to the rescue. Let us explain.
What is the metaverse, and why is it important?
Mark Zuckerberg, CEO of Meta (formerly Facebook), presents the metaverse as the future of human interaction, where the combination of virtual and augmented reality (VR and AR) allows us to work, rest and play via a second virtual life that can be accessed through a screen or overlaid (via special glasses) onto the real world.
But how does this help us make our favourite screen content during a pandemic? Or the next global emergency?
What we can do now
Traditional production relies on cast and crew being in the same location at the same time. The past two years have shown a strong need to either shoot films with cast and crew in separate locations, or to move the production space partly or wholly into a virtual one (as in The Lion King remake).
What we can do now, even with a nascent metaverse, is significant. The current tools of the trade include deepfakes, which use machine learning techniques to seamlessly stitch anyone in the world into a video or photo, and production software (such as Unreal Engine) that creates virtual locations and avatars.
Disney's virtual production stage The Volume, home to The Mandalorian, uses this latest technology to brilliant effect. In The Mandalorian, high-definition digital screens line the walls and ceiling, providing backgrounds, perfect perspective and lighting, using a mixture of real and wholly computer-generated imagery.
Working with the caveat that money is no object, here’s how these technologies can currently be deployed when tackling the two most pressing production problems in a post-COVID world.
Problem 1: the director in one location, the cast and crew in another
If this were The Lion King remake, director Jon Favreau could simply access the virtual environment remotely using a VR device from his home media room. For other productions, the director can interact with an actor via AR glasses the actor puts on between takes, making it seamlessly appear the director is in the room.
In this way the function of the media room evolves, becoming a home communications hub with an array of cameras and displays. This is already happening, and it is something big tech is looking to accelerate. Products such as Microsoft's Mesh for Teams, which uses mixed reality to enable three-dimensional holographic meetings and collaboration, are being rapidly rolled out.
Problem 2: the director, star and co-star all in different locations
As of today, we can:
(a) film each actor separately with different crews in front of a green screen, and then match the backgrounds (but the actors will have no interaction).
(b) use AR glasses so the actors can see each other, then digitally remove the glasses in post-production, much as Justice League digitally removed Henry Cavill's moustache.
(c) use two human stand-ins and deepfake technology to replace their faces with the actors'. This is useful if the characters need to touch.
However, all have drawbacks – or, in fact, the same drawback. The actor.
Until we can perfect both the realism of the person and the performance (just look at the brilliant but not-quite-good-enough Mark-Hamill-less Luke Skywalker in The Book of Boba Fett), the metaverse will never quite fulfil its potential as a true alternative environment for screen production.
The latest iteration of the young Luke Skywalker was generated from a physical actor (not Mark Hamill) combined with deepfake technology. It looked physically perfect, but not once “Luke” began talking, which meant most of his dialogue had to be delivered off-camera. There was also a strong sense of the uncanny valley about the performance, a term originally coined for the negative emotional response towards robots that seem “almost” human.
Forward to the future
The day of perfect human avatars could be coming very soon. It was foreseen by novelist and futurist Michael Crichton – not in Westworld or Jurassic Park, but in his obscure 1981 film Looker. The story concerns technology that scans and animates actors, allowing them to retire and simply manage their image rights.
In this proposed near future, COVID is not a concern, nor is the death of an actor during production. All films can be made like The Lion King, in a virtual environment.
Actors will remote in from their media rooms to control their avatars – or perhaps not. In the future, Mark Hamill could have two prices: one where he turns up in person, and another where only his digital twin is used, a twin that can procedurally generate his performance by studying all of his films to work out which acting choices to mimic.
Just because we can, should we?
History shows us that new technology is rarely taken up wholesale, and old technology never completely dies. Think vinyl. What is more probable is a certain reverse snobbery. Many shows will make full use of the metaverse, enabling them to keep shooting despite real-world calamities.
Perhaps a whole new hybrid genre will be formed. Films that might have once been animation can now be photorealistic – call them “live-animations”.
But in a future where most of us will be eating meat grown in a laboratory, only the top restaurants will still use living animals. The same is likely for screen production: the ultimate prestige picture will be made old-school, with real actors really acting against each other in real environments, pandemics and the metaverse be damned.