Time to bid adieu to green screens? MIPCOM 2020 delves deep into virtual production

Sony Exhibits Next-Generation In-Camera VFX Workflow With Mo-Sys

The future of filmmaking is heading for a massive paradigm shift. Over a year ago, we had examined whether green screens could render location shooting obsolete altogether.

While discussions at international festivals have ranged from the acceleration of cloud integration to remote working during the pandemic, one recurring theme has been virtual production. The exciting part is the range of virtual production technologies being advanced by various companies, promising a more nuanced way of filmmaking and easing the pandemic woes of the production industry.

By allowing high-quality shows to be produced with minimal crew and talent on set, virtual production has emerged as a panacea for the industry's filmmaking woes.

Sony Pictures Entertainment (SPE) executive vice president and Sony Innovation Studios chief technology officer and general manager Bill Baggelaar

Sony Innovation Studios is focusing on bringing new technologies to production in a big way. Sony Pictures Entertainment (SPE) executive vice president and Sony Innovation Studios chief technology officer and general manager Bill Baggelaar recently shared at a MIPCOM 2020 session titled The Future Of Movie Magic: Real-Time Volumetric Virtual Production that they are working on volumetric captures of real-world locations and film sets, which allow them to bring those assets into the controlled environment of a sound stage.

Asked what virtual production means to him, he describes, “Virtual production is a bit of an overloaded term, which is why we are trying to distinguish and say ‘volumetric virtual production,’ because people say, ‘Oh, we’ve been doing virtual production for years, shooting on green screens.’ And yes, that is in fact true. But I think what virtual production in general is evolving to is having some sort of real-time visualisation on set, and not just shooting against a green screen.”

According to him, volumetric virtual production offers a more unified experience to filmmakers, who can have any location realistically created and reproduced on an LED wall without leaving too much to the imagination.
He expounds, “Usually everybody has to envision their own experience with that green screen environment. But being able to give the actors, directors and the crew a singular experience of what is intended to be experienced on set, whether that is looking at a monitor and seeing what’s going on, or you have an LED wall and you’re seeing the camera feed, I think that virtual production is evolving into that sort of methodology, to give people a much more unified experience. That’s really what it would be about: that unification of the experience, so that you and I are no longer each imagining something similar; we’re seeing the same thing. That is one of the other major benefits for content creation.”

He believes virtual production is not yet being harnessed widely because it would mean a huge disruption and a sea change, but the current restrictions on traditional shooting indicate that its growth trajectory will take off strongly going forward.

Speaking about the advancements that have made this possible, he shares, “Well, certainly real-time game engines. That’s been at the heart of the ability to visualise in real time. Real-time game engines have been used for a while in what I’d call previz work, but heavy graphics computing and real-time game engines have now made it possible to create a sense of actual realism, so that you’re able to put it up on a screen, on LED walls, in real time.”

Hailing interactive lighting as a crucial aspect, he details, “Interactive lighting is a big part of what filmmakers are after: the ability to not have to add lighting afterwards, but to get real reflections. So it feels like you’re there, in the time and place the scene is meant to be.”

LED walls are not without challenges. To begin with, they are expensive. In addition, productions need to plan accordingly, deciding which shots to attempt with virtual production based on a host of technical details. He shares, “A lot of it comes down to the pixel pitch, whose cost is a huge factor; these LED walls are not inexpensive at all. The amount of light that they can put out, their colour accuracy, these are all factors that come into play. And how close you can get the camera, due to that pixel pitch, is a really big factor that you have to take into consideration. You can’t just move it around as you want, because you don’t want it to get too close.”
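A widely quoted industry rule of thumb (not stated in the article, so treat the numbers here as an assumption) is that the minimum comfortable camera-to-wall distance in metres is roughly the pixel pitch in millimetres, with more conservative productions applying a larger multiplier. A minimal sketch of that guideline:

```python
def min_camera_distance_m(pixel_pitch_mm: float, factor: float = 1.0) -> float:
    """Rule-of-thumb minimum camera-to-wall distance.

    Assumption: the common guideline that minimum viewing distance in
    metres is roughly the pixel pitch in millimetres (factor = 1.0);
    a larger factor models a more conservative margin.
    """
    return pixel_pitch_mm * factor

# A 2.6 mm pitch is a plausible spec for a virtual production wall.
print(min_camera_distance_m(2.6))        # 1:1 guideline -> about 2.6 m
print(min_camera_distance_m(2.6, 2.0))   # conservative 2x margin -> about 5.2 m
```

Push the camera closer than this and the individual LEDs risk resolving into visible pixels or moiré in the recorded frame, which is exactly the constraint Baggelaar describes.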

Hive Division virtual production supervisor Erik Caretta

Speaking about the projects in which they have leveraged this technology, Baggelaar shared, “We have Shark Tank, that’s been using that for the past couple of seasons in four segments of their show. We have several other projects that will be coming up in the future. It’s been used on a capture for Men In Black: International; their headquarters was used in some commercials.”

Speaking at a presentation for Hive Division, another player in the virtual production space, virtual production supervisor Erik Caretta shared, “We first had a taste of the solution last year, when we took care of the visual effects of Il talento del Calabrone, a film directed by Giacomo Cimini and produced by Paco Cinematografica. We created the illusion of the shooting actually taking place on top of a Milan skyscraper by first shooting the actual footage on location and then projecting the plates onto the filming stage, thanks to an array of video projectors. This experience proved all the benefits of finally getting rid of green screen. Unfortunately, projecting pre-rendered images or videos still leaves you with some limitations: even a small amount of parallax can quickly dispel the illusion. So for the past year we have been studying ways of using Unreal Engine to project a background rendered in real time based on camera position, and using LED walls instead of projectors, in order to achieve enough brightness to potentially reproduce any kind of environment.”

Explaining the process they followed to navigate the bottlenecks and pull off a successful virtual production, he elaborates, “And then, as usually happens in our field, ILM did it first with The Mandalorian, which applies the very same technology to spectacular results. We had to organise a test on a slightly bigger scale than we had done before, so we teamed up with 3P, our partner company specialising in audiovisual solutions, and decided to set the scene in the same world as our latest short film, La Fiamma. The technology is based on three different phases. Firstly, with a motion capture system you detect the 3D position of the camera on stage. Then you use those 3D coordinates to determine how the environment would look in the field of view of the different LED walls, as seen from the perspective of the camera. And finally, you have those rendered images projected on the LED walls. All of this must happen faster than the camera can record a single frame. We’re only scraping the surface of what this technology can do.”
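The three phases Caretta describes — track the camera, render the environment from its perspective for each wall, and display the result before the camera records the next frame — can be sketched as a per-frame loop. The function bodies below are hypothetical stand-ins, not any real tracking or engine API; the point is the structure and the frame-time deadline.

```python
import time

FRAME_BUDGET_S = 1 / 24  # all three phases must finish within one camera frame

def track_camera_pose():
    """Phase 1 (stand-in): read the camera's 3D position and orientation
    from the motion capture system."""
    return {"position": (0.0, 1.8, -4.0), "rotation": (0.0, 0.0, 0.0)}

def render_wall_view(pose, wall):
    """Phase 2 (stand-in): render how the virtual environment looks through
    this LED wall from the tracked camera's perspective, as a real-time
    engine would with an off-axis projection."""
    return {"wall": wall, "pose": pose}  # placeholder for a frame buffer

def display_on_wall(frame):
    """Phase 3 (stand-in): push the rendered image to the LED wall."""
    pass

def run_one_frame(walls):
    """Run one tracking/render/display cycle; report whether it met the
    frame-time deadline Caretta mentions."""
    start = time.perf_counter()
    pose = track_camera_pose()                         # phase 1
    for wall in walls:
        display_on_wall(render_wall_view(pose, wall))  # phases 2 and 3
    return time.perf_counter() - start <= FRAME_BUDGET_S

print(run_one_frame(["left", "centre", "right"]))  # True when under budget
```

In a real volume, phase 2 is where the heavy lifting happens, which is why Baggelaar credits real-time game engines with making the whole approach viable.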

With virtual production emerging strongly on the horizon, the day is not far when producers will be able to conveniently kiss green screens goodbye.