Multicam Live Production in a Virtual Environment

Tom De Wispelaere, Dries Tastenhoye, Vincent Van Werde, Gregg Young, Willem Vermost

Hardly any movie is made without the use of visual effects (VFX). The power of today's graphics processors allows many of these effects to be rendered in real time, which opens up the possibility of recording in-camera VFX. This technique has been used in the making of several recent movies. The TV show “The Mandalorian” uses a large active LED wall to display its 3D scenery, so the actors, director, and camera operators see the sets in real time instead of a green screen. Bringing this innovation to the television studio presents several challenges: supporting a multicam setup with synchronous switching of what is displayed on the LED wall, and maintaining a consistent depth of field, to name just two. We overcame these obstacles with the Ketnet live show “Gouden K's”. Beyond big productions, we believe this technique may be even more beneficial in small productions with a limited technical crew. With a second project, “PeetieClub”, VRT explored the possibilities and limitations of virtual studio production using game engine technology. The software-based solution we deployed, running on common PC hardware, allows for great flexibility, creativity, and high-quality content in real time. Using Unreal Engine and PTZ cameras, we built a complete interactive 4-input virtual production switcher engine on a single main workstation. Both projects are described, including lessons learned from an operational point of view.

Published
2021-11
Content type
Original Research
Keywords
extended reality (XR), live, multicam, game engine, LED wall, character animator, on-set virtual production
DOI
10.5594/M001932