Real-time rendering is quickly becoming one of the most influential technologies in film and game production. It has already been used in several virtual productions, largely because it gives cast and crew instant feedback. Successful projects include The Lion King and The Mandalorian, and companies both large and small are adopting the technology. Here are some of the ways companies have invested in real-time rendering and how their tools and techniques are shaping the industry.
Epic Games is one of the frontrunners of real-time technology, and a recent demo showed how these technologies come together to enable in-camera VFX during a live-action shoot. It featured an actor on a motorbike filmed against a series of LED wall panels. The imagery displayed on the wall blended with the physical set and could be altered on the fly. Filmmakers favor this type of shooting because the LED wall lights the actors and props naturally, producing more believable lighting than a green screen. It also gives them more predictability: they can adjust the scene and change locations as often as they want without removing the actors or props from the set.
The Render Pipeline
Unity Technologies is another company investing in real-time rendering through its Scriptable Render Pipeline. It allows game developers to adjust and optimize how rendering is performed inside the engine. Using the pipeline, developers choose which rendering algorithms to apply; they can take advantage of pre-built templates and solutions, or create a customized render pipeline from scratch. This gives the developer much finer control over how each frame is produced.
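Unity's actual Scriptable Render Pipeline API is written in C#; as a language-agnostic illustration of the underlying idea, here is a minimal Python sketch of a pipeline whose passes the developer assembles and orders themselves. All class and pass names here are hypothetical, not Unity's API:

```python
# Minimal sketch of a "scriptable" render pipeline: the engine calls
# render(), but the developer decides which passes run and in what order.
# All names here are illustrative, not Unity's actual API.

class RenderPass:
    def execute(self, frame):
        raise NotImplementedError

class ShadowPass(RenderPass):
    def execute(self, frame):
        frame.append("shadows")

class OpaquePass(RenderPass):
    def execute(self, frame):
        frame.append("opaque geometry")

class PostProcessPass(RenderPass):
    def execute(self, frame):
        frame.append("post-processing")

class CustomRenderPipeline:
    """Developers assemble the pass list themselves instead of relying
    on a fixed, built-in pipeline."""
    def __init__(self, passes):
        self.passes = passes

    def render(self):
        frame = []  # stand-in for a command buffer / frame in flight
        for render_pass in self.passes:
            render_pass.execute(frame)
        return frame

# A stripped-down pipeline for a stylized game might skip shadows entirely:
pipeline = CustomRenderPipeline([OpaquePass(), PostProcessPass()])
print(pipeline.render())  # ['opaque geometry', 'post-processing']
```

The point of the design is the swap: adding, removing, or reordering passes is a one-line change to the list, which is the kind of control the Scriptable Render Pipeline exposes to developers.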
Real-time ray tracing from traditional renderers
Real-time rendering is not only the domain of game-engine companies; makers of traditional offline renderers are investing in it as well. Chaos Group, for example, is developing its own real-time ray tracing application called Project Lavina. It will allow users to explore V-Ray scenes in a fully ray-traced environment in real time. When working in a 3D application, for instance, the scene stays ray-traced at all times, which eliminates toggling back and forth between preview and final frames.
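To give a sense of what a ray tracer computes, here is a minimal, self-contained sketch of ray-sphere intersection, the core visibility test at the heart of any ray tracer. Real-time engines run enormous numbers of these tests per frame on GPUs; this illustrative Python version is unrelated to Project Lavina's actual implementation:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit on the
    sphere, or None if the ray misses. `direction` must be normalized."""
    # Vector from the sphere center to the ray origin
    oc = tuple(o - c for o, c in zip(origin, center))
    # Quadratic coefficients for |origin + t*direction - center|^2 = r^2
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4.0 * c  # a == 1 for a normalized direction
    if discriminant < 0:
        return None                 # ray misses the sphere entirely
    t = (-b - math.sqrt(discriminant)) / 2.0
    return t if t > 0 else None     # hit must lie in front of the origin

# A ray shot down the z-axis toward a unit sphere centered 5 units away:
hit = intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
print(hit)  # 4.0 — the ray first touches the sphere's near surface
```

A full renderer repeats this test for every pixel against every object (accelerated by spatial data structures), then traces further rays for shadows and reflections, which is why doing it in real time is such a milestone.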
Virtual production tools
Glassbox is one company that offers contemporary solutions for filmmakers who think outside the box. Its tools, DragonFly and BeeHive, let users visualize virtual performances on a virtual set and make collaboration between the virtual art department and the virtual set more efficient. Although the underlying technologies existed before, these tools package them with a level of usability that was not previously available.
iClone by Reallusion includes a Motion Live plugin that captures full-body performances in real time and assists with character animation. The characters and motion data are transferred, along with the animation, cameras, and lights, to the Unreal Engine platform.
Real-time work usually focuses on non-human characters and environments, but it can also be used to create and enhance CGI humans. Digital Domain has built a studio dedicated to this technology. The company can drive digital avatars in real time, and its machine-learning techniques make it possible to transfer an actor's facial expressions onto the avatar. All of this has to happen within 1/6th of a second, which is why the technology is so revolutionary.
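The latency figure above implies a hard budget that every stage of the pipeline must share. As a toy illustration of that budget arithmetic, here is a sketch in Python; the stage names and timings are made-up placeholders, not Digital Domain's numbers:

```python
# Toy illustration of a real-time latency budget: every stage of a
# performance-capture pipeline must fit inside the total budget cited
# in the article. All stage timings below are invented placeholders.

BUDGET_S = 1 / 6  # ~167 ms total latency budget

stages = {
    "capture the actor's face": 0.030,  # hypothetical timing
    "ML expression solve":      0.080,  # hypothetical timing
    "render the avatar":        0.040,  # hypothetical timing
}

total = sum(stages.values())
print(f"total pipeline time: {total * 1000:.0f} ms")
print("within budget" if total <= BUDGET_S else "over budget")
```

The takeaway is that speeding up one stage (say, the machine-learning solve) frees budget for the others, which is why each component of such a system has to be engineered for latency, not just quality.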