While ray tracing has been adopted to make gaming more realistic, the technology took live broadcasting to another level at the opening of the League of Legends Pro League regional finals in Shanghai. The live show host interacted with an augmented reality gaming character in a real-time mixed reality broadcast. The character replied to interview questions in real time and performed with other dancers on stage.
To achieve this level of realism, Riot Games used The Future Group’s Pixotope mixed-reality virtual production software, Cubic Motion for real-time facial animation, Animatrik for managing motion capture, and Stype for camera tracking.
The combination creates real-time photorealistic graphics and visual effects on live television — and takes immersive mixed reality to the next level.
Live production has traditionally meant maintaining standard broadcast frame rates, which left no headroom for compute-intensive ray tracing. The RTX-powered Pixotope software now makes it possible for studios to take advantage of real-time ray tracing without dropping frames.
With Pixotope, Riot Games and The Future Group were able to tap into the NVIDIA Turing architecture’s RT Cores for ray tracing, Tensor Cores for denoising, and CUDA cores for shader computing to achieve new levels of realism — from the varied lighting across the stage to soft shadow effects.
RTX-powered ray tracing, along with real-time facial animation, helped integrate the animated character into the broadcast so that its movements and reactions matched those of the presenters and dancers on stage.