FEED Winter 2020/21 Web

FIERY DISPLAY Usually quite the spectacle, Riot and its partners had to adapt to replicate the event’s stunning atmosphere for fans virtually

While the event was reduced in scope, it did not disappoint, thanks to stunning extended reality sets. Behind the scenes, Riot built further on its already well-virtualised production model, so that staff around the globe could work from the safety of their own homes.

FEED spoke to the esports tech lead at Riot Games, Scott Adametz. His nine-strong team, based in LA, is responsible for the backend infrastructure that supports big events like Worlds. “Technically this show is the most ambitious that we’ve ever done,” says Adametz. “In the middle of a pandemic, when we probably should have been downsizing, we thought bigger about how we could make the show better, despite limitations.”

These ambitions were visually apparent in the virtual sets. Riot’s team used extended reality to merge physical and virtual worlds, hosting early stage matches against the backdrop of a cyberpunk Shanghai skyline, or amid a flooded landscape.

One company responsible for the sets was Super Bowl half-time show regular Possible Productions. “Those guys really understand spectacle,” says Adametz. The other was Lux Machina, the in-camera and virtual production display company behind The Mandalorian’s sci-fi landscapes. More than 900 LED tiles were used for the sets in total, displaying visuals at 32K resolution and 60fps, rendered using a modified version of Unreal’s gaming engine.

Rather than sticking to a single point of view of the virtual environments around the players, two perspectives ran simultaneously at all times, enabling the broadcast team – using a four-camera shoot – to swap between different points of view while the background environments stayed coherent. According to Adametz, Worlds is one of the first productions to attempt this in a live broadcast environment. The biggest challenge it presents is timing and sync. “It’s easy to achieve in post, but we did this live with four simultaneous cameras. That’s another area where we had to take the next leap. How can you have an LED wall and a virtual extension stay in sync and switch between cameras, while not still seeing the view and perspective of the last camera?”

According to Wyatt Bartel, Lux Machina’s senior technical director, code developed by the teams at Riot, Possible and Lux Machina provided a solution to this problem.
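To picture the idea in the abstract – every name below is hypothetical, and this is in no way the code Riot, Possible or Lux Machina actually wrote – the core of the trick is to keep a second perspective rendering as a preview, then promote it to the live LED-wall feed on a frame boundary, so a cut never briefly shows the previous camera’s viewpoint:

```python
# Hypothetical sketch only: a simplified model of keeping an LED wall's
# rendered perspective in step with the active broadcast camera.
# Class and method names are illustrative, not any studio's real code.
from dataclasses import dataclass


@dataclass
class PerspectiveSwitcher:
    """Two perspectives render at once: the live one feeding the LED
    wall, and a preview warming up for the director's next cut. The
    swap happens on a frame boundary, so the background never lags
    behind a camera change."""
    cameras: list
    active: str = None
    preview: str = None

    def preselect(self, name):
        # Start rendering the next camera's perspective before the cut.
        assert name in self.cameras, f"unknown camera: {name}"
        self.preview = name

    def cut(self, frame_number):
        # On the cut frame, promote the pre-rendered preview to live.
        if self.preview is not None:
            self.active, self.preview = self.preview, None
        return frame_number, self.active
```

In a real system, the cut would be driven by the vision mixer’s tally signal and locked to house genlock rather than a bare frame number, but the swap-on-boundary logic is the same.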


