

LIONSGATE LIKES TO TAKE RISKS, AND SO DO WE

This was the first time the software would be used to create a live event around feature film content. Character Animator has a simple but highly configurable motion capture feature, which captures motion data through something as simple as a webcam and allows for puppetry of an illustrated character in real time. The technology allows a flexibility and opportunity for live characterisation that would previously have required a huge team and many coding hours.
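Adobe's rig itself is proprietary, but the idea behind webcam puppetry is straightforward to sketch. The following Python example is not Character Animator's implementation; it is a minimal illustration using the open-source OpenCV and MediaPipe libraries, where the landmark indices and the 0.03 open-mouth threshold are assumptions chosen for demonstration.

```python
import cv2
import mediapipe as mp

# Upper and lower inner-lip points in MediaPipe's 468-landmark face mesh.
UPPER_LIP, LOWER_LIP = 13, 14

def mouth_openness(landmarks):
    """Normalised vertical gap between the inner lips (0 = closed)."""
    return abs(landmarks[UPPER_LIP].y - landmarks[LOWER_LIP].y)

cap = cv2.VideoCapture(0)  # default webcam
with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_face_landmarks:
            lm = result.multi_face_landmarks[0].landmark
            # Swap between discrete mouth poses, the way a rigged 2D
            # puppet swaps artwork rather than deforming a 3D mesh.
            pose = "open" if mouth_openness(lm) > 0.03 else "closed"
            print(pose)
        cv2.imshow("performer", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
cap.release()
```

A production rig maps many more measurements (eyebrows, head rotation, gaze) onto the character's swap sets in exactly the same way.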
TACKLING THE PONIES

Lionsgate introduced AvatarLabs to My Little Pony animation studio DHX Media. The team were given access to a library of character assets used for the feature film and collaborated with the DHX animation teams to make sure that specific character poses, mouth positions and head movements were authentic to the final feature animation. DHX also supplied short custom animations, which could be keyed during the streaming event.
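Character Animator exposes pre-made clips like these as keyboard-mapped triggers that an operator fires during the performance. The Python sketch below illustrates the pattern only; the key bindings and clip names are invented, since DHX's actual clip library is not public.

```python
import sys

# Hypothetical key bindings to short pre-built clips.
TRIGGERS = {
    "1": "wave",
    "2": "laugh",
    "3": "gallop",
}

def clip_for(key):
    """Return the animation clip bound to a key, or None."""
    return TRIGGERS.get(key)

if __name__ == "__main__":
    print("Type 1-3 and press Enter to key a clip; q quits.")
    for line in sys.stdin:
        key = line.strip()
        if key == "q":
            break
        clip = clip_for(key)
        print(f"keying clip: {clip}" if clip else "no trigger bound")
```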
Real-time character interaction was not the only element used to create the illusion of a real-time animated reality. Another was the ability to choose multiple camera angles of the live animated action. “We wanted it to feel like a live cartoon,” explains James Safechuck, AvatarLabs’ director of innovation and technology. “We wanted to have different virtual cameras and be able to cut around the scene, and camera movements with nice, fluid visuals.” The team chose to bring the Adobe Character Animator feed into the Unity 3D game engine, which Avatar has used in the past for projects involving mobile games, AR and VR. Unity had recently released a camera system that allowed for creating in-game cameras and cinematics. This would allow the animated characters to inhabit a 3D virtual space which could be captured from different points of view on the fly.
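The camera system referred to is presumably Cinemachine, which Unity made freely available in 2017 and which chooses its live shot by comparing virtual-camera priorities. The Python sketch below is illustrative only (a real implementation would live inside Unity); it captures that selection pattern: demote every camera, promote one, and the cut is instant.

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    """A named viewpoint on the animated scene."""
    name: str
    position: tuple
    target: tuple
    priority: int = 0

class CameraDirector:
    """Whichever virtual camera holds the highest priority is live."""

    def __init__(self, cameras):
        self.cameras = {c.name: c for c in cameras}

    def cut_to(self, name, priority=100):
        # Demote everything, then promote the chosen camera: an instant cut.
        for cam in self.cameras.values():
            cam.priority = 0
        self.cameras[name].priority = priority

    @property
    def live(self):
        return max(self.cameras.values(), key=lambda c: c.priority)

# Three angles on the scene, cut between as the show runs.
director = CameraDirector([
    VirtualCamera("wide", position=(0, 2, -10), target=(0, 1, 0)),
    VirtualCamera("close_up", position=(0, 1.5, -2), target=(0, 1.5, 0)),
    VirtualCamera("side", position=(5, 1, 0), target=(0, 1, 0)),
])
director.cut_to("close_up")
print(director.live.name)  # -> close_up
```

A blend rather than a cut falls out of the same structure: instead of switching instantly, interpolate position and target from the outgoing camera to the incoming one.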
Substantial technical rehearsals were run, with Adobe providing support throughout. Adobe technicians helped the AvatarLabs team tweak their delivery workflow to match
