might call ‘raw brain power.’ It imitates the trial-and-error problem solving our minds perform. Applying machine learning to a task is like asking an infinitely obsessive brain to try everything it can and come back with the results. When these are put before you for inspection, you either say ‘yes’ or ‘no’ – or maybe rate them according to how close they come to your specific requirements. The ML algorithm goes back and tries again, taking your feedback on board. It returns with more results and brings them to you once again for approval – whereupon you say ‘yes’ or ‘no’ a further time. Back and forth it goes, repeating the process until it can deliver remarkably accurate, useful results, or make uncannily precise predictions.

Through this process (oversimplified in this example), we now have speech, text and facial recognition, instant translation in any language, cars that almost always understand the difference between a pedestrian and a street sign, and recommendation engines that know what products we need before we even know we need them.

In the media industry, AI and machine learning are used extensively, most notoriously in content recommendation. ML is also making unmanageable databases useful and monetisable, able to identify and tag huge libraries in a fraction of the time it would take a human team. But can AI help us be more creative? Can AI actually make better content?

AI-POWERED MOCAP

Radical AI has a solution that uses AI to allow one person to do what once required a whole crew of Weta Workshop
technicians – live motion tracking, based only on 2D video input. No ping-pong balls and no leotards.

Radical leads with Core, its community-access tier, which makes the AI available at affordable rates: users upload video via any browser and download FBX animation data. Core has been widely adopted in post-production pipelines by grassroots content makers and developers.

The company’s Radical Live product streams live video to Radical cloud servers, where it is processed by proprietary AI. Animation data is then returned instantly for real-time ingestion into apps, games or other software. Live is also being billed as a ‘real-time multiplayer platform’ – in other words, a metaverse.
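The back-and-forth review cycle described earlier can be sketched as a toy loop in code. This is a minimal hill-climbing illustration, not any real ML library’s API: the rating function stands in for the human reviewer, and the function names, target value and step size are all invented for the example.

```python
import random

random.seed(0)  # seeded so the run is reproducible

def feedback_loop(rate, initial, rounds=200):
    """Propose a result, get a rating, keep it if the reviewer
    likes it better, then try a variation - and repeat."""
    best = initial
    best_score = rate(best)
    for _ in range(rounds):
        candidate = best + random.uniform(-1.0, 1.0)  # try something new
        score = rate(candidate)                       # 'how close is this?'
        if score > best_score:                        # feedback taken on board
            best, best_score = candidate, score
    return best

# The 'reviewer' here rewards answers close to a target value of 7.
result = feedback_loop(lambda x: -abs(x - 7.0), initial=0.0)
```

After enough rounds of propose-rate-adjust, the loop homes in on what the reviewer wanted – the same shape of process, vastly simplified, that the article describes.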
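To make ‘real-time ingestion’ concrete, here is a minimal sketch of what consuming one frame of streamed animation data might look like on the client side. Radical’s actual stream schema is not documented here, so the frame format, joint names and `ingest_frame` function are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class JointPose:
    """One joint's pose in a single streamed frame (hypothetical schema)."""
    name: str
    rotation: tuple  # (x, y, z) Euler angles in degrees

def ingest_frame(frame, skeleton):
    """Apply a frame of animation data to an in-memory skeleton,
    as an app or game engine might do on each tick."""
    for pose in frame:
        skeleton[pose.name] = pose.rotation
    return skeleton

# Each incoming frame updates the skeleton's joint rotations in place.
skeleton = {}
frame = [JointPose("hips", (0.0, 10.0, 0.0)), JointPose("spine", (5.0, 0.0, 0.0))]
ingest_frame(frame, skeleton)
```

In a real pipeline the frames would arrive continuously from the cloud service, and the receiving application would retarget them onto its own character rig rather than a bare dictionary.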