
SET-UP | INTERVIEW

IMAGES Arraiy's development team showing keying without a green screen

On some levels, even if you can do it at a cheaper rate, it's a solved problem. So I spent some time just consulting with them to narrow in on what they were doing and where the real value proposition was. The question was: how can we leverage their AI ideas for real-time content creation?

DEF: So you changed your focus away from hardware development?

MT: Yes, we made a shift, still using the same underlying technology, but then looked at how we could enable real time and really empower content creators. So the first thing we did was to focus on tracking, and now we have a tracking solution called DeepTrack – that's what we're releasing [during NAB Show]. It's a through-the-lens monocular tracking solution, which is 100% software based, just understanding the features and the textures of a scene. We calibrate the camera, which takes about 20 seconds, and we do some deep learning on the environment, whether that's in the studio or in a sports environment outdoors. We can create a model over three to four hours, depending on what the scene is, and then you have a known geometry of every scene. You can then enable any camera that's in that scene to camera track and to object track. So if NBC wants to shoot a football game or an athletics event and wants to use 100 cameras, we can enable all 100 cameras to do tracking, which enables them to do real-time graphics. That's the differentiator: we don't need hardware, we don't need stickers on the ceiling, we create a neural network that we can leverage across any cameras or scenes.
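Arraiy hasn't published DeepTrack's internals, but the pipeline MT describes (calibrate the camera, learn a model of the scene offline, then solve each frame's camera pose through the lens) maps onto a well-known structure. The sketch below is a minimal, hypothetical stand-in in Python: it substitutes classical ORB features and a pre-built 3D point map for Arraiy's learned scene model, and recovers the camera pose with RANSAC PnP. The function and variable names (track_camera, scene_points_3d and so on) are illustrative, not Arraiy's API; the OpenCV calls are real.

```python
# Minimal sketch of markerless, through-the-lens camera tracking.
# Stand-in for a learned scene model: a 3D point map with ORB
# descriptors built ahead of time (loosely analogous to Arraiy's
# offline "deep learning on the environment" step).

import cv2
import numpy as np

def track_camera(frame, scene_points_3d, scene_descriptors, K, dist):
    """Estimate one frame's camera pose against a known scene map.

    frame:              BGR image from any camera in the scene
    scene_points_3d:    (N, 3) 3D points from a prior reconstruction
    scene_descriptors:  (N, 32) ORB descriptors for those points
    K, dist:            intrinsics from a one-off camera calibration
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    if descriptors is None:
        return None

    # Match live-frame descriptors to the scene map (Hamming for ORB).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, scene_descriptors)
    if len(matches) < 6:
        return None  # too few 2D-3D correspondences for a stable pose

    pts_2d = np.float32([keypoints[m.queryIdx].pt for m in matches])
    pts_3d = np.float32([scene_points_3d[m.trainIdx] for m in matches])

    # Robust pose from 2D-3D correspondences; RANSAC rejects outliers.
    ok, rvec, tvec, _ = cv2.solvePnPRansac(
        pts_3d, pts_2d, K, dist, reprojectionError=3.0)
    return (rvec, tvec) if ok else None
```

Because the map belongs to the scene rather than to any one camera, the same loop can run for every camera at once, which is the point MT is making about enabling all 100 cameras without per-camera hardware.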

DEF: What does Arraiy offer the customer?

MT: We basically allow anyone to develop their own neural network for their own uses. At NAB Show 2019, we partnered with The Future Group to show our tracking solution, which many people are interested in. They have a great rendering graphics engine, so you can use MoSys or any other system. But if a customer wanted a fully integrated software-based solution, they could buy the Pixotope virtual reality system with the Arraiy-embedded tracking system. We'd then license our tracking system into Pixotope.

DEF: What's in development and how else can AI help in this field?

MT: The next thing we're developing is a segmentation rotoscoping solution, which essentially allows you to do Ultimatte-like green screen, but we can do it without a green screen – any type of flat field or brick wall or anything, with depth matting as well. We're looking to release that by the end of the year. Next year will be soft-object tracking, so we can do motion capture potentially without all of the sensors and all of the hardware around that.

DEF: Tell me a little about Arraiy. What kind of capital have you raised?

MT: The company is based in Mountain View, so it's really a Silicon Valley company – it's a venture-backed company. Last year, we announced a $10 million Series A round of funding led by Lux Capital and SoftBank Ventures, with participation from Dentsu Ventures and Cherry Tree Investments, and continued participation from IDG Capital and CRCM Ventures.

DEF: Who has shown an initial interest in these products?

MT: We've been talking with MPC and The Mill, who are both Technicolor companies, and working on how we can integrate into their virtual production. We don't want to reinvent the whole pipeline; they have pipelines and embedded solutions, so we just want to be able to offer them a licence for our software. We've also been having conversations with companies like Avid. All these companies have great products and services, and we just want to enable them. I say that we are really bringing the physical world to the digital world, and there are some people who are doing this with hardware-based solutions and we're doing it 100% in software.
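The segmentation rotoscoping MT describes swaps the chroma key for a learned matte: instead of keying on a green background, a network labels which pixels belong to the foreground subject. As a rough, hypothetical stand-in (not Arraiy's actual method), an off-the-shelf semantic segmentation model can already pull a soft person matte from any backdrop. The sketch below uses torchvision's pretrained DeepLabV3; person_matte is an illustrative helper, not a real API.

```python
# Hedged sketch of keying without a green screen, via semantic
# segmentation. A pretrained DeepLabV3 stands in for Arraiy's
# proprietary rotoscoping model.

import torch
import torchvision
from torchvision import transforms

model = torchvision.models.segmentation.deeplabv3_resnet101(pretrained=True)
model.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet stats
                         std=[0.229, 0.224, 0.225]),
])

def person_matte(image):
    """Soft alpha matte for the 'person' class (PASCAL VOC id 15).

    image: a PIL.Image in RGB; returns an (H, W) tensor in [0, 1].
    """
    x = preprocess(image).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)["out"][0]        # (21, H, W) class scores
    probs = torch.softmax(logits, dim=0)
    return probs[15]                       # person probability as alpha

# Composite against any plate: out = alpha * fg + (1 - alpha) * bg.
```

A matte like this works against a brick wall or a flat field because it keys on what the pixels are rather than what colour they are; a production rotoscoping tool would add temporal consistency and edge refinement on top.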

