Definition Live Spring 2023

The inaugural issue of Definition Live kicks off with an overview of LED-based extended reality production, which has been accelerating since the pandemic hit. We also explore the latest in technology-rich, social-media-primed immersive experiences in stadia and entertainment venues. For gear aficionados, we dive into the newest and most popular intercom solutions, as well as forecasting the technology from this year’s ISE show in Barcelona.




Totally immersive

How experiential events are feeding the appetites of live audiences


04 AUDIO-VISUAL ALCHEMY What to look forward to at the big AV event in Barcelona
11 XR TRAILBLAZERS Explore the runaway success of LED-based extended reality
20 TRANSPORT TO NEW DIMENSIONS The seductive power of immersive experiences
29 STAYING CONNECTED The newest and most popular intercom solutions





BRIGHT PUBLISHING LTD
Bright House, 82 High Street, Sawston, Cambridgeshire CB22 3HJ, UK

EDITORIAL
EDITORIAL DIRECTOR Roger Payne
ACTING EDITOR Robert Shepherd
DEPUTY CHIEF SUB EDITOR Matthew Winney
SUB EDITORS Harriet Williams, Ben Gawne
CONTRIBUTOR Adrian Pennington

ADVERTISING
SALES DIRECTOR Sam Scott-Smith 01223 499457
SALES MANAGER Emma Stevens 01223 499462


Definition is published monthly by Bright Publishing Ltd, Bright House, 82 High Street, Sawston, Cambridge CB22 3HJ. No part of this magazine can be used without prior written permission of Bright Publishing Ltd. Definition is a registered trademark of Bright Publishing Ltd. The advertisements published in Definition that have been written, designed or produced by employees of Bright Publishing Ltd remain the copyright of Bright Publishing Ltd and may not be reproduced without the written consent of the publisher. The content of this publication does not necessarily reflect the views of the publisher. Prices quoted in sterling, euros and US dollars are street prices, without tax, where available or converted using the exchange rate on the day the magazine went to press.


DESIGNER AND AD PRODUCTION Man-Wai Wong
JUNIOR DESIGNERS Hedzlynn Kamaruzzaman, Kieran Bitten


3. JANUARY 2023


Audio-visual alchemy

Convening the core of the AV industry, ISE 2023 promises tech wonders

WORDS. Adrian Pennington & Robert Shepherd IMAGES. Various



“Many vendors cannot be easily corralled into predefined categories. Their innovations are used in a multitude of vertical markets”

The divide between professional AV and broadcast market sectors used to be considerable. So much so that there was deemed little demand for a dedicated show for Europe’s AV integrators until 2004, by which time its broadcast counterpart, IBC, was regularly attracting 40,000 visitors to Amsterdam. The last decade has witnessed spectacular growth on the AV side, with attendee numbers at ISE 2023 in Barcelona expected to top 50,000 – though still short of its 2019 record of 81,000, as the hangover from Covid-19 continues. By contrast, visitor numbers to broadcast kit shows have stagnated, reflecting the growing potential of an AV industry valued by trade body Avixa at $75.5bn in 2023, with growth continuing at 5.65% through to 2027, hitting $91.8bn in EMEA alone.

What is striking is that much of this growth can be attributed to a digitisation and IT-ification of audio-visual tech. While digital video and advanced audio formats have democratised the market for media and entertainment, the application of the same technology is far wider in AV. AV embraces every sector outside that of M&E – from digital signage in retail and airports, to interactive museum exhibits, personalised hotel and hospitality guest engagement,

IN THE LIMELIGHT AV is experiencing a boom, as its uses expand far beyond the broadcast sector

military-grade VR and enterprise-wide unified communications. All of these increasingly rely on video and deploy technology that underpins even the most innovative broadcast programming. The clearest examples lie in live events, many of which are combining cinematic techniques, interactive content and theatrical design to provide visitors with a sensory experience of fantasy or photoreal environments, with a depth of immersion they cannot enjoy at home. This technology is being incorporated into the very fabric of certain structures to make the venue itself a destination. Crucially, the ability to immerse audiences in a shared live spectacular is something that solitary virtual reality does not yet come close to. Another notable technology convergence is the use of LED screens beyond traditional signage. Virtual production techniques, with live-action photography shot on stages ringed by LED backdrops, are becoming a staple of narrative storytelling as much as of corporate messaging. Related technologies such as game engines and powerful processors are fuelling augmented and mixed reality experiences. Extended reality (XR) – the catch-all for VR, AR and MR – is playing a fundamental role in shaping our future, from digital learning and museums to CAD design and digital twins for research and engineering. Much of this convergence is being underpinned by the shift to standards-based AV over IP, where the common goal is interoperability and independence from proprietary technology. Various solutions

across the ISE show floor include SMPTE ST 2110, the ST 2110-based IPMX, NDI and, on the audio networking side, Dante, AES67 and Milan. The technologies differentiating AV from broadcast were shunted closer together during the pandemic. Other obvious examples include the adoption of virtual meeting tools as a broadcast staple, while broadcast workflows are being utilised more frequently in enterprise videoconferencing. The two industries are unlikely to converge completely, since the principal applications remain different. An example of this in sports streaming is the advance of ultra-low latency, powered by 5G networks and edge compute, to deliver 8K VR, real-time betting and viewer-selectable angles of the match. However, much broadcast kit is little different to that used in the enterprise. Hence, many vendors cannot be easily corralled into predefined categories. Their innovations are present in a multitude of vertical markets, cross-pollinated by the skills of specialist integrators to deliver new and exciting ways of communicating. Which is why more nominally broadcast tech vendors are finding a home at ISE.

LISTEN AND LEARN ISE 2023 will feature a brand-new audio demo suite




ATOMOS Hall 5 Stand F210

Atomos will be showcasing its suite of cloud-based tools for live production. Using Atomos Cloud Studio, video creatives can remotely and collaboratively produce a live show – all from an iPad. Easily accessible tools include a fully featured four-input video switcher, sound mixer, video effects, graphics, tally, talkback and more. Source inputs can come from a camera-mounted Atomos Connect device – for example, any Ninja V or Ninja V+ fitted with an Atomos Connect accessory, or the all-in-one Shogun Connect. In addition, Atomos offers the Pro Camera iOS app, a powerful 4K video capture app for filmmakers on Apple iPhone. Program output can be routed to any popular social media platform or RTMP/S destination, in addition to monitoring feeds sent to iOS, macOS or Apple TV apps.

7THSENSE Hall 5 Stand E250

7thSense is launching a range of products at ISE, including R-Series 10 – a 3U server hardware platform designed to meet the needs of the most demanding generative projects and virtual productions. As a solution that streamlines filmmaking workflows, R-Series 10 is available to purchase from 7thSense as a standalone hardware product and comes ready for use with the user’s preferred generative engines, including Unreal Engine, Unity, Notch and more.

BROMPTON TECHNOLOGY Hall 3 Stand S800

This year’s ISE will see Brompton Technology showcase its newest Tessera 3.4 software, with features like extended bit depth and stacking improving image quality and flexibility for large-scale live events, virtual productions, film and TV. Two key on-camera features, Frame Remapping and ShutterSync, will also be demonstrated on the stand.

ROE VISUAL Hall 3 Stand C850

Bringing its latest innovations to the ISE exhibition, Roe Visual will showcase products and technologies tailored to its activities in vertical markets such as broadcast, film, live events and fixed installation. Roe Visual will present the Ruby 1.9B V2 LED panel – a high-performance, broadcast-grade HD LED panel designed for broadcast and virtual production applications, offering cutting-edge LED design and technology. Other showcased products will be the Ruby-C 2.3, Ruby 2.6, Graphite 2.6, Opal and Black Quartz LED solutions.

GHOSTFRAME Hall 3 Stand C850

A fully fledged XR stage set-up will showcase ‘game-changing’ GhostFrame technology. This innovative system will be demonstrated live throughout ISE to highlight its workflow benefits for broadcast and film productions. Recent projects like the Fox NFL studio and TVN studio installations use GhostFrame in combination with Roe Visual Black Pearl LED panels. Members of the GhostFrame team will be at the stand to explain the technology and its many benefits.

“Using Atomos Cloud Studio, you can remotely and collaboratively produce a live show – all from an iPad”



BIRDDOG Hall 5 Stand C375

BirdDog’s mission is to make live television production easier, more cost-effective and achievable in more places by more people. By enabling live video to travel over the standard computer networks present in all modern buildings with no compromise, BirdDog believes ‘it’s time to move live video production to the internet age’.

“Its remote heads and robotics are used on many Hollywood blockbusters and high-end TV shows including Gravity, Life of Pi and Westworld”

ABSEN Hall 3 Stand N300

Absen returns to ISE to showcase its latest innovations in the virtual production, corporate, MICE and live events sectors. Within ISE’s dedicated Multi-Technology Zone at Hall 3 Stand N300, Absen will engage visitors with a dedicated virtual production studio experience and present its latest AV technology.

CLEAR-COM Hall 5 Stand G820

Clear-Com will highlight the Arcadia Central Station with HelixNet integration. Arcadia is a scalable IP intercom platform that integrates wired and digital wireless partyline systems along with third-party Dante devices in a single rack unit.

MO-SYS Hall 6 Stand H310

Mo-Sys is a manufacturer of virtual production solutions, training, camera tracking, image robotics and remote production. Its remote heads and robotics are used on many Hollywood blockbusters and high-end TV shows including Gravity, Life of Pi, Westworld, Birdman and Tron: Legacy. Mo-Sys is registered as a co-exhibitor on the Alfalite stand.




Since 2012, Brompton Technology has been at the forefront of LED video processing. We look back to see how it became the powerhouse it is today

CELEBRATING TEN SUCCESSFUL years in business, Brompton Technology’s revolutionary approach to LED video processing began much earlier than 2012. According to CEO Richard Mead, the founders met while working as a team of designers, software developers and hardware engineers at Flying Pig Systems, the iconic lighting console manufacturer, now part of ETC-owned High End Systems. Following Flying Pig’s purchase at the start of 2004, a group of seven, including Mead, formed Carallon – with a mission to develop high-quality control systems for the entertainment industry. Over the following years, several spin-off companies were born, the second being Brompton Technology, focused on serving the LED video processing needs of the top end of the live events industry.

At the time, the world of LED video was in a period of significant change. The market had been dominated by big brands like Barco, but Mead and the team saw the rental companies turning towards the increasing number of decent-quality LED panels being manufactured more cost-effectively in China. The one area where these products clearly fell short was processing. “The vision from day one was to plug that gap and develop a processing system that could be paired with the panels coming out of Chinese factories,” says Mead. “Brompton’s first customer was VER, one of the world’s biggest AV rental companies, now part of the PRG Group. By 2013, VER was using Brompton LED processing on the biggest live televised shows of the year. The first high-profile project was the Oscars, so we didn’t start small.”

The company grew steadily, remaining focused on live events until 2018, when new applications using LED screens as virtual backgrounds for film and television production started to emerge. The team saw a need for the same high-quality processing it had been providing for live events. “From the off, we’d prioritised making LED screens look good to both the eye and the camera, as most major live events are televised, so this new market was a natural fit for us,” explains Mead. In 2019, over 80% of Brompton’s business was still in live events. “But in 2020, as live events were suppressed, we focused on an area that was accelerated by the pandemic – virtual production and XR.”

“The first high-profile event to use our technology was the 2013 Oscars – so we didn’t start small”

GO WITH THE FLOW The company weathered losing out on live events in 2020 by developing virtual production workflows



FROM PARIS WITH LOVE Dark Matters’ studio in France offers end-to-end solutions, from Simulcam and in-camera VFX technology to motion capture

Early adopters of LED screens for film didn’t have a smooth ride. LED screens that weren’t designed with on-camera use in mind sometimes turned out to be unsuitable, and the wrong LED solution, or wrong pairing of screen and camera, would cause more problems than it solved. Brompton responded with a wealth of tailored features combined with rock-solid reliability, and worked hard to ensure LED screens became a collaborative partner in the creative process. “This is an ongoing process of working with users to understand the pain points, then developing features in our Tessera software, or integrating with complementary technologies to create streamlined virtual production workflows,” says Mead.

With the return of live events, and virtual production still going strong, there’s a rapid increase in demand for Brompton’s cutting-edge tour and virtual production-ready LED processing. To support these two core sectors, and to apply the company’s technology to new markets, Brompton secured a £5.1m growth capital investment from specialist private client alternative investments business Connection Capital. As part of its growth trajectory, Brompton recently moved into a new UK HQ in south-west London that includes upgraded demonstration and training areas, allowing customers to better experience the company’s latest offerings. It also has outposts in Shenzhen, Taipei and LA, and a mainland Europe office in Belgium. Brompton now has a 50-strong team and delivers product to over 100 countries. Its LED processing solutions are trusted on tours for some of the world’s biggest artists, most recently Metallica’s The Return Of The European Summer Vacation tour and Ed Sheeran’s Mathematics tour; on film and TV productions such as Westworld, House of the Dragon and Star Trek: Discovery; gaming tournaments like the latest League of Legends Mid-Season Invitational (MSI) and the first Wild Rift Icons Global Championship; and in renowned virtual production and XR studios globally. “We have a passionate and skilled team, a global network of partners and customers who put their trust in Brompton solutions,” concludes Mead. “Our industry-leading Tessera video processing technology, diversified revenue streams, soaring demand for our products, and ambitious plans make us very excited about the future of Brompton Technology.”



XR trailblazers

LED-based extended reality production has been accelerating ever since the pandemic hit. Now, it’s a widely deployed technology spanning everything from broadcast to live events. We immerse ourselves in the details

WORDS. Adrian Pennington IMAGES. Various



Extended reality (XR) – a virtual production technique making use of an LED volume to let talent see the environment they are supposed to be interacting with – continues to gain traction. It’s delivering immersive, innovative experiences in all kinds of scenarios – from reality TV shows and music performances to live sport.

“The whole ecosystem involved is just larger, as are the requirements for the technical crew,” says Marina Prak, marketing manager at Roe Visual Europe. “It touches on almost every discipline and product involved. Suddenly, all the stand-alone equipment needs to be interconnected, synced and working together flawlessly. That requires an awful lot of technical backbone to make it all happen. Workflows for all the staff involved need to be integrated and every aspect of the production must be thought through in pre-production.”

In live entertainment, XR allows artists to connect with their audiences – even if they may not be in the room with them. In 2020, one of the first major applications of XR was Katy Perry’s performance of Daisies at the American Idol finale. Instead of singing for a live audience, Perry sang her single on an LED stage with real-time graphics rendered on the LED walls and appearing in-camera. “She was immersed in a colourful virtual environment – allowing a virtual audience to experience an incredible performance,” says Tom Rockhill, chief commercial officer for Disguise, whose technology drove the event. “Since then, we’ve seen widespread adoption of XR.” In fact, more than 600 productions and 350 stages in over 50 countries are using Disguise’s XR solution. Last year, DJ Alan Walker used XR to power his virtual concert World of Walker for electric vehicle manufacturer Nio’s annual product launch day. In August, singer-songwriter Camila Cabello made a virtual album launch concert on TikTok using XR.

MATCH MADE IN HEAVEN Spurred on by the requirements for social distancing during the Covid-19 pandemic, XR is ballooning in popularity – marrying the physical and virtual worlds together in a thrilling blend

Karate Combat 35 also used XR to create a hybrid event where in-person attendees and a virtual audience could both see the fighters immersed in a CGI environment. The technology is being used by broadcasters, too, immersing viewers – and particularly guests – into the virtual environment. Plazamedia is at the vanguard, having commissioned Mo-Sys to deliver a turnkey LED studio in Germany ahead of the 2022 World Cup. Using Mo-Sys tech, the new studio blends the real set with the virtual world, creating the illusion of a 360° environment, and adding the creative freedom to deliver real-time 2D keyed and rich 3D on-air graphics into the scene

– together with multicamera switching at UHD 4K resolution. Initially, XR was expensive. There was a perception that you needed a huge volume similar to the enormous one used on Disney+ show The Mandalorian.

“The ecosystem is larger, as are the requirements for the technical crew. It touches on almost every discipline and product”

GOLDEN TROPHY Plazamedia deploys Alfalite LED volume to broadcast the Fifa World Cup in a cutting-edge hybrid studio



SURROUNDED MagentaTV made use of a 33m long LED wall consisting of around 60 panels in its 2022 World Cup coverage, helped by real-time camera tracking from Mo-Sys Startracker



A NEW FRONTIER Roe Visual talks through its innovative Ghostframe technology at IBC 2022 in the RAI Amsterdam

Now, productions of all sizes are using XR. All you really need is a tracked camera, an LED backdrop, content and sometimes just a basic Disguise set-up. “Since the technology and markets have matured and there are more studios around the world, companies no longer need to build a stage in order to do an XR production,” says Rockhill. “There are many available for hire. Disguise has a powerful community of studios based around the world that can help you.”

There are still challenges, notably around training. “There is definitely a knowledge and skills gap,” says Stephen Gallagher, marketing director at Mo-Sys. “Content producers who are considering making the transition to an LED virtual studio need to find a partner who understands the complete workflow, from camera tracking to lighting, and the nuances of LED screen technology. There aren’t many companies with that level of knowledge, capability and experience.”

Prak adds: “Not only do we suffer from a more general post-pandemic skills gap, but the skills required to make this all happen are scarce. We need qualified and trained operators and highly technical people capable of making the interconnected ecosystem from camera to LED screen work. The technical set-up and complexity of an XR stage is still underestimated. While a stable-performing LED screen capable of handling the required technical specs helps in the process, it’s not the most complicated part of the whole system.”

All the while, innovation is advancing. GhostFrame, developed by AGS, Megapixel VR and Roe Visual, offers live multisource display and capture. “It’s beneficial in virtual production or XR studios, for either film or broadcast, as it allows you to shoot scenes with multiple cameras in one take,” notes Prak. “Each camera can simultaneously shoot a different background. You can add your XR worlds with the option for hidden tracking or even have director cues only visible in the studio on the volume.”

“Each camera can simultaneously shoot a different background. You can add your XR worlds with the option for hidden tracking or even director cues”



Mo-Sys points to its BMR (broadcast mixed reality) solution, which combines an LED content server with a MOS-controlled on-air graphics system powered by Unreal. Designed for broadcasters who want to transition to an LED-based virtual studio, it is targeted at sport, news and current affairs programming. Disguise recently teamed up with Pomfort to reduce technical barriers between Livegrade Studio’s established on-set colour grading workflow and the Disguise virtual production workflow. Its cloud platform enables users to collaborate, store and access new versions of products for review. It has also evolved 2.5D workflows, meaning you can use flat 2D images in a 3D view to simulate a full 3D environment. “This gives anyone access to the look of complex and intricate 3D scenes in just a few clicks,” explains Rockhill. “On top of this, we’ve launched Disguise Metaverse Labs, offering R&D, consultation and creative services.”

That looks forward to a day when the metaverse ‘is going to alter human experience as we know it’, according to Rockhill. “XR provides the intersection and interplay between the physical and virtual worlds, to enhance performance, communication and production. This lets us build a gateway to the metaverse so that live acts can perform in metaverse platforms.” Artists like Ariana Grande, DJ Marshmello, Justin Bieber and Kaskade are already doing it, and a best metaverse performance category has been added to the VMAs. “By 2025, we can guarantee there will be a virtual or metaverse layer to almost all the experiences of today,” Rockhill reckons – all powered by today’s cornerstone XR technologies.

TURN IT UP The World of Walker virtual concert used Disguise graphics, with the help of 80six and Super Bonfire

“Extended reality provides interplay between the physical and virtual worlds, to enhance performance, communication and production”

STRIKING A POSE Silent Partners and XR Studios combined to create an ultramodern album launch for Camila Cabello’s Familia



GUIDING STAR Absen has hosted events to help explain new developments

INNOVATIVE LED FOR VP Virtual production is thriving, and Absen’s R&D department is working tirelessly to bring products to market. At its recent VP Forum in London, Absen showcased the PR series designed for VP applications

ABSEN, ONE OF the world’s leading LED display brands, renowned for its high-quality products and full-service capabilities, recently hosted the Virtual Production Forum at Garden Studios’ dedicated VP production facility in London. Working in collaboration with PSCo, Garden Studios and cinematography hire house Procam Take 2, Absen created an event combining talks, panel discussions and educational content, alongside a scripted live-action VP short, Going Home – complete with acting, direction and editing, all played out in real time. Brian Macauto, VP industry development director for Absen, explains: “We are increasingly seeing the impact of LED in filmmaking, and we are pleased to support the emerging virtual production trend by developing solutions and hosting events to help explain the latest innovative tech – and highlight new possibilities. We see this as a vibrant market full of potential and we want to continue to collaborate with exciting partners in this industry to keep pushing the boundaries of what we can achieve.”

The Forum saw the European debut of Absen’s new flagship VP solution, the PR 2.5. Built of carbon composite with an achievable weight as low as 22kg per m², the PR 2.5 is a lightweight, large panel, initially available in 2.5mm pixel pitch with 3.9 and 5.2 planned. The PR Series is designed as an all-in-one platform for both VP backdrops and ceilings, with a high contrast ratio. A revolutionary product with ultimate display performance, it offers a wide colour gamut and fast heat dissipation, and provides a 16% reduction in power consumption compared with older products. With the customised lock system, users can quickly complete installations including hanging, stacking and other layouts.

Garden Studios’ on-site production team also created a film theatre environment in the VP Studio, using the Absen PL2.5 Pro V10 as the backdrop – an ideal choice for a VP or XR stage with its superb visual performance. Absen will be showcasing these solutions alongside many others on its booth at ISE 2023, which takes place at the Fira Barcelona from 31 January to 3 February 2023.



SCREEN TIME GhostFrame technology is helping broadcasters stay ahead of the game

THE RAPID AUDIENCE shift from traditional television to digital distribution means staying relevant to viewers and ahead of the technological curve is a constant challenge for broadcasters. In other words, they must evolve at pace and shift from just a TV audience to serving audiences wherever they are – essentially, going from one platform to several. What’s more, streamlining the workflow is increasingly important, so broadcasters need a highly skilled operations team to secure a final high-quality camera output. “So, the key word here is automation, API exchange, streamlining the workflow and available metadata of all the components involved,” explains Marina Prak, spokesperson for GhostFrame. “The smarter these components become, and the more data they can exchange, the more they will contribute to optimising the workflows and production value.”

The partners that developed GhostFrame – AGS, Megapixel VR and ROE Visual – are all involved in LED screen display, processing and camera capture, and each is active in broadcasting solutions. GhostFrame, a new technology developed for virtual production applications, offers live multi-source display and capture with the help of an LED volume. Quite simply, it’s beneficial in virtual production or XR stages, for either film or broadcast, as it allows you to shoot scenes with multiple cameras in one take.

“Each camera can simultaneously shoot a different background or perspective,” explains Prak. “You can seamlessly add your XR worlds with the option for hidden tracking, or even autocue or director cues only visible in the studio on the LED volume, to help the talent interact with the background and/or mixed reality elements.”

Launched in 2021, GhostFrame – like the rest of the world – was unable to travel. So, it was introduced via online sessions. However, it didn’t take long for broadcasters to realise its potential. “Once people understand how it works and what creative options it offers, they become instantly enthusiastic,” adds Prak. “On screen, the naked-eye view and the monitors capture the different camera views. At first, it was hard to convince people that it’s not a trick – it happens in real time. But you need to see it to understand the actual value of what GhostFrame offers.”

To get a better idea of how GhostFrame works: the LED panels receive up to four dynamic images at the same time. In the second step, the graphical data gets multiplexed into the required production frame rate, with up to 1000fps generated in each LED panel. “This allows us to split up a single production frame into so-called sub-frame ‘slices’. All equipment in the studio keeps the same frame rate, like the cameras and media server,” says Prak. “But instead of only showing a single image for each frame, we are showing multiple images practically simultaneously.” She gives “a straightforward example”, where two separate videos play simultaneously. “Since the media server is still running at its intended frame rate, both videos will play back at regular speed,” Prak continues. “Megapixel’s HELIOS LED processor and PX1 card take up all the data and synchronise the content to the required needs of the production.”

When there’s a camera capable of double or triple-speed recording, it’s also possible to record both videos with different backgrounds from the same camera. This is the basic concept of GhostFrame, but it’s capable of a whole lot more. “Following the VCM technology by AGS, the processor can automatically invert frames. This is easy to configure in the web-based interface,” she adds. “Showing the desired content and its inverse image after each other results in a light grey. This can be either a regular video that would otherwise merge with the other content, or a chroma key for post-production touch-ups.” Moreover, the high frame rate of the LED panels and the possibility of tuning the brightness of each slice mean it’s possible to create a near-flicker-free experience for viewers’ eyes. At the same time, there might be two or three cameras seeing something completely different on the volume.

“The processor can automatically invert frames. This is easy to configure”

SECRET SAUCE

After its debut for the Audi world premiere of the Grandsphere concept car last year, there are numerous examples of where GhostFrame is currently being deployed. “Fox NFL Sunday has just implemented the technology, TVN is using the technology after a comprehensive testing procedure and the NEP Group is working with GhostFrame on several projects,” explains Prak. “Recently, GhostFrame was used for an esports event, Dota 2 – The International 11, and a pop-up event for a Justwhy production.”

Don’t just take ROE’s word for it. Zac Fields, senior vice president of graphic technology and integration at Fox Sports, explains how four cameras can each have their own view of, and frustum projection onto, the LED volume simultaneously. “To someone standing in the LED volume, the displays appear to have the pickup of the different cameras overlaid on each other,” he says. “However, the technology makes it possible for each camera to see only its own image, thereby giving the director an accurate preview of LED volume shots from the four cameras before deciding which one to take to air. GhostFrame is sort of like the secret sauce to making this work in live broadcast.”

One advantage afforded to GhostFrame in lockdown was that development teams were able to enhance the product, add final tweaks and act on feedback. GhostFrame was then launched to the public at IBC in September. The teams worked with camera and tracking partners to create an agnostic workflow and technology ecosystem. Stay tuned.

FLEXIBLE TECH With many networks and producers already switched on to the benefits of LED volumes, the multicam functionality has massive creative potential
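The sub-frame arithmetic Prak describes can be sketched numerically. This is a hypothetical illustration, not GhostFrame's actual software: it divides a high panel refresh rate into per-camera slices within one production frame, and shows why a frame followed by its 8-bit inverse integrates to a uniform light grey for the naked eye.

```python
# Hypothetical sketch of GhostFrame-style sub-frame multiplexing.
# The function names and slice-cycling scheme are illustrative assumptions.

def slice_schedule(panel_hz: int, production_fps: int, sources: list[str]) -> list[str]:
    """Assign each sub-frame 'slice' in one production frame to a source.

    With panels refreshing at panel_hz and cameras/media server locked to
    production_fps, each production frame holds panel_hz // production_fps
    slices, cycled across the sources (cameras, naked-eye feed, key plate...).
    """
    slices_per_frame = panel_hz // production_fps
    return [sources[i % len(sources)] for i in range(slices_per_frame)]

def eye_average(pixel: int, inverted: bool = True) -> float:
    """Average seen when a slice is followed by its inverse slice.

    For an 8-bit value v, the inverse is 255 - v, so the pair integrates
    to 127.5 - a flat grey - regardless of the original pixel value.
    """
    second = (255 - pixel) if inverted else pixel
    return (pixel + second) / 2

# A 1000Hz panel feeding a 50fps production gives 20 slices per frame:
schedule = slice_schedule(1000, 50, ["cam A", "cam B", "cam C", "naked eye"])
print(len(schedule))    # 20 slices per production frame
print(eye_average(30))  # 127.5
print(eye_average(200)) # 127.5
```

Whatever content each camera's slice carries, the invert trick means the eye-visible result stays neutral, which is the effect Prak describes for in-studio viewing and chroma-key plates.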

19. JANUARY 2023
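The inverted-frame trick described above has a neat arithmetic core: a pixel value averaged with its inverse always lands on the same mid-grey, which is why the alternating pair disappears to the eye while a fast-shuttered camera can still isolate either subframe. A toy sketch, assuming 8-bit-per-channel values (illustrative only, not vendor code):

```python
# Toy illustration of frame inversion: show a frame and its inverse in
# rapid alternation and the viewer's eye integrates the pair to uniform
# mid-grey, hiding the pattern from people in the room.

def invert(pixel):
    """8-bit-per-channel inverse of an RGB pixel."""
    return tuple(255 - c for c in pixel)

def perceived(pixel):
    """Temporal average of a pixel and its inverse, as the eye sees it."""
    inv = invert(pixel)
    return tuple((a + b) / 2 for a, b in zip(pixel, inv))

# Any colour averaged with its inverse lands on the same mid-grey:
assert perceived((0, 128, 255)) == (127.5, 127.5, 127.5)
assert perceived((10, 200, 42)) == (127.5, 127.5, 127.5)
```

The same cancellation is what lets a hidden chroma key ride invisibly inside the visible show content.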




Transport to new dimensions

From museums to themed attractions, stadia and location-based entertainment, visitors are being enticed out of their homes by technology-rich, social-media-primed immersive experiences

WORDS. Adrian Pennington IMAGES. Various



Physical attendance at mass events has returned with a bang, as music concerts, sports stadia and immersive entertainment venues feed audiences craving the live experience. The Covid-19 hiatus seemed to re-energise the whole sector. According to Avixa, the venues and events market in EMEA saw a 43% decline in pro AV spend during 2020 against 2019, as audiences largely stayed at home. Growth resumed in earnest last year, with 2023 pro AV revenues from the sector expected to top $9.4 billion. Fuelling this is a renaissance of the venue as a destination. Just as picture houses once afforded people a luxurious sensory escape, a multitude of venues are upping the ante in a bid to offer what streaming services – or, as importantly, virtual reality headgear – cannot: jaw-dropping multi-sensory storytelling in the company of strangers. “Immersive experiences are in vogue,” says Lluís Badosa, managing director of Dataton’s Spanish partner, inWO Smart AV. “This kind of experience hits a sweet spot in the convergence between culture and AV technology.” When it opens at The Venetian later this year, the MSG Sphere will house a 160,000 sq ft LED screen at 16x16K, claiming the crown of highest resolution in the world. It will be programmed to show experiential video and to support live events complete with haptics, scent emission and beam-forming audio. Its exterior is clad entirely in another wraparound display. Driving content management and video playback, including video processing of both displays, is technology from 7thSense. “We have spent a long time working to develop a very reliable, top-performance, super-high-bandwidth network storage solution,” explains Rich Brown, 7thSense CTO. “The magnitude of pixels needing to be rendered efficiently is extremely challenging. This is the biggest, highest-spec display we’ve undertaken.” Chief components from 7thSense are banks of Delta media servers; its FPGA-based pixel processor, Juggler, which plays back live and recorded camera feeds at low latency; plus a new generative product that integrates game engines into the live workflow. Already in Las Vegas and Atlanta, with plans for international export, is Illuminarium. This offers visitors a headset-free form of VR – either a safari or the surface of the moon – displayed on a 360º canvas using an arsenal of AV tech, such as laser projection, spatial beam-forming audio, in-floor haptics, scent and LiDAR-based interactivity. Brian Allen, EVP of technology and content integration, says: “Making content

for Illuminarium Experiences is not as straightforward as making a music video or a traditional film narrative. We think about everything from choosing a signature scent for the show to interaction design, to where people dwell for the longest in our theatres. There’s a lot of empirical data that goes with it, but also a lot of forethought in creating this overall attention-grabbing design.

“What makes us unique is not only the fact that we are a communal, digitally delivered experience, but we can change out the content incredibly quickly. That content ranges from real-time, to pre-rendered, to interactive and generative content.” This opens up a plethora of possibilities, ranging from stunning shows to real-time, choose-your-own-adventure video game content, to evening events with a DJ.

The Now Building in Soho currently claims the world’s largest wraparound screen installation. Opened last November, planned by Outernet with Qvest as systems integrator, its digital canvas is spread four storeys high in 16K resolution. Facilities include automated control and playback of content and immersive audio processing. In Manchester, the £186 million cultural hub Factory International, built on the site of the former Granada television studios, has super-sized,



“What makes us unique is not only that we are a communal, digitally delivered experience, but we can change out the content very quickly”

TAKE IT ALL IN Outernet London (top) is an immersive entertainment district in the capital. Lightroom (right) is home to a spectacular Hockney display



movable walls. An immersive, Matrix-themed dance, music and VFX experience directed by Danny Boyle was the opening production last October. Theatre and music performances are being restaged and reinvented using technology-led mixed media. The War of the Worlds: The Immersive Experience uses VR to engross participants in Jeff Wayne’s HG Wells musical; ABBA Voyage is a triumph of holographic performance using technology from Disney’s VFX experts at ILM. US theatre company Woolf and the Wondershow spent eight years developing the technology for rock

musical Cages, in which live performers interact with holographic characters created through intricate video mapping. Digital immersive exhibits are shaking up the traditional art world, too. Lightroom opens in King’s Cross this month with work by David Hockney. It’s a four-storey-high digital art space designed for cutting-edge projection. Frameless, in Marble Arch, is another. As The Guardian put it, “Frameless offers 90 minutes of Instagram-friendly, son et lumière experience across 30,000 sq ft of London bunker.” Co-curator Rosie O’Connor says: “Especially post-pandemic, people want that sense of connection and escapism. With all of us on our phones the whole time, I think we’re no longer able to just stand and look and get the same emotional connection.” Nothing says ‘wow’ louder than the screen itself, and the technology to deliver ultra-high-fidelity video to IMAG or super-sized IMAX-style screens is advancing all the time. Projection mapping turns any object into a display surface, enabling venues to ‘paint with light’. Further facilitating the digital experience are interactive displays for wayfinding, tables and video walls. These engage the visitor, amplify their experience and provide valuable data

“Projection mapping turns any object into a display surface, enabling venues to ‘paint with light’”

points for the venue owner to collect, and to retarget visitors with marketing. Now, 5G is set to supercharge venues with connectivity – and with it enable a host of previously impossible audience engagement and data-tracking applications. For example, the next phase of the 5G rollout (5G Advanced) will mark a milestone for the scalability of navigation and wayfinding in public venues. This will open up real-time mixed reality experiences, such as giving fans accurate positioning information on football players. 5G will also enable the application of AI and ML tools to enhance the venue experience, changing the way guests perceive physical spaces. “Of course,” says Badosa, “there’s always a risk of overexposure when it comes to a new trend, but as long as we find a good balance, these experiences will continue to attract the public.”

VISUALISE A proposed MSG Sphere in London (top). See inside Outernet’s Now Building (above)



GENERATIVE STORYTELLING COMES OF AGE Out-of-home experiences are truly pushing boundaries

AUDIENCES LEAVE THEIR homes to be wowed by experiences they simply cannot get in their living room. Venues are launching bigger, more expansive screens with resolutions many times that of regular high definition. These giant canvases are the focal point for a panoply of cutting-edge audio, lighting and haptic technology capable of delivering astonishing levels of immersion. In turn, we see this creating a dramatic shift from content which is essentially linear in nature to storytelling that is generative in response to the audience and/or the live performance.

As a technology partner for the MSG Sphere in Las Vegas, we’re working closely with the MSG team, as this groundbreaking new venue is set to become the apex of experiential innovation, giving audiences the most incredible live audio-visual sensory experiences on the planet when it opens later in the year. These experiences are overwhelmingly social, too. People want to enjoy them out of the home, surrounded by family, friends and strangers, for shared engagement in the moment. Synthetic metaverse environments accessed by virtual reality cannot begin to simulate this.

For XR stages and virtual production, as well as museums, planetariums, theme parks and night-time spectaculars, the number and variety of applications requiring synchronous playback of rendered and generative assets is growing all the time. 7thSense is the backbone to this innovation. Driving content management and video playback is Compere, our software platform that streamlines, simplifies and optimises workflow across unlimited playback and generative sources, as well as managing our Juggler pixel processors for layering in live video content. Our very latest media server hardware range, R-Series 10, is optimised to leverage

WORLD LEADERS Working with some of the most pioneering immersive experience companies, 7thSense is changing the game

“Conjurer, meanwhile, is for managing generative workflows. The effect on the audience is that of a magician creating something out of thin air”

the power of generative engines like Unreal Engine, Notch and Unity. Designed from the ground up and certified for professional use, this 3U chassis benefits from advanced PCIe 4.0 graphics cards and high-speed media storage. As with all our products, we rigorously test and verify all components to ensure they operate 24/7 under the most demanding of conditions. Written to run on the R-Series 10 are our new software applications. Actor is our media server software that ensures pixel-perfect playback of even the highest frame rate, uncompressed and ultra-high-resolution media – every time. Conjurer, meanwhile, is for managing generative workflows. The effect on the audience is that of a magician creating something out of thin air, but it is Conjurer, powered by R-Series 10, that enables generative engines to play together in

our hardware environment. Both Actor and Conjurer are part of the Compere ecosystem – our master of ceremonies, if you will – that enables end users to control everything from one single interface. These new technologies are being unveiled at ISE 2023. We are excited to tell you how the highest-quality video playback, pixel processing and control solutions available today are enabling creative artists all over the world to take generative storytelling to the next level. Get in touch, and visit us at ISE booth 5E250.



LEADING THE PACK Megapixel VR pushes the boundaries with SMPTE ST 2110 and Megapixel Cloud in the latest release of its HELIOS® LED Processing Platform

MEGAPIXEL VR, THE premier authority in video processing and cutting-edge professional display technology, has announced its new HELIOS LED Processing Platform software release, v22.11. With the platform’s powerful upgrades and features, HELIOS becomes the first LED processing platform in the industry to support SMPTE ST 2110, the professional suite of standards covering video, sync and metadata streams over IP for real-time production, playout and other professional media applications. HELIOS supports up to four simultaneous ST 2110 inputs for ultra-large canvases beyond 8K. Stitching capability allows for traditional 2x2, super-tall 1x4 or ultra-wide 4x1 applications. As a result, users gain ultimate flexibility for signal distribution and routing with a complete IP workflow, eliminating the need for dedicated signal conversion gear and legacy video signals with cable-length restrictions. “Our launch of ST 2110 support in HELIOS is just another exciting industry first that continues our investment in bringing cutting-edge technologies to the display industry. Along with the exciting new workflows that ST 2110 brings, we are also tremendously excited to be launching our Megapixel Cloud platform, which enables the seamless monitoring and operation of display systems from anywhere in the world,” says Megapixel CEO Jeremy Hochman. Megapixel Cloud is a new service that allows full remote access to display systems via the cloud – a game changer for central maintenance and support of retail deployments and architectural spectaculars, as well as time-sensitive live events and broadcast applications.
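The stitching layouts above are easy to sanity-check with some canvas arithmetic. Assuming four UHD (3840x2160) ST 2110 inputs — an illustrative choice, not a HELIOS specification — the three grids work out as follows:

```python
# Back-of-envelope canvas sizes for stitched multi-input layouts.
# Input resolution is an assumption for illustration (UHD per input).

def stitched_canvas(cols, rows, input_w=3840, input_h=2160):
    """Total canvas resolution for a cols x rows grid of identical inputs."""
    return (cols * input_w, rows * input_h)

assert stitched_canvas(2, 2) == (7680, 4320)    # traditional 2x2 -> 8K UHD
assert stitched_canvas(4, 1) == (15360, 2160)   # ultra-wide 4x1
assert stitched_canvas(1, 4) == (3840, 8640)    # super-tall 1x4
```

All three configurations use the same four inputs; only the arrangement of the canvas changes.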

“This launch is another industry first, continuing our investment in cutting-edge technologies”

Latest Updates
• Upload multiple images to the HELIOS platform and play back still images as test patterns
• Tile-generated test pattern
• Sync different tile types when using Tile Low Latency mode
• Eco-friendly, low-power tile mode
• New fibre stats report
• Scheduler functionality allows automation of functions based on the time of day
• Additional input signal failover behaviour options
• Display Camera Mode and GhostFrame subframe timings as shutter angles or frame fractions
• Custom EDID support
• Pixel error detection
• Hardened security with added support to upload SSL certificates
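The updates above mention expressing subframe timings as shutter angles or frame fractions. The two notations are interchangeable: a shutter angle is simply a frame fraction scaled to 360 degrees. A minimal sketch of the conversion (the function names are ours, not HELIOS's):

```python
# Shutter angle <-> frame fraction: 360 degrees represents one full frame
# interval, so a 180-degree shutter exposes for half the frame.

def angle_to_fraction(shutter_angle):
    """180 degrees -> 0.5 of the frame interval, and so on."""
    return shutter_angle / 360.0

def fraction_to_exposure_ms(frame_fraction, fps):
    """Exposure time in milliseconds for a given frame rate."""
    return frame_fraction * 1000.0 / fps

assert angle_to_fraction(180) == 0.5
assert fraction_to_exposure_ms(0.5, 25) == 20.0   # 180 deg at 25 fps = 20 ms
```

Either notation pins a subframe's timing unambiguously once the frame rate is known.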



PIONEERING PLATFORM HELIOS is the first LED processing platform that offers support for the SMPTE ST 2110 suite of standards




Intercom solutions are the nexus of production unit activity. We break down the newest and most popular kit on the market

WORDS. Robert Shepherd IMAGES. Various



Making a film or TV show can be a challenging and exhausting experience, regardless of the budget. As any crew member or actor will tell you, the production process is never as glamorous as what adorns the screen. Yet, while visual technological advancements continue to augment what viewers are paying to see, one thing remains constant: reliable communication. Two-way radios – walkie-talkies – used to be the staple of production units. But today, modern intercom systems combine the robustness of an analogue connection with the speech quality of a digital system and the latency of a closed system. In other words, things have moved on. When productions resumed following the Covid-19 hiatus, social distancing meant intercom solutions became ever more critical. There were questions concerning what to expect once the ‘new normal’ arrived. However, Francisco Herrería, technical sales and business development manager at Spanish

broadcast audio manufacturer Altair, doesn’t believe the pandemic irrevocably changed the direction in which technical developments have been moving. “It’s true that intercom use was boosted by the huge growth of streaming during the pandemic, but it did not affect the design of new devices – apart from the videoconferencing world, which is a separate subject.” NETWORKING Herrería explains that ‘the trend nowadays’ is to try to maximise the versatility of the intercom network and the number of channels in the system (groups and users), as well as the number of channels a user can access from their belt pack. “Also, integration with other brands and technologies is of great relevance,” he adds. “The use of IP technology, both for cable-based and wireless devices, is changing the landscape of the industry really quickly, with bigger and more flexible matrix-type systems. Several brands are now doing interesting things, offering a wide range of options for the user. That is a major change if we compare the current situation with a decade ago: there are now more manufacturers and a wider range of options – partyline systems, AES67 and Dante systems, other IP solutions.” One brand with a rich history in this space is Clear-Com, and senior product manager Stephen Sandford explains how reliability, flexibility and scalability are the key features to be considered when purchasing a solution. “The intercom is the glue that binds a production together, and needs to be dependable day in and day out,” he says. “It is incredible how critical intercom has become

in TV & film production today, and with the industry’s drive to reduce costs, a clear and reliable intercom system is invaluable in reducing time on-set. The system also needs to have the flexibility to adapt to a variety of different filming needs, whether that’s simply cueing

MAKE OR BREAK Good comms are crucial for smooth filmmaking, and flexibility to deal with change is key

Pliant Technologies CrewCom from Pliant Technologies is designed to help teams of all sizes easily and quickly deploy communications solutions, to connect more people in more places than ever before. Built on a highly scalable platform in which a family of products utilises a proprietary network, CrewCom is a versatile solution available in multiple frequency bands (900MHz and 2.4GHz in North America) as well as EU versions that meet European conformity requirements for use on the continent. CrewCom is designed to overcome communication challenges such as tough RF environments and inconsistent coverage, as well as total user limits that often occur with wireless systems.
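The partyline systems mentioned earlier rest on a simple mix-minus principle: each belt pack hears the sum of every channel on the line except its own, so users never hear themselves echoed back. A toy per-sample sketch of that idea, assuming float audio levels (names are illustrative, not any manufacturer's API):

```python
# Mix-minus for a partyline intercom: the feed sent to each listener is
# the sum of all contributions on the line minus the listener's own.

def mix_minus(samples, listener):
    """Sum all contributions on the partyline except the listener's own."""
    return sum(level for user, level in samples.items() if user != listener)

# Three users on one line; camera_1 hears the director and sound, not itself:
line = {"director": 0.4, "camera_1": 0.2, "sound": 0.1}
assert abs(mix_minus(line, "camera_1") - 0.5) < 1e-9
```

Matrix systems generalise the same idea: one mix-minus bus per listener, routed across groups and point-to-point channels.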


