HOW CAN BROADCASTERS AND STREAMERS GUARANTEE SYNCHRONISATION ACROSS DEVICES, NETWORKS AND REGIONS WHILE MAINTAINING LOW LATENCY?
DAN PISARSKI: The technique for synchronising different video streams – embedding timestamps in the out-of-band video data – was not common even a few years ago, but today it is (although not quite standardised). This technique involves adding global timestamps (such as those taken from NTP or PTP) into the headers of frames or GOPs for the video, so that the ‘absolute time’ of different streams can be compared to re-align frames that were produced at the same moment. Far more encoders support this technique today than ever before, and more production platforms use it to align frames – LiveU Studio’s cloud-native production solution is a good example. Sources of such timestamps are now easier to access, with PTP more common and NTP more accurate.

VENUGOPAL IYENGAR: Ensuring synchronised playback across devices and geographies while keeping latency low is one of the biggest challenges in live streaming. Frame-accurate synchronisation becomes difficult when viewers are using different platforms, network conditions vary and content is being processed and delivered through many nodes. Solutions typically combine timecode watermarking, network-aware buffering and CDN-level delivery alignment. Time-aligned manifests and device-level playback controls can help reduce drift. Advanced scheduling platforms can manage stream variations and enforce timing precision. Real-time analytics also play a critical role. By continuously monitoring latency variations and synchronisation issues, broadcasters can trigger corrective actions – adjusting delivery paths, changing buffer sizes or even switching between CDN nodes to maintain consistency. Ultimately, combining AI-driven adaptive scheduling, automated monitoring and cloud-based orchestration gives a strong framework for maintaining sync while minimising delay, ensuring a seamless viewing experience.
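As a rough illustration of the timestamp-alignment technique Pisarski describes, the Python sketch below pairs frames from two streams by their absolute capture time. The Frame structure, its field names and the 8 ms tolerance are assumptions made for the example only; they do not represent LiveU Studio's implementation or any particular encoder's header format.

```python
from dataclasses import dataclass
from bisect import bisect_left

@dataclass
class Frame:
    capture_ts: float   # absolute capture time (e.g. taken from PTP or NTP), in seconds
    data: bytes = b""

def align(reference: list[Frame], other: list[Frame],
          tolerance: float = 0.008) -> list[tuple[Frame, Frame]]:
    """Pair each reference frame with the other-stream frame whose
    absolute capture time is closest, within a tolerance (seconds)."""
    other_ts = [f.capture_ts for f in other]   # assumed to be in capture order
    pairs: list[tuple[Frame, Frame]] = []
    for ref in reference:
        i = bisect_left(other_ts, ref.capture_ts)
        # Candidate neighbours on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(other)]
        if not candidates:
            continue
        best = min(candidates, key=lambda j: abs(other_ts[j] - ref.capture_ts))
        if abs(other_ts[best] - ref.capture_ts) <= tolerance:
            pairs.append((ref, other[best]))
    return pairs

# Example: two 50 fps cameras stamped against the same clock, one offset by 4 ms.
cam_a = [Frame(capture_ts=t * 0.020) for t in range(5)]
cam_b = [Frame(capture_ts=t * 0.020 + 0.004) for t in range(5)]
print(len(align(cam_a, cam_b)))   # -> 5 matched frame pairs
```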
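Iyengar's monitor-and-correct loop can be sketched in the same spirit: compare each player's reported playhead against a shared target and nudge playback to reduce drift. The report fields, thresholds and action names below are hypothetical; a real deployment would act through the player and CDN control APIs rather than returning strings.

```python
from dataclasses import dataclass

@dataclass
class PlayerReport:
    player_id: str
    playhead: float       # media time currently on screen, seconds
    buffer_level: float   # seconds of buffered media ahead of the playhead

def corrective_actions(reports: list[PlayerReport],
                       target_playhead: float,
                       max_drift: float = 0.25) -> dict[str, str]:
    """Decide, per player, how to pull playback back towards the target."""
    actions: dict[str, str] = {}
    for r in reports:
        drift = target_playhead - r.playhead   # positive: this player is behind
        if abs(drift) <= max_drift:
            actions[r.player_id] = "play_normal"
        elif drift > 0:
            # Behind the shared target: drain buffer by playing slightly faster,
            # or jump forward if there is too little buffered media to catch up smoothly.
            actions[r.player_id] = "speed_up" if r.buffer_level > drift else "seek_forward"
        else:
            # Ahead of the shared target: slow down and let the buffer grow.
            actions[r.player_id] = "slow_down"
    return actions

# Example: three players reporting against a target playhead of 120.0 s.
reports = [
    PlayerReport("tv", 119.95, 2.0),
    PlayerReport("phone", 118.80, 3.5),
    PlayerReport("browser", 120.60, 1.0),
]
print(corrective_actions(reports, target_playhead=120.0))
# -> {'tv': 'play_normal', 'phone': 'speed_up', 'browser': 'slow_down'}
```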
they have a dedicated air/cable/satellite channel. Closing the gap requires very low-latency contribution (using HEVC ULL), a suitable distribution protocol such as LL-HLS (supported end-to-end) and consumers with an appropriate network connection.

MATTHEW WILLIAMS-NEALE: Traditional broadcast remains superior in real-time delivery due to dedicated infrastructure and deterministic transmission paths, which ensure minimal delay. Streaming introduces latency through various stages – buffering, adaptive-bit-rate selection and CDN propagation – particularly over public or heterogeneous networks. However, advancements are helping narrow this gap. Optimised encoding schemes, better edge compute integration and smarter transport protocols now enable sub-second streaming with robust reliability. Ensuring synchronisation, timing control and latency tuning across workflows is essential. Progress will continue as livestreaming infrastructures adopt more dynamic, modular architectures capable of handling high-throughput, low-latency video workflows – while offering the flexibility to adapt to changing demands.
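To make the latency discussion concrete, the sketch below tallies a rough glass-to-glass budget for a chain of the kind described above (ULL contribution encode, cloud packaging into LL-HLS parts, CDN edge delivery, player buffer). The per-stage figures are illustrative assumptions chosen for the arithmetic, not measurements of any platform.

```python
# Rough, assumed per-stage latencies (seconds) for a low-latency streaming chain.
LATENCY_BUDGET_S = {
    "capture_and_encode (HEVC ULL)": 0.30,
    "contribution_to_cloud": 0.15,
    "packaging (LL-HLS parts)": 0.50,
    "cdn_edge_propagation": 0.20,
    "player_buffer": 1.00,
}

def total_latency(budget: dict[str, float]) -> float:
    """Sum the per-stage delays into a glass-to-glass estimate."""
    return sum(budget.values())

if __name__ == "__main__":
    for stage, seconds in LATENCY_BUDGET_S.items():
        print(f"{stage:<32} {seconds:5.2f} s")
    print(f"{'total glass-to-glass':<32} {total_latency(LATENCY_BUDGET_S):5.2f} s")
```

The value of laying the budget out this way is that each stage becomes a tuning target: shrinking the player buffer or the packaging segment size is what moves a stream from several seconds of delay towards the near-broadcast figures discussed above.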