>> Solutions must prioritise ultra-low latency with highly efficient codecs such as HEVC <<
responsibility to ensure that sensitive content is distributed safely and efficiently. Among the most prevalent bottlenecks in real-time streaming are bandwidth limitations. For a content provider, insufficient bandwidth can lead to dropped packets and increased latency, which is particularly problematic during high-resolution video streams. Another important consideration in real-time streaming is content delivery network (CDN) capacity. An overloaded or poorly configured CDN increases latency and degrades delivery efficiency, with the impact varying by server location. Bitmovin Player allows content providers to stream closer to real time with low-latency DASH and HLS playback, offering a potential solution to these bottlenecks in real-time streaming. EVAN STATTON: End-to-end streaming latency is a combination of encoding, transport, network and buffering. To achieve real-time streaming, all of these elements require tuning. Traditional codecs like H.264 introduce latency through multi-pass processing, while HTTP-based protocols (e.g. HLS, DASH) add buffering on both the sender and receiver. UDP-based protocols such as Zixi, SRT and RIST reduce transport latency
through adaptive error recovery. Device and player buffers introduce delay unless carefully tuned. To manage these challenges, industries adopt tailored solutions. For example, sports and news customers may use the lowest-latency protocols possible, such as JPEG XS with Zixi, SRT or RIST for contribution and low-latency HLS with chunked transfers for delivery. Meanwhile, gaming and interactive platforms rely on WebRTC or custom software to enable bidirectional communication, which offers even lower latency but does not scale as effectively. Effective real-time streaming requires end-to-end coordination across the entire workflow.
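The point above, that end-to-end latency is the sum of encoding, transport, network and buffering delays, can be sketched as a simple latency budget. The figures below are illustrative assumptions for the sake of the example, not measured values.

```python
# A minimal latency-budget sketch: end-to-end streaming latency as the
# sum of per-stage delays. All numbers are illustrative assumptions.

# Hypothetical per-stage contributions in milliseconds.
STAGES_MS = {
    "encoding": 500,       # single-pass, low-latency encoder settings
    "transport": 150,      # e.g. an SRT-style retransmission buffer
    "network": 80,         # propagation and queuing delay
    "player_buffer": 2000, # receiver-side buffering
}

def total_latency_ms(stages):
    """End-to-end latency is the sum of the stage delays."""
    return sum(stages.values())

def dominant_stage(stages):
    """The stage worth tuning first is the largest contributor."""
    return max(stages, key=stages.get)

print(total_latency_ms(STAGES_MS))  # 2730
print(dominant_stage(STAGES_MS))    # player_buffer
```

With these assumed numbers the player buffer dominates, which matches the article's observation that device and player buffers introduce delay unless carefully tuned.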
>> Low-latency protocols come with unique advantages and trade-offs, all depending on the use case <<
Choosing the right combination depends on latency tolerance, audience scale, interactivity and infrastructure compatibility. EVAN STATTON: Each low-latency protocol serves a different need within the streaming landscape. LL-HLS enables delivery over HTTP with latencies as low as two or three seconds, making it ideal for large audiences with passive viewing requirements, such as live sports or events. It scales well with CDNs and adaptive-bitrate workflows, but doesn’t support true sub-second interactivity. WebRTC, by contrast, delivers real-time performance with sub-500 ms latency and supports bidirectional communication, making it ideal for applications like cloud gaming, conferencing and live betting. However, it’s complex to scale because it is non-cacheable and therefore requires specialised CDNs to distribute effectively. Zixi, SRT and RIST offer secure, low-latency streaming over the public internet, but this class of protocol
isn’t typically used for last-mile delivery to consumers. Many organisations opt for hybrid workflows, combining these protocols to align with specific delivery goals: using SRT for contribution, LL-HLS for mass OTT distribution and WebRTC for ultra-low-latency interaction.
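The hybrid-workflow decision described above can be sketched as a small selector. The function name and thresholds are hypothetical; the mapping mirrors the article's example of SRT for contribution, LL-HLS for mass OTT distribution and WebRTC for ultra-low-latency interaction.

```python
# A hedged sketch of the hybrid protocol choice described in the text.
# Names and parameters are illustrative, not a real API.

def pick_protocol(stage, needs_interactivity=False):
    """Map a workflow stage to a protocol family.

    stage: "contribution" (source to origin) or "delivery" (last mile).
    needs_interactivity: True for sub-second, bidirectional use cases
    such as cloud gaming or live betting.
    """
    if stage == "contribution":
        return "SRT"      # secure, low-latency internet contribution
    if needs_interactivity:
        return "WebRTC"   # sub-500 ms, bidirectional, harder to scale
    return "LL-HLS"       # CDN-friendly, roughly 2-3 s latency

print(pick_protocol("contribution"))                          # SRT
print(pick_protocol("delivery", needs_interactivity=True))    # WebRTC
print(pick_protocol("delivery"))                              # LL-HLS
```

In practice the decision also weighs audience scale and infrastructure compatibility, as the article notes, so a real workflow would combine these rather than pick a single winner.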