Going the Distance: RTMP vs. SRT
In our latest white paper, RTMP vs. SRT: Comparing Latency and Maximum Bandwidth, our video experts set out to answer a frequently asked question: when it comes to streaming live video over the internet, how do RTMP and SRT compare? If you’re not sure which transport protocol you should be using for streaming live video with low end-to-end latency over the internet, read on to discover more about how these commonly used protocols stack up against each other.
If you want to skip this post and get straight to the details, go ahead and download the white paper!
Before we begin with details of the testing, here’s a very high-level primer on both protocols. If you want to take a deeper dive into their differences, our CMO, Peter Maag, recently wrote a great blog post about them here.
The Real-Time Messaging Protocol (RTMP) is a mature, well-established streaming protocol with a reputation for reliability thanks to its TCP-based packet retransmit capabilities and adjustable buffers. Although support for Flash technology will end in 2020, RTMP remains a commonly used protocol for live streaming video.
Secure Reliable Transport (SRT) is an open source video transport protocol and technology stack that uses an intelligent packet retransmit mechanism called ARQ (Automatic Repeat reQuest) on top of a UDP data flow, along with AES-128 and AES-256 encryption. A relative newcomer, SRT was open sourced in 2017 and has seen rapid growth in adoption and support.
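SRT's actual ARQ implementation is considerably more involved (it uses timers, a configurable latency window, and both NAKs and periodic ACKs), but the core idea of selective retransmission over an unreliable, UDP-like link can be sketched in a few lines of Python. Everything below, from the function name to the loss model, is illustrative and not SRT's real code:

```python
import random

def simulate_arq(seq_numbers, loss_rate=0.3, seed=42):
    """Toy NAK-style ARQ: the receiver spots gaps in the sequence
    numbers and the sender retransmits only the missing packets,
    instead of resending everything (or giving up, as plain UDP would)."""
    rng = random.Random(seed)
    received = {}
    # First pass: every packet crosses a lossy "link" exactly once.
    for seq in seq_numbers:
        if rng.random() >= loss_rate:
            received[seq] = f"payload-{seq}"
    # Retransmit rounds: the receiver NAKs the gaps until none remain.
    retransmit_rounds = 0
    while len(received) < len(seq_numbers):
        retransmit_rounds += 1
        for seq in seq_numbers:
            if seq not in received and rng.random() >= loss_rate:
                received[seq] = f"payload-{seq}"
    return received, retransmit_rounds

received, rounds = simulate_arq(list(range(10)))
print(sorted(received))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

The while loop guarantees a complete stream eventually arrives; in the real protocol that recovery work has to finish inside SRT's latency window, which is why the buffer and round-trip time matter so much in the tests below.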
Putting the Protocols Through Their Paces
With a simple, easy-to-replicate setup requiring no special equipment, the benchmark tests examined how both protocols performed over public networks, exploring how much buffer was required, what latency looked like, and whether there was a limit on the bandwidth used. The tests also aimed to answer the question of how far a video stream can travel across the world before it fails.
Comparing End-to-End Latency
The first test measured the impact of using RTMP or SRT on round-trip, end-to-end latency. This includes the encoding of the video signal; the time needed for the stream to travel to its target destination (in this case Australia, the US West Coast, the US East Coast and Central Europe) and return to its original location (Germany); the decoding of the video signal; and finally the display latency and buffering of the servers, software players and hardware decoders involved.
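Since these stages run in sequence, the measured round-trip figure is simply their sum, which is why distance (the network legs) dominates so quickly. As a back-of-the-envelope illustration, with all numbers below hypothetical rather than taken from the white paper:

```python
def end_to_end_latency_ms(encode, outbound, inbound, decode, display_buffer):
    """Round-trip latency is the sum of every sequential stage, in ms."""
    return encode + outbound + inbound + decode + display_buffer

# Hypothetical figures for a Germany -> US East Coast -> Germany loop;
# the white paper contains the real measurements.
total = end_to_end_latency_ms(encode=50, outbound=90, inbound=90,
                              decode=40, display_buffer=120)
print(total)  # 390
```

The encode, decode and display terms are roughly fixed per setup, so as the destination moves further away the two network terms grow and the protocol's buffering strategy determines how much padding gets added on top.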
As expected, the further the distance to the stream target destination, the bigger the impact on end-to-end latency. In these tests, compared to RTMP, SRT was more than twice as fast and, when tested using dedicated hardware encoding and decoding equipment, the difference was even more dramatic with SRT being 5 to 12 times faster than RTMP.
Testing for Maximum Bandwidth for Long Distance Streams
Measuring the impact of each protocol on latency is, of course, important, but what about their impact on video quality? An easy way to improve video and audio quality is to simply increase the bandwidth used for streaming, so the next step was to test the maximum bandwidth for long distance streams.
Thanks to the Microsoft Production Studios in Redmond, Washington, we were able to test high bandwidth streaming using a true 1 Gbps internet connection for streams ranging from 1 to 20 Mbps. RTMP worked well when both the sender and receiver were on the same continent but failed at long distances at bitrates above 2 Mbps. SRT, on the other hand, experienced no issues streaming up to 20 Mbps to any of the locations tested across the world.
And the Winner Is…
No prizes for guessing that SRT packs a powerful punch and consistently outperformed RTMP when tested in real-world conditions. To take a closer look at the test setup and the full results in detail, download the white paper: RTMP vs. SRT: Comparing Latency and Maximum Bandwidth.