SRT Protocol at NAB Show 2019: Breaking New Barriers in Broadcast

Two years after its open source debut, the SRT video streaming protocol seemed to be everywhere at NAB Show 2019. Two demonstrations in particular, from Red Bee Media and Cinegy, truly highlighted the potential of this open source protocol, and what it could mean for the future of video streaming.
Celebrating 2 Years of SRT

Two years ago, Haivision announced the open source availability of the low-latency streaming protocol SRT. Discover more about its humble beginnings, its recent Emmy Award win, and what’s in store for the future.
5 Reasons Why SRT Shines for Live Video Over IP Broadcast Workflows

Discover why the SRT open source protocol shines when it comes to addressing the challenges of live video over IP broadcast workflows.
SRT Alliance Membership Continues Rapid Growth: 39 New Members and Counting!

SRT Alliance membership is continuing its rapid growth with 39 new members, including broadcast industry giants Imagine Communications, Net Insight, Red Bee Media, and Telestream.
Using SRT to Live Stream Over the Internet and Other Networks

SRT has been recognized as a cutting-edge development for streaming video over the Internet. But did you know that you can also use the protocol to stream over other networks? Learn how the SRT protocol’s potential goes beyond what you thought you knew.
How Broadcast Heavyweights like Comcast and NBC Sports Leverage SRT for their Low Latency Delivery

Learn how broadcast experts from NBC Sports, Comcast, and Microsoft have implemented SRT in their low latency workflows, the changes they have seen, and how they are planning to leverage SRT moving into the future.
Why You Should Be Excited About the SRT Protocol Technical Overview

We recently launched the SRT Protocol Technical Overview. Learn why this guide is exciting news both for seasoned SRT users and for those in the video streaming world who have yet to take the plunge.
This Open Source Video Streaming Protocol is Completely Disrupting the Broadcast News Industry

Learn about SRT – the open-source protocol that enables Al Jazeera to outpace Twitter and allows Sky News to create new low-cost video contribution streams.
How the NFL, ESPN, and Microsoft are Using SRT to Save Time and Money

In April of 2017, at the National Association of Broadcasters (NAB) convention in Las Vegas, Haivision and Wowza announced the formation of the SRT Alliance. The alliance supports the SRT Open Source Project, which is dedicated to providing low-latency video transport over the public internet. Since that time, more than 100 members have joined the SRT Alliance (as of May 2018), making SRT the fastest-growing internet streaming protocol ever adopted by the open source community.

A year later, a panel was convened at NAB 2018 to discuss how SRT is being used, why it has become so popular, and the path forward for such a broad and diverse group of adopters. Moderated by Haivision’s CMO, Peter Maag, the panel looked at several use cases and discussed the future of video streaming, and how businesses can get higher quality, lower latency video streams with less infrastructure than ever before, and with less investment in person hours.

The NFL, ESPN, and Microsoft all use SRT, and they were more than happy to come to the panel discussion to talk about how they’re using it and how it has benefitted their video workflows. Let’s take a look at what they talked about, and see how SRT can help you save time and money.

The NFL uses SRT to get live feeds from London to their broadcast center

The first to speak at the panel was John Cave, Vice President of Information Technology for the NFL. John is responsible for things like making sure that instant replay is reliable and fast during football games. Because instant replay was centralized at the NFL’s New York headquarters in 2014, the replay team in New York needs to see what replay officials are looking at on the field in real time, so that everyone can communicate while looking at the same frame at the same moment. In the US, the NFL put in very expensive network connectivity, such as IP VPNs, to achieve this, and it worked for that scenario.
However, the NFL is expanding globally and now plays a pre-season game every year in London. And this year, the Kansas City Chiefs will play the Los Angeles Rams in Mexico City. For these games, installing networks of the type used in the United States is unfeasible from both a financial and an infrastructure point of view. While working to solve the problem of instant replay for these games, the NFL took a closer look at SRT and decided it could be a viable option.

Implementing SRT was simple for the NFL video crew. John said that they just took an SDI feed from a broadcast truck, put it into the SRT-enabled Makito X encoder from Haivision, and off they went. “SRT allowed us to send the content over the internet, reassemble it on the other end, and we only had to add four times the round trip latency.” From London to New York, the round-trip latency is 70ms over the wired internet. The recommended SRT latency setting is four times the round-trip time, so they ended up with 280ms between the live content and the feed they were seeing in New York, not even a second behind what they were watching in London.

The best part was that it took only minutes to implement, which allowed the entire crew to dedicate their resources to the important parts of the video feed, instead of having to build out a massive infrastructure system just to get up and running. In all, SRT is saving the NFL massive amounts of time and money, and giving them exactly what they need to get a very difficult job done.

ESPN is helping hundreds of schools broadcast nationally with no further investment in infrastructure

Next up, Glenn Scanlon, Senior Director of Transmission at ESPN, told the assembled crowd how ESPN is using SRT. Glenn manages a team responsible for the ingress of content from linear and non-linear platforms, whether a program is destined for air or for highlight purposes.
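The four-times-round-trip rule of thumb from the NFL example can be written out as a small helper. This is just a sketch of the arithmetic; the function name and the default multiplier argument are invented for illustration, and in practice the latency value is configured on the SRT encoder and decoder themselves:

```python
def recommended_srt_latency(rtt_ms, multiplier=4):
    """Rule of thumb: set the SRT latency to a multiple (commonly 4x)
    of the measured round-trip time, giving the protocol enough room
    to detect packet loss and retransmit within the latency window."""
    return rtt_ms * multiplier

# London -> New York example from the panel: 70 ms round trip
print(recommended_srt_latency(70))  # 280
```

With a 70ms round trip, the 280ms result matches the latency the NFL reported between the live feed in London and the replay center in New York.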
That same group also manages the distribution of content to ESPN’s affiliates and OTT platforms. The team began using SRT back in 2016, and they’re using it to distribute sports content from schools. ESPN has rights agreements with 14 different college conferences to distribute fully produced content. They have a “campus” in Bristol, CT, where the content is received into their gateway and then transmitted as transport streams over to their decoding farm.

In the time that they’ve been using SRT to distribute this content, they’ve produced almost 2,200 different events. This has allowed schools to broadcast content on a national platform that would normally not be on TV. By sending the content from the schools to the Bristol office with SRT, ESPN has not had to invest in further infrastructure and can now do something that would not be possible without the protocol. As Glenn said, “If we had to do this with satellite uplinks, or even a managed fiber solution, our E3 platform would be very different right now. We wouldn’t be able to do the volume that we do. Just 2,200 minutes alone would be somewhere between $8 million and $9 million in uplink costs. For this type of programming, it’s just not sustainable.”

For ESPN, SRT has made it possible to distribute content that normally wouldn’t be seen by a national audience, which gives the schools a wider platform and gives ESPN customers a much broader range of content.

Microsoft produces remote events from anywhere with SRT

Jeff Tyler, Media Experience Lead for the events and production studios at Microsoft, was excited to talk about what SRT has done for them. Microsoft has a production studio facility at their headquarters in Redmond, Washington, from which they produce what Jeff would describe as “a lot
Why We Created SRT and the Difference Between SRT and UDT

Editor’s Note: This post originally appeared on the GitHub Wiki for SRT. It has been slightly modified for formatting.

Some people have asked us why we’re using the UDT library within our SRT protocol. Actually, some people have claimed that SRT is just a slightly modified version of UDT, and that UDT is known to be useless for live video transmission. Guess what: the latter is true. UDT was designed for high-throughput file transmission over public networks. However, SRT is far from being a slightly modified version of UDT. I’ll get into the details, but will start with a little bit of history.

Haivision has always been known for lowest-latency video transmission across IP based networks, typically MPEG-TS unicast or multicast streams over the UDP protocol. This solution is perfect for protected networks, and if packet loss became a problem, enabling forward error correction (FEC) fixed it. At some point we were asked whether it would be possible to achieve the same latency between customer sites in different locations: between different cities, countries, or even continents. Of course it’s possible with satellite links or dedicated MPLS networks, but those are quite expensive solutions, so people wanted to use their public internet connectivity instead. While it’s possible to go with FEC in some cases, that’s not a reliable solution, as the amount of recoverable packet loss is limited unless you accept a significant amount of bandwidth overhead.

After evaluating the pros and cons of different third party solutions, we found that none satisfied all our requirements. The lack of insight into the underlying technology drove us to the decision to develop our own solution, which we could then deeply integrate into our products. That way, it would become the “glue” that enables us to transmit streams between all our different products, locally or across far distances, while maintaining our low latency proposition.
There were a few possible choices to consider:

- The TCP-based approach. Problem for live streaming: network congestion and too-slow packet loss recovery.
- The UDP-based approach. General problems: packet loss, jitter, packet re-ordering, delay.
- Reliable UDP. Adds framing and selective retransmit.

Having had a history with UDT for data transmission, I remembered its packet loss recovery abilities and just started playing with it. Though not designed for live streaming at all, it kind of worked when using really big buffers. I handed it over to one of our extremely talented networking guys in the embedded software team (thanks, Jean!) and asked him whether he’d be able to make it a low latency live streaming solution. I didn’t hear anything back for quite a while and had almost lost hope, when he contacted me to tell me that he had rewritten the whole packet retransmission functionality in order to react to packet loss immediately when it happens, and that he had added an encryption protocol which he had specified and implemented for other use cases before. Nice :-)

We started testing by sending low latency live streams back and forth between Germany and Montreal, and it worked! However, we didn’t get the latency down to the level we had hoped to achieve. The problem we faced turned out to be timing related (as it often is in media). What happened was this: the characteristics of the original stream on the source network were completely changed by the transmission over the public internet. The reasons are delay, jitter, packet loss, and its recovery on the dirty network. The signal on the receiver side had completely different characteristics, which led to problems with decoding, as the audio and video decoders didn’t get the packets at the expected times. This can be handled by buffering, but that’s not what you want in low latency setups. The solution was to come up with a mechanism that recreates the signal characteristics on the receiver side.
That way, we were able to dramatically reduce the buffering. This functionality is part of the SRT protocol itself, so once the data comes out of the SRT protocol on the receiver side, the stream characteristics have been properly recovered. The result is a happy decoder.

We publicly showed SRT (Secure Reliable Transport) for the first time at IBC 2013, where we were the only ones to show an HEVC-encoded live stream, camera to glass, from a hotel suite outside the exhibition directly onto the show floor, using the network provided by the RAI. Everybody who has been at a show like this knows how bad these networks can get. And the network was bad. So bad that we expected the whole demo to fall apart, having pulled the first trial version of SRT directly from the labs. The excitement was huge when we realized that the transmission still worked fine!

Since then, we have added SRT to all our products, enabling us to send high quality, low latency video from and to any endpoint, including our mobile applications. Of course there were improvements to be made, and the protocol matured along the way, until NAB 2017, when we announced that SRT is now open source.

You can learn more about SRT at the SRT Alliance website here. To view SRT on GitHub and start contributing to this open-source movement, click here!
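The two receive-side ideas described above, reacting to packet loss the moment it is detected and recreating the source timing at the receiver, can be illustrated with a toy model. This is a sketch of the concepts only: the class, its methods, and the simplified sequence handling are invented for illustration and are not the real libsrt API.

```python
import heapq

class SrtLikeReceiver:
    """Toy model of two SRT receive-side ideas (not the real libsrt API):

    1. Immediate loss reaction: record a retransmission request (NAK)
       as soon as a sequence-number gap is observed, instead of waiting
       for a periodic timer to notice the loss.
    2. Timing recovery: buffer packets and release each one at
       send_timestamp + fixed latency, so the stream leaves the
       receiver with the same pacing it had at the source.
    """

    def __init__(self, latency_ms=280):
        self.latency = latency_ms / 1000.0   # seconds
        self.expected_seq = 0
        self.buffer = []                     # min-heap of (release_time, seq, payload)
        self.nak_log = []                    # sequence numbers we asked to be resent

    def on_packet(self, seq, send_ts, payload):
        # 1. Immediate loss reaction: any gap triggers a NAK right away.
        if seq > self.expected_seq:
            for missing in range(self.expected_seq, seq):
                self.nak_log.append(missing)  # real SRT sends a NAK control packet
        self.expected_seq = max(self.expected_seq, seq + 1)
        # 2. Timing recovery: schedule delivery relative to the sender's clock.
        heapq.heappush(self.buffer, (send_ts + self.latency, seq, payload))

    def deliverable(self, now):
        """Pop every buffered packet whose scheduled release time has arrived."""
        out = []
        while self.buffer and self.buffer[0][0] <= now:
            out.append(heapq.heappop(self.buffer))
        return out
```

For example, feeding this receiver packets 0 and 2 immediately records a NAK for the missing packet 1, and nothing is handed to the decoder until each packet's send time plus the configured latency has elapsed, which is what smooths out the jitter the public internet introduced.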