WHITE PAPER

Synchronizing Video Sources Over the Internet for Live Remote Production

STREAM SYNC WITH THE MAKITO X SERIES FOR LIVE EVENT COVERAGE


“The beauty of the Haivision technology is that it turns right on, everything just lines up and locks in, video is being transported, and people are ready to go to work. It’s a testament to how battle tested it is.”

– Brad Cheney, VP of Field Operations and Engineering, Fox Sports

INTRODUCTION

Nothing engages viewers like live video. Whether it's for television or corporate communications, live video can make people feel as if they are truly taking part in an event, no matter where they are watching from. For producers with tight budget constraints, planning a live event involves tough choices between deploying remote production staff and paying the cost of transmitting video.

Traditionally, live production of remote events requires an on-site crew of camera operators, sound engineers, and a technical director. Adopting a remote production model can reduce production costs and logistical complexity by easing the burden of deploying expensive resources: the equipment required to capture, process, and produce at a remote venue, and the field crew needed to set up, operate, and manage it. Creating greater efficiencies allows broadcasters to produce more events and deploy their best resources more effectively.

Although costs can be significantly reduced by managing live production workflows from a main master control room (MCR), an approach sometimes referred to as remote integration or REMI, the additional bandwidth typically required to transmit multiple contribution video feeds over satellite or a dedicated network can negate the savings of a centralized live production facility. In this white paper, we explore how broadcasters can leverage the latest video streaming technologies to satisfy the demands of remote production workflows without the traditional costs and logistical complexities.

“As production people, we use whatever solutions are best of breed, that’s what our clients expect. The Haivision Makito X Series of video encoders are just that; the UI is simple and easy to use, they offer extremely low latency, and most importantly, they are rock solid and reliable. These products are mainstays in our studio.”

– Corey Behnke, Co-founder and Producer, Live X

THE CHALLENGE OF SYNCHRONIZING MULTIPLE CAMERA STREAMS OVER IP

When broadcasting live events, using multiple cameras allows for a more engaging and dynamic viewer experience. At remote locations such as sports stadiums or concert venues, a producer needs to be able to seamlessly switch between live video feeds depending on which angle is most suitable at a given time. Typically, a single audio stream is used, as sudden changes in audio are very noticeable and can be distracting. If the video feeds are not synchronized, switching between cameras can result in issues such as input lag and lip-sync delay.
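The principle that keeps a multi-camera cut clean is a shared release deadline: if every camera stamps its frames against a common reference clock and every receiver holds each frame until the same fixed target latency has elapsed, all outputs line up no matter how much network delay varies per feed. The following is a minimal Python sketch of that idea only, not the Makito X implementation; the Frame and FixedLatencyBuffer names are hypothetical, and it assumes the senders share a synchronized clock (for example via NTP or PTP).

```python
from dataclasses import dataclass

@dataclass
class Frame:
    pts: float   # presentation timestamp in seconds, on the senders' shared clock
    data: bytes

class FixedLatencyBuffer:
    """Holds frames from one feed and releases each at pts + target_latency.

    Because every feed maps the same pts to the same release deadline,
    the decoder outputs stay aligned even when per-feed network delay
    fluctuates (as long as delay stays under the latency budget).
    """

    def __init__(self, target_latency: float):
        self.target_latency = target_latency  # shared end-to-end delay budget
        self.pending: list[tuple[float, Frame]] = []

    def ingest(self, frame: Frame) -> None:
        # Schedule the frame for its shared release deadline.
        self.pending.append((frame.pts + self.target_latency, frame))

    def pop_due(self, now: float) -> list[Frame]:
        # Release every frame whose deadline has passed, keep the rest.
        due = [f for deadline, f in self.pending if deadline <= now]
        self.pending = [(d, f) for d, f in self.pending if d > now]
        return due
```

The design choice worth noting is that the latency budget is fixed and shared, rather than minimized per feed: a feed arriving early simply waits longer in its buffer, which is what makes instant, glitch-free switching between sources possible.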

At the live production facility, decoders receiving the live feeds need to be kept in sync so that a producer can immediately include any of the sources within their live playout workflow. One way to mitigate multi-camera and audio sync issues is to multiplex camera feeds over a satellite uplink, although this can be a costly solution. Another option is a dedicated private network, which provides a stable level of latency and therefore the ability to manually sync video and audio feeds, although this is not always possible from remote locations. Streaming over the internet is a more cost-effective and flexible approach; however, bandwidth availability is difficult to predict and can change at any given moment. Being able to synchronize remote contribution streams over the internet resolves the dilemma between managing costs and ensuring broadcast quality.

Keeping live video and audio in sync while streaming over IP networks can be a considerable challenge, especially over an unpredictable network like the internet, where round-trip times and bandwidth availability continually fluctuate. To ensure that all video and audio streams are in sync with each other, broadcast and network engineers need to spend considerable time manually adjusting the timing of each video decoder output. Typically, this is done using a test pattern device to calibrate audio channels with live video sources. This approach requires coordination between people at the remote location and at the MCR, and it can be very time-consuming. The more cameras and audio channels involved, the more complicated it becomes to synchronize everything, and the more time is needed before going on air. With the right tools this approach can be made to work, but there is a simpler and faster way.
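To make the manual calibration step concrete, here is a short Python sketch of the arithmetic an engineer is effectively performing with a test pattern: measure the end-to-end latency of each feed at the MCR, then delay every decoder output to match the slowest path. The feed names and latency figures are hypothetical, used only for illustration.

```python
def decoder_delay_offsets(measured_latency_ms: dict[str, int]) -> dict[str, int]:
    """Given measured end-to-end latency per feed (e.g. read off a test
    pattern or burned-in clock), return the extra delay each decoder
    output needs so that all feeds align with the slowest one."""
    slowest = max(measured_latency_ms.values())
    return {feed: slowest - latency for feed, latency in measured_latency_ms.items()}

# Hypothetical example: three camera feeds and one audio channel.
measured = {"cam1": 480, "cam2": 455, "cam3": 510, "audio": 430}
print(decoder_delay_offsets(measured))
# {'cam1': 30, 'cam2': 55, 'cam3': 0, 'audio': 80}
```

The catch is that these offsets are only valid for as long as network conditions hold: every time round-trip times drift, the measurement and adjustment must be repeated, which is precisely why automating synchronization at the stream level is so attractive.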
