
Jitter

In video applications, jitter is the variation in delay between packets arriving at a receiver over a network. When jitter is severe, packets arrive too late to be played out and are effectively lost. Jitter arises from a number of causes, including network congestion, route changes, electromagnetic interference, and crosstalk with carriers of other signals.
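One widely used way to quantify this kind of network jitter is the interarrival jitter estimator defined in RFC 3550 (the RTP specification, section 6.4.1): the difference in relative transit time between consecutive packets is smoothed with a gain of 1/16. The sketch below is illustrative and not from the original article; the function name and sample timestamps are invented for the example.

```python
def rtp_interarrival_jitter(send_times, recv_times):
    """Running interarrival jitter estimate per RFC 3550, section 6.4.1.

    D(i-1, i) is the change in relative transit time between consecutive
    packets; the jitter estimate is a smoothed mean of |D| with gain 1/16.
    """
    jitter = 0.0
    for i in range(1, len(send_times)):
        transit_prev = recv_times[i - 1] - send_times[i - 1]
        transit_curr = recv_times[i] - send_times[i]
        d = abs(transit_curr - transit_prev)
        jitter += (d - jitter) / 16.0  # RFC 3550 smoothing gain
    return jitter

# Hypothetical timestamps (ms): packets sent every 20 ms, received with
# slightly varying transit times of 5, 7, 5, and 6 ms.
j = rtp_interarrival_jitter([0.0, 20.0, 40.0, 60.0],
                            [5.0, 27.0, 45.0, 66.0])
```

A constant transit time would yield a jitter estimate of zero; any variation pushes the smoothed estimate above zero.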

More generally, jitter is defined as the deviation from the true periodicity of a presumably periodic signal. It is a significant, and typically undesirable, factor in the design of nearly all communications links. In the context of clock recovery, it is often referred to as timing jitter.

In video and digital imaging, jitter occurs when synchronization signals are corrupted or electromagnetic interference is introduced during transmission. Certain video streaming protocols, however, can effectively mitigate its effects by buffering packets, smoothing out variations in their arrival timing, and feeding them to the application at a regular rate.
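The smoothing described above is typically done with a jitter buffer: packets are held briefly and released on a fixed schedule, so arrival-time variation stays hidden from the decoder. The following is a simplified sketch of the idea, not any particular protocol's implementation; the function name and timings are illustrative assumptions.

```python
def playout_times(arrival_times, packet_interval, buffer_delay):
    """Schedule each packet for playout on a regular grid offset from the
    first arrival by buffer_delay, hiding arrival-time jitter.

    Returns a list of (playout_time, on_time) pairs; a packet arriving
    after its slot would be late and effectively lost to playout.
    """
    base = arrival_times[0] + buffer_delay
    schedule = []
    for i, arrival in enumerate(arrival_times):
        slot = base + i * packet_interval
        schedule.append((slot, arrival <= slot))
    return schedule

# Hypothetical 20 ms packets arriving with jitter; a 10 ms buffer is
# enough to absorb the variation in this example.
sched = playout_times([0.0, 22.0, 39.0, 61.0], 20.0, 10.0)
```

Sizing the buffer is a trade-off: a deeper buffer absorbs more jitter but adds latency, which is why low-latency protocols keep it as small as the network allows.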


