
Jitter

When referencing jitter in video applications, jitter is the variation in the arrival timing of data packets transmitted between network devices. Severe jitter can cause packets to arrive out of order or be discarded, resulting in lost data. Jitter happens due to a number of causes, including network congestion, electromagnetic interference, and crosstalk with carriers of other signals.
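As a concrete illustration, RTP (RFC 3550) has receivers maintain a running interarrival jitter estimate: the smoothed absolute change in packet transit time, updated with a gain of 1/16 on each arriving packet. Below is a minimal sketch of that estimator in Python; the function name, variable names, and sample values are illustrative, not taken from any particular library.

```python
def update_jitter(jitter, prev_transit, transit):
    """One step of the RTP interarrival jitter estimator (RFC 3550, sec. 6.4.1).

    transit is (arrival time - send timestamp) for a packet, in consistent
    units. The estimate moves 1/16 of the way toward each new sample, which
    smooths noise while still tracking changes in network behavior.
    """
    d = abs(transit - prev_transit)       # |D(i-1, i)|: change in transit time
    return jitter + (d - jitter) / 16.0   # J(i) = J(i-1) + (|D| - J(i-1)) / 16


# Example: transit times for five packets, in milliseconds.
transits = [40.0, 42.5, 39.0, 41.0, 55.0]
jitter = 0.0
for prev, cur in zip(transits, transits[1:]):
    jitter = update_jitter(jitter, prev, cur)
print(f"estimated interarrival jitter: {jitter:.2f} ms")
```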

More generally, jitter is defined as the deviation from the true periodicity of a presumably periodic signal. It is a significant, and typically undesirable, factor in the design of nearly all communications links. In the context of clock recovery, jitter is frequently known as timing jitter.
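To make that definition concrete, timing jitter can be written (with notation assumed here, not taken from this glossary) as the deviation of each observed signal edge from its ideal position on a grid with nominal period T:

```latex
% t_n: measured time of the n-th signal edge; T: nominal period.
% Absolute (time-interval-error) jitter of the n-th edge:
\Delta t_n = t_n - nT
% Period jitter: deviation of each measured period from the nominal one:
J_n = (t_{n+1} - t_n) - T
```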

When it comes to video or digital images, jitter takes place when synchronization signals are corrupted or electromagnetic interference is introduced during video transmission. Certain video streaming protocols, however, can effectively mitigate these effects by buffering incoming packets, smoothing out the variations in their timing, and feeding them to a given application at a more regular rate, as the sketch below illustrates.
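The standard mechanism for this smoothing is a jitter buffer: the receiver holds each packet briefly, reorders packets by sequence number, and a playout loop polls the buffer on a fixed timer tick so the application sees a steady stream. Here is a minimal sketch in Python, assuming a fixed playout delay and hypothetical class and method names:

```python
import heapq
import time


class JitterBuffer:
    """Hold incoming packets briefly and release them in sequence order.

    A playout loop that calls pop_ready() at a fixed interval then feeds
    packets to the application at a regular rate, even if they arrived
    unevenly or out of order.
    """

    def __init__(self, playout_delay=0.1):
        self.playout_delay = playout_delay  # seconds to hold each packet
        self.heap = []                      # min-heap keyed by sequence number

    def push(self, seq, payload):
        # Packets may arrive out of order; the heap re-sorts them by sequence.
        heapq.heappush(self.heap, (seq, time.monotonic(), payload))

    def pop_ready(self):
        # Release the lowest-sequence packet once it has aged past the
        # playout delay; return None if nothing is ready yet.
        if self.heap:
            seq, arrived, payload = self.heap[0]
            if time.monotonic() - arrived >= self.playout_delay:
                heapq.heappop(self.heap)
                return seq, payload
        return None
```

In a real protocol stack the playout delay adapts to the measured jitter (a larger buffer tolerates more variation at the cost of added latency); a fixed delay is used here only to keep the sketch short.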

