QoS - Jitter
Jitter is the deviation from true periodicity. A sequence of events is truly periodic when the time between consecutive events remains exactly the same; jitter measures how far the observed timing strays from that ideal.
Translating that to the arrival of packets: if you receive 10 packets at intervals of exactly 29 milliseconds every time, then you have a jitter of zero, because there has been no deviation from true periodicity.
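The zero-jitter case above can be sketched in a few lines. This is a minimal illustration, not a standard formula; the function name and the choice of "mean absolute deviation of inter-arrival intervals" are assumptions made for the example:

```python
def jitter_ms(arrival_times_ms):
    """Hypothetical jitter metric: mean absolute deviation of the
    inter-arrival intervals from their average interval."""
    # Inter-arrival intervals between consecutive packets
    intervals = [b - a for a, b in zip(arrival_times_ms, arrival_times_ms[1:])]
    mean = sum(intervals) / len(intervals)
    # Average deviation of each interval from the mean interval
    return sum(abs(i - mean) for i in intervals) / len(intervals)

# 10 packets arriving exactly 29 ms apart: zero deviation, zero jitter
arrivals = [i * 29 for i in range(10)]
print(jitter_ms(arrivals))  # → 0.0
```

With perfectly even spacing every interval equals the mean, so the deviation (and therefore the jitter) is zero.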
Now if you receive 10 packets at varying time intervals, there is a deviation, and therefore there is jitter. The harder question is how to measure it. Examined academically, there are various ways to do so. On networks, jitter is typically measured from packet arrival times, which are often recorded using utilities such as Ping and Traceroute.
Even so, when calculating jitter, several questions must be settled first. Are you measuring the jitter of the round-trip ping or of the arrival of packets at the destination? Are you measuring jitter over a period of time, or over a certain number of packets? Are you comparing each packet's arrival interval only to the previous and next intervals, or to all of the arrival intervals? And what value of true periodicity are you comparing arrival times against: the average interval or some absolute reference? These questions must be answered before you can measure jitter manually.
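To see why these choices matter, here is a sketch contrasting two of the options above: comparing each interval only to its neighbour versus comparing every interval to the average. The function names and the sample intervals are hypothetical, chosen purely for illustration:

```python
def packet_to_packet_jitter(intervals_ms):
    """Average change between consecutive inter-arrival intervals."""
    diffs = [abs(b - a) for a, b in zip(intervals_ms, intervals_ms[1:])]
    return sum(diffs) / len(diffs)

def mean_based_jitter(intervals_ms):
    """Average deviation of every interval from the mean interval."""
    mean = sum(intervals_ms) / len(intervals_ms)
    return sum(abs(i - mean) for i in intervals_ms) / len(intervals_ms)

# Same five inter-arrival intervals, two different answers
intervals = [29, 31, 28, 33, 29]
print(packet_to_packet_jitter(intervals))  # → 3.5
print(mean_based_jitter(intervals))        # → 1.6
```

The same arrival data yields different jitter figures depending on the definition chosen, which is exactly why the questions above need answering before any manual measurement.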
Ultimately, you should rarely need to calculate jitter manually. Use network monitoring tools to get a view of the real jitter on a network and determine whether it is causing problems that need fixing. You can also use features such as IP SLAs to measure jitter across particular links.
Jitter degrades services that are sensitive to variations in packet arrival times, such as VoIP and video communications.