...

(Timing) jitter is defined as “the short-term variations of the significant instants of a timing signal from their ideal positions in time (where short-term implies that these variations are of frequency greater than or equal to 10 Hz)”, while wander is defined as “the long-term variations of the significant instants of a digital signal from their ideal positions in time (where long-term implies that these variations are of frequency less than 10 Hz)”.

The 10 Hz boundary is by convention defined in telecoms as the separation between jitter and wander. Noise at higher frequencies generally has different causes than noise at lower frequencies, but the separation between jitter and wander has mainly to do with the effects of the noise, which can differ depending on the frequency components.
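As an illustration of this 10 Hz convention, the sketch below (plain NumPy; the signal parameters and the function name are invented for the example) splits a sampled phase-error signal into wander and jitter with an ideal brick-wall filter at 10 Hz:

```python
import numpy as np

def split_jitter_wander(phase_error, fs, cutoff_hz=10.0):
    """Split a sampled phase-error signal into wander (< cutoff)
    and jitter (>= cutoff) using an ideal FFT brick-wall filter."""
    spectrum = np.fft.rfft(phase_error)
    freqs = np.fft.rfftfreq(len(phase_error), d=1.0 / fs)
    low = spectrum.copy()
    low[freqs >= cutoff_hz] = 0.0            # keep only sub-10 Hz content
    wander = np.fft.irfft(low, n=len(phase_error))
    jitter = phase_error - wander            # residual high-frequency part
    return jitter, wander

# Synthetic phase error: a 1 Hz (wander) plus a 100 Hz (jitter) component
fs = 1000.0                                  # sample rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
slow = 50e-9 * np.sin(2 * np.pi * 1.0 * t)   # 50 ns amplitude at 1 Hz
fast = 5e-9 * np.sin(2 * np.pi * 100.0 * t)  # 5 ns amplitude at 100 Hz
jitter, wander = split_jitter_wander(slow + fast, fs)
```

In practice the measurement filters defined in the standards are not ideal brick-wall filters, but the 10 Hz split of the phase-error spectrum is the same idea.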

Wander accumulates in the network and may result in buffer underflows/overflows and/or the inability to lock to the incoming timing reference. High levels of wander may also result in excessive time error in the nodes in the case of time sync applications.

Excessive levels of jitter can create bit errors (and related packet drops), or reduce the margin against other sources of errors.

...

With respect to jitter effects, IEEE 802.3 specifies objectives for the various interfaces in terms of BER (Bit Error Ratio). These objectives are never better than 10⁻¹². As an example, for a 1 Gbps link 10⁻¹² corresponds to about 1 errored bit over about 15 minutes, or over about 1.5 minutes in the case of a 10 Gbps link.
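The arithmetic behind these figures can be checked with a small helper (illustrative only; the function name is ours):

```python
def seconds_per_error(ber, line_rate_bps):
    """Mean time between errored bits for a given BER and line rate."""
    return 1.0 / (ber * line_rate_bps)

# BER 10^-12 on a 1 Gbps link: one errored bit every ~1000 s,
# i.e. on the order of 15 minutes
t_1g = seconds_per_error(1e-12, 1e9)

# Same BER on a 10 Gbps link: one errored bit every ~100 s (~1.5 min)
t_10g = seconds_per_error(1e-12, 10e9)
```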

When performing jitter measurements over periods longer than 60 seconds (e.g. several hours), isolated spikes may be recorded during the measurement. These may be due to the System Under Test or to the measurement setup (e.g. environmental conditions).

As long as these remain isolated spikes, e.g. impacting at most 1 bit over 2.5 hours (i.e. corresponding to 10⁻¹³ at 1 Gbps), they may be ignored and can be considered outside the scope and intention of the jitter measurement. In fact, 1 bit in 2.5 hours would meet the typical performance objectives with good margin.
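The converse calculation, turning an observed spike count over a measurement window into an equivalent BER, can be sketched as follows (function name is ours, for illustration):

```python
def equivalent_ber(errored_bits, window_s, line_rate_bps):
    """BER equivalent to a count of isolated errored bits
    observed over a measurement window."""
    return errored_bits / (window_s * line_rate_bps)

# 1 errored bit over 2.5 hours on a 1 Gbps link:
# 1 / (9000 s * 1e9 bit/s) ~ 1.1e-13, i.e. roughly 10^-13
ber_spike = equivalent_ber(1, 2.5 * 3600, 1e9)
```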