
As described in G.810, the phase variation of a reference timing signal is commonly separated into three components:

“jitter, wander and effects of frequency offsets and drifts. In addition, phase discontinuities due to transient disturbances such as network rerouting, automatic protection switching, etc., may also be a source of phase variation.”

(Timing) jitter is defined as the “short-term variations of the significant instants of a timing signal from their ideal positions in time (where short-term implies that these variations are of frequency greater than or equal to 10 Hz)”, while wander is defined as “the long-term variations of the significant instants of a digital signal from their ideal position in time (where long-term implies that these variations are of frequency less than 10 Hz)”.

The 10 Hz boundary between jitter and wander is a convention in telecom. Noise at higher frequencies generally has different causes than noise at lower frequencies, but the separation between jitter and wander is made mainly because the effects of the noise differ depending on its frequency components.
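To make the 10 Hz split concrete, the sketch below separates a sampled phase-variation signal into wander and jitter components. It is purely illustrative: the sample rate, the synthetic signal, and the use of a first-order Butterworth low-pass filter are assumptions of the example, not requirements of G.810.

```python
# Illustrative sketch (not from G.810): splitting sampled phase variation
# into wander (< 10 Hz) and jitter (>= 10 Hz) components.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                      # sample rate of the phase samples, Hz (assumed)
t = np.arange(0, 10, 1 / fs)     # 10 s of data

# Synthetic phase variation: 0.5 Hz wander plus 50 Hz jitter, in nanoseconds
phase_ns = 40 * np.sin(2 * np.pi * 0.5 * t) + 2 * np.sin(2 * np.pi * 50 * t)

# 10 Hz is the conventional boundary between wander and jitter
b, a = butter(1, 10.0, btype="low", fs=fs)
wander_ns = filtfilt(b, a, phase_ns)   # components below 10 Hz
jitter_ns = phase_ns - wander_ns       # components at or above 10 Hz
```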

Wander accumulates in the network and may result in buffer underflows/overflows and/or the inability to lock to the incoming timing reference. High levels of wander may also result in excessive time error in the nodes in the case of time synchronization applications.

Excessive levels of jitter can create bit errors (and related packet drops), or reduce the margin against other sources of errors.

[Figure: a timing signal exhibiting both short-term jitter and long-term wander]

As per G.810, “The intent of both network and equipment jitter specifications is to ensure that jitter has no impact on the error performance of the network. The intent of jitter specifications is to provide sufficient information to enable equipment designers to accommodate the expected levels of phase variation without incurring unacceptable degradations.”

With respect to jitter, given the type of noise under consideration and the negative impacts that must be avoided, it can be measured over short periods. In particular, per the relevant ITU-T recommendations (e.g. G.8262.1), it should be measured as the peak-to-peak phase variation over a 60-second interval.
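A minimal sketch of such a measurement is shown below, assuming the jitter component has already been extracted (e.g. by filtering as in the previous sketch); it simply takes the max-min spread of the phase samples in consecutive 60-second windows. The function name, sample rate, and synthetic data are hypothetical.

```python
import numpy as np

def peak_to_peak_jitter(phase_samples, fs, interval_s=60.0):
    """Peak-to-peak phase variation in each consecutive measurement window."""
    n = int(interval_s * fs)
    return [phase_samples[i:i + n].max() - phase_samples[i:i + n].min()
            for i in range(0, len(phase_samples) - n + 1, n)]

# Example with 5 minutes of synthetic jitter samples (50 Hz, 2 ns amplitude):
fs = 1000.0
jitter_ns = 2 * np.sin(2 * np.pi * 50 * np.arange(0, 300, 1 / fs))
print(peak_to_peak_jitter(jitter_ns, fs))   # five ~4 ns peak-to-peak values
```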

With respect to jitter effects, IEEE 802.3 specifies objectives for the various interfaces in terms of BER (Bit Error Ratio). These objectives are never better than 10⁻¹². As an example, on a 1 Gbps link a BER of 10⁻¹² corresponds to about 1 errored bit every 1000 seconds (roughly 17 minutes), or every 100 seconds (under 2 minutes) on 10 Gbps links.
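These figures follow from simple arithmetic: at a given BER, the mean time between single bit errors is 1 / (BER × line rate), as the quick check below shows.

```python
# Mean time between single bit errors at the IEEE 802.3 objective BER of 1e-12
BER = 1e-12
for rate_bps in (1e9, 10e9):     # 1 Gbps and 10 Gbps line rates
    seconds = 1 / (BER * rate_bps)
    print(f"{rate_bps / 1e9:.0f} Gbps: one errored bit every ~{seconds:.0f} s "
          f"(~{seconds / 60:.1f} min)")
# 1 Gbps -> 1000 s (~16.7 min); 10 Gbps -> 100 s (~1.7 min)
```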

When performing jitter measurements over periods longer than 60 seconds (e.g. several hours), isolated spikes may be recorded during the test. These may be due to the System Under Test or to the measurement setup (e.g. environmental conditions).

As long as these remain isolated spikes, e.g. impacting at most 1 bit over 2.5 hours (corresponding to a BER of about 10⁻¹³ at 1 Gbps), they may be ignored and considered outside the scope and intent of the jitter measurement. In fact, 1 bit in 2.5 hours would meet the typical performance objectives with good margin.
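The ~10⁻¹³ figure can be checked the same way:

```python
# Equivalent BER of one isolated errored bit over 2.5 hours at 1 Gbps
bits_sent = 1e9 * 2.5 * 3600          # line rate (bit/s) * observation time (s)
print(f"BER ~ {1 / bits_sent:.1e}")   # ~1.1e-13, i.e. about 10^-13
```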
