What is the difference between Delay and Jitter?

Q.  What is the difference between Delay and Jitter?
- Published on 19 Oct 15

a. Delay takes less time for buffering while jitter takes more time for buffering.
b. Delay is defined as the end-to-end time required for a signal to travel from transmitter to receiver, and jitter is defined as the variation in delay among packets belonging to the same flow.
c. Time delay is a variable in delay but not in jitter.
d. All of the above

ANSWER: Delay is defined as the end-to-end time required for a signal to travel from transmitter to receiver, and jitter is defined as the variation in delay among packets belonging to the same flow.
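
A minimal sketch (assumed, illustrative timestamps, not part of the original question) showing how the two quantities are computed for packets of one flow: delay is measured per packet, while jitter is the variation between consecutive delays.

```python
# Hypothetical send/receive timestamps (seconds) for four packets of one flow.
send_times = [0.000, 0.020, 0.040, 0.060]   # transmitter side
recv_times = [0.050, 0.072, 0.088, 0.115]   # receiver side

# Delay: end-to-end time for each packet from transmitter to receiver.
delays = [round(r - s, 3) for s, r in zip(send_times, recv_times)]
print("per-packet delays (s):", delays)     # [0.05, 0.052, 0.048, 0.055]

# Jitter: variation of delay between consecutive packets of the same flow.
jitter = [round(abs(delays[i] - delays[i - 1]), 3) for i in range(1, len(delays))]
print("delay variation (s): ", jitter)      # [0.002, 0.004, 0.007]
```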
 

    Discussion

  • Prajakta Pandit   -Posted on 16 Oct 15

    Delay:

      - Delay is the amount of time it takes a bit to be transmitted from source to destination.

      - Delays are caused by distance, errors, congestion and other factors.

      - Distance-related delays, called propagation delays, are especially critical when transmitting data to other countries.

      - Delay is also significant in satellite transmission (see the propagation-delay sketch after this list).

      - Many TCP performance problems trace back to delay.

      - To mitigate delay, QoS (Quality of Service) mechanisms such as Differentiated Services are commonly used.
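
As a small worked example (assumed figures: a geostationary satellite at roughly 35,786 km altitude, signals travelling at about the speed of light), the propagation delay on a satellite hop alone is substantial:

```python
# Propagation delay for a GEO satellite hop (illustrative, assumed figures).
DISTANCE_M = 35_786_000   # approximate altitude of a geostationary satellite, metres
SPEED_M_S = 3.0e8         # approximate propagation speed of radio waves, m/s

one_way = DISTANCE_M / SPEED_M_S   # ground -> satellite
print(f"ground to satellite:           {one_way * 1000:.0f} ms")      # ~119 ms
print(f"ground -> satellite -> ground: {2 * one_way * 1000:.0f} ms")  # ~239 ms
```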


    Jitter:

      - Jitter is delay that varies over time.

      - In a telecommunication network, jitter is the deviation from the true periodicity of a presumably periodic signal.

      - The jitter period is the interval between two successive maxima (or minima) of a signal that varies with time.

      - Jitter may be caused by electromagnetic interference (EMI) and crosstalk with other signals.

      - Jitter can degrade the performance of processors in personal computers, cause undesired effects in audio signals, and lead to loss of data transmitted between network devices; a sketch of a standard jitter estimator follows this list.
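
For a concrete sense of how jitter is estimated in practice, below is a sketch of the interarrival jitter estimator defined for RTP in RFC 3550; the timestamps are hypothetical, and the 1/16 gain is the smoothing factor specified there.

```python
# Interarrival jitter estimator in the style of RTP (RFC 3550): the running
# estimate J is nudged toward the latest transit-time difference |D| by 1/16.

def update_jitter(j_prev, send_prev, recv_prev, send_cur, recv_cur):
    # D: difference in relative transit time between two consecutive packets.
    d = (recv_cur - recv_prev) - (send_cur - send_prev)
    return j_prev + (abs(d) - j_prev) / 16.0

# Hypothetical (send, receive) timestamps in seconds for one flow.
packets = [(0.000, 0.050), (0.020, 0.072), (0.040, 0.088), (0.060, 0.115)]

jitter = 0.0
for (s0, r0), (s1, r1) in zip(packets, packets[1:]):
    jitter = update_jitter(jitter, s0, r0, s1, r1)

print(f"smoothed jitter estimate: {jitter * 1000:.3f} ms")   # ~0.782 ms here
```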
