Computer Science, asked by prachikumari693, 5 months ago

Propagation delay in a broadcast network is 5 microseconds and Tfr = 10 microseconds. Find the following:

i. How long does it take for the first bit to reach the farthest end on the network?

ii. How long does it take for the last bit to reach the farthest end on the network?

Answers

Answered by ParamSiddharthMain

Answer:

(i) 5 microseconds.

(ii) 15 microseconds.

Explanation:

(i) Time taken by the first bit to reach the farthest end of the network

= Propagation delay  

= 5 microseconds

(ii) Time taken by the last bit to reach the farthest end of the network

= Propagation delay + Time taken to transmit one frame entirely

= 5 + 10

= 15 microseconds
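The arithmetic above can be sketched in a few lines of Python (a minimal sketch; the variable names are my own):

```python
# Time for the first bit to reach the farthest end: propagation delay only.
# Time for the last bit: propagation delay + frame transmission time (Tfr).
t_prop = 5e-6   # propagation delay, in seconds
t_fr = 10e-6    # frame transmission time (Tfr), in seconds

t_first_bit = t_prop            # 5 microseconds
t_last_bit = t_prop + t_fr      # 15 microseconds

print(t_first_bit * 1e6, "microseconds")
print(t_last_bit * 1e6, "microseconds")
```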

Here is another way to look at this.

Assuming the signal travels at the speed of light,

Speed of signal ≈ 3 × 10^8 m/s

Propagation delay

= 5 microseconds

= 5 × 10^-6 seconds

Longest distance to travel

= (3 × 10^8) × (5 × 10^-6)

= 1500 m

Time taken to transmit a frame (Tfr)

= 10 microseconds

= 10^-5 seconds

Length of a frame on the medium

= (3 × 10^8) × 10^-5

= 3 × 10^3 m = 3000 metres

Time taken by the first bit = Distance / Speed

= 1500 ÷ (3 × 10^8)

= 5 × 10^-6 seconds = 5 microseconds

Time taken by the last bit = (Distance + Frame length) / Speed

= (1500 + 3000) ÷ (3 × 10^8)

= 1.5 × 10^-5 seconds = 15 microseconds
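This distance-based view can be checked the same way (a sketch assuming the 3 × 10^8 m/s signal speed used above; names are my own):

```python
speed = 3e8      # assumed signal speed, m/s
t_prop = 5e-6    # propagation delay, s
t_fr = 10e-6     # frame transmission time, s

distance = speed * t_prop      # farthest distance on the network: 1500 m
frame_length = speed * t_fr    # "length" of one frame on the medium: 3000 m

# First bit only has to cover the distance; last bit also has to wait
# for the whole frame to be put on the wire.
t_first = distance / speed                   # 5 microseconds
t_last = (distance + frame_length) / speed   # 15 microseconds
```

Both routes give the same answers, because distance / speed is just the propagation delay written out, and frame_length / speed is Tfr.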

Cheers.
