Computer Science, asked by SehajbirKang2796, 5 months ago

A system uses the Selective Repeat ARQ protocol with a window size of 4. If
each packet carries 1000 bits of data, how long does it take to send 1 million
bits of data if the distance between the sender and receiver is 5000 km and the
propagation speed is 2 × 10^8 m/s? Ignore transmission, waiting, and processing
delays. Assume 25% of the frames are lost in every transmission round.

Answers

Answered by wk292589

Answer:

Approximately 16.67 seconds.

Explanation:

Under the usual simplifying assumptions for this textbook setup:

One-way propagation delay: Tp = 5,000,000 m / (2 × 10^8 m/s) = 25 ms, so one round trip is 2 × Tp = 50 ms.

Packets required: 1,000,000 bits / 1000 bits per packet = 1000 packets.

Since transmission, waiting, and processing delays are ignored, the sender pushes a full window of 4 packets and collects their acknowledgments within one 50 ms round trip. Without losses the transfer would take (1000 / 4) × 50 ms = 12.5 s.

With 25% of the frames lost in every round, Selective Repeat retransmits only the lost frames, so the expected number of transmissions per packet is 1 / (1 − 0.25) = 4/3. That gives about 1000 × 4/3 ≈ 1333.3 packet transmissions in total, or about 333.3 window cycles.

Total time ≈ 333.3 × 50 ms ≈ 16.67 s. (A stricter model that rounds each retransmission round up to whole frames and whole window cycles would come out slightly higher.)
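A small Python sketch of the same expected-value calculation. The variable names and the round-by-round loss model (exactly 25% of each round's frames lost, fractional frames kept for the expected-value sum) are illustrative assumptions, not part of the original problem:

PACKET_BITS = 1000
TOTAL_BITS = 1_000_000
WINDOW = 4
DISTANCE_M = 5_000_000          # 5000 km
PROP_SPEED = 2e8                # m/s
LOSS = 0.25

tp = DISTANCE_M / PROP_SPEED    # one-way propagation delay = 0.025 s
rtt = 2 * tp                    # one window cycle = 0.05 s

packets = TOTAL_BITS / PACKET_BITS   # 1000 packets
transmissions = 0.0
pending = packets
while pending > 1e-9:
    transmissions += pending
    pending *= LOSS              # 25% of every round is lost and retried

windows = transmissions / WINDOW
print(f"total transmissions = {transmissions:.1f}")   # about 1333.3
print(f"total time = {windows * rtt:.2f} s")          # about 16.67 s

Running it prints roughly 1333.3 transmissions and 16.67 s, matching the hand calculation above.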
