Physics, asked by shimirajesh01, 8 months ago

If the bandwidth of the channel is 5 Kbps, how long does it take to send a frame of 100,000 bits out of this device?

Answers

Answered by 5488

Answer:

Here is your answer, mate:

What is the bit rate for each of the following signals?
A signal in which 1 bit lasts 0.001 s
A signal in which 1 bit lasts 2 ms
A signal in which 10 bits last 20 ms

7) A device is sending out data at the rate of 1000 bps. How long does it take to send out 1
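For what it's worth, these bit-rate questions are all the same division (bits sent divided by the time they occupy). A minimal Python sketch, where the bit_rate helper is my own illustration and not part of the exercise:

```python
def bit_rate(bits: float, seconds: float) -> float:
    """Bit rate in bits per second: number of bits divided by the time they take."""
    return bits / seconds

print(bit_rate(1, 0.001))    # 1 bit lasting 0.001 s -> 1000.0 bps
print(bit_rate(1, 0.002))    # 1 bit lasting 2 ms    -> 500.0 bps
print(bit_rate(10, 0.020))   # 10 bits lasting 20 ms -> 500.0 bps
```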

Answered by fariakhan6768

Answer:

20 sec

Explanation:

The bandwidth of the channel is given as 5 Kbps, which means the channel can send 5,000 bits per second.

Since it takes 1 second to send 5,000 bits, the time required to transfer 100,000 bits is:

100,000 bits / 5,000 bps = 20 seconds
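As a quick sketch, the same division can be written in Python (the transmission_time function name is my own, used only for illustration):

```python
def transmission_time(frame_bits: float, bandwidth_bps: float) -> float:
    """Seconds needed to push frame_bits onto a channel of bandwidth_bps."""
    return frame_bits / bandwidth_bps

print(transmission_time(100_000, 5_000))  # -> 20.0 seconds
```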
