Computer Science, asked by chicagosingless1911, 1 year ago

What is the transmission time of a packet sent by a station if the length of the packet is 1 million bytes and the bandwidth of the channel is 200 kbps?

Answers

Answered by gurukulamdivya

Answer:

transmission time = (packet length) / (bandwidth)

Since the packet length is given in bytes and the bandwidth in bits per second, the bytes must first be converted to bits (1 byte = 8 bits):

= (1,000,000 bytes × 8) / 200 kbps

= 8,000,000 bits / 200,000 bps

= 40 seconds

Answered by mariospartan

Given:

Length of the packet = 1 million bytes

Bandwidth = 200 kbps

To Find:

Transmission Time

Explanation:

transmission time = (packet length / bandwidth)

= 1 million bytes / 200 kbps

Since 1 byte is 8 bits and bps means bits per second:

= (1,000,000 × 8) bits / 200,000 bps

= 8,000,000 bits / 200,000 bps

= 40 seconds

The transmission time of a packet = 40 seconds
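
As a quick check, here is a minimal Python sketch of the same calculation (the variable names are illustrative, not part of the original answers):

# Transmission time = packet size in bits / bandwidth in bits per second
packet_length_bytes = 1_000_000        # 1 million bytes
bandwidth_bps = 200 * 1_000            # 200 kbps = 200,000 bits per second

packet_length_bits = packet_length_bytes * 8   # 1 byte = 8 bits
transmission_time = packet_length_bits / bandwidth_bps

print(transmission_time)               # prints 40.0 (seconds)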
