Physics, asked by samreenhasiat, 11 months ago

A transmitter produces radio waves of wavelength 1500 m. It takes the waves 0.025 s to travel
from the transmitter to a radio receiver.
What is the distance between the radio transmitter and the receiver?

Answers

Answered by abhi178

Answer: 7.5 × 10^6 m

Here you should know that radio waves are a type of electromagnetic wave with a very large wavelength, as the given wavelength of 1500 m shows.

We also know that the speed of electromagnetic waves is 3 × 10^8 m/s.

So, speed of the radio waves, v = 3 × 10^8 m/s

Given, time taken by the radio waves to travel from the transmitter to the receiver, t = 0.025 s

Now, distance between the transmitter and the receiver = speed of radio waves × time taken

= 3 × 10^8 m/s × 0.025 s

= 3 × 10^8 × (1/40) m

= 7.5 × 10^6 m
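For a quick numerical check, here is a minimal Python sketch of the same distance = speed × time calculation (the variable names are illustrative, not from the original answer):

```python
# Distance = speed × time for a radio wave travelling
# from a transmitter to a receiver.

SPEED_OF_LIGHT = 3e8  # speed of electromagnetic waves, in m/s

travel_time = 0.025   # time from transmitter to receiver, in seconds

# Radio waves travel at the speed of light, so the distance
# is simply speed multiplied by travel time.
distance = SPEED_OF_LIGHT * travel_time

print(f"Distance = {distance:.1e} m")  # prints: Distance = 7.5e+06 m
```

Note that the wavelength of 1500 m is not needed for the distance itself; it only tells us the signal is a radio wave, i.e. an electromagnetic wave travelling at 3 × 10^8 m/s.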
