The Moon is 4×10^8 m from the Earth. In about how many seconds will a radar signal transmitted from the Earth reach the Moon?
Please answer with full steps.
Answers
Answered by
Distance from the Earth to the Moon = 4×10^8 m
Speed of radar signal = 3×10^8 m/s
Therefore, time taken by the signal to reach the Moon = distance/speed = (4×10^8 m)/(3×10^8 m/s)
= 4/3 s ≈ 1.33 s
Hope that helps ❤
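If you want to check the arithmetic yourself, here is a minimal Python sketch of the same calculation (the variable names are just illustrative, not from the question):

# Time for a radar signal to travel from the Earth to the Moon
distance_m = 4e8       # Earth-Moon distance in metres (given)
speed_m_per_s = 3e8    # speed of radar (EM) waves in vacuum, in m/s
time_s = distance_m / speed_m_per_s  # time = distance / speed
print(time_s)          # prints 1.3333333333333333, i.e. about 1.33 s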
Answered by
The time taken is about 1.33 s
Given:
Distance = 4×10^8 m
To find:
Time = ?
Solution:
Velocity is the change in displacement per unit time; its unit is m/s.
So, to determine the time, we rearrange: time = distance/velocity.
We know that the speed (v) of radar waves in vacuum = 3×10^8 m/s. Radar signals are electromagnetic waves, so they travel at the speed of light, 3×10^8 m/s.
Therefore, time = (4×10^8 m)/(3×10^8 m/s) = 4/3 s ≈ 1.33 s