Physics, asked by codeist, 1 year ago

The moon is 4x10^8m from the earth, a radar signal transmitted from the earth will reach the moon in about how many seconds?

Please answer with full steps.

Answers

Answered by Chocolatelover
Distance from Earth to moon = 4×10^8 m

Speed of radar signal = 3×10^8 m/s

Therefore, the time taken by the signal to reach the moon = (4×10^8) / (3×10^8)
= 4/3 s ≈ 1.33 seconds
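
As a quick check, here is a minimal Python sketch of the same distance-over-speed calculation (the variable names are illustrative, not from the original answer):

```python
# Time for a radar signal to cross the Earth-Moon distance,
# using the rounded speed of light (3 x 10^8 m/s).
distance_m = 4e8        # Earth-Moon distance in metres
speed_m_per_s = 3e8     # speed of radar (EM) waves in vacuum

time_s = distance_m / speed_m_per_s
print(f"Time = {time_s:.2f} s")  # prints: Time = 1.33 s
```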

Hope that helps ❤
Answered by skyfall63

The time taken is about 1.334 s

Given:

Distance = 4 \times 10^{8} \ \mathrm{m}

To find:

Time = ?

Solution:

Velocity is the rate of change of displacement with time; its unit is m/s.

v=\frac{d}{t}

So, to determine the time, we rearrange this to t=\frac{d}{v}

We know that, the speed (v) of the radar waves in vacuum = 2.9979 \times 10^{8} \ m/s

Radar waves are electromagnetic waves, so in vacuum they travel at the speed of light; 2.9979 \times 10^{8} \ m/s is simply a more precise value than the commonly rounded 3 \times 10^{8} \ m/s.

\Rightarrow t=\frac{4 \times 10^{8}}{2.9979 \times 10^{8}}

\therefore \text {Time} \approx 1.334 \ \mathrm{s}
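
For reference, here is a short Python sketch of the same t = d/v computation using a precise value of the speed of light. Pulling the constant from scipy.constants is just one convenient option (an assumption, not part of the original answer); hard-coding 2.9979e8 m/s as above gives essentially the same result.

```python
from scipy.constants import c    # speed of light in vacuum, ~2.9979e8 m/s

distance_m = 4e8                 # Earth-Moon distance in metres
time_s = distance_m / c          # t = d / v

print(f"Time = {time_s:.4f} s")  # prints: Time = 1.3343 s
```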
