The Moon is 4×10^8 m from the Earth. About how many seconds will a radar signal transmitted from the Earth take to reach the Moon?
Please answer with full steps.
Answers
Answered by
velocity of signal = velocity of light = 3×10^8 m/s
time = distance / velocity = (4×10^8) / (3×10^8) = 4/3 s ≈ 1.33 s
Answered by
we know that
speed = distance / time
or
time = distance / speed
here,
distance between Earth and Moon = 4 × 10^8 m
speed of radar signal = 3 × 10^8 m/s [speed of light]
so,
the time taken by radar signal to reach the moon will be
t = (4 × 10^8 m) / (3 × 10^8 m/s)
thus,
t ≈ 1.33 s
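
As a quick check, here is a minimal Python sketch of the same calculation (the variable names are my own, not from the question):

distance_m = 4e8                       # Earth-Moon distance in metres (given)
speed_m_per_s = 3e8                    # speed of light; radar signals travel at this speed
time_s = distance_m / speed_m_per_s    # t = d / v
print(f"t = {time_s:.2f} s")           # prints: t = 1.33 s

Running it confirms the hand calculation above: the signal takes about 1.33 seconds to reach the Moon.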