Physics, asked by lilygladiolus, 8 months ago

It takes 2.51 seconds for a laser signal to go from the earth's surface to the moon and back. How far is the lunar surface from the earth's surface? Light travels at a speed of 3.00 × 10^8 m/s.

Answers

Answered by PiyushNashikkar

Explanation:

Speed of laser = speed of EM waves = 3 × 10^8 m/s

Time, t = 2.51 s (this is the round-trip time: earth → moon → earth)

Since v = d/t, the total path length covered by the signal (there and back) is

d_total = v × t

d_total = 3 × 10^8 × 2.51

d_total = 7.53 × 10^8 m

The signal covers the earth–moon distance twice, so the one-way distance from the earth's surface to the lunar surface is half of this:

d = d_total / 2 = 7.53 × 10^8 / 2

d = 3.765 × 10^8 m ≈ 3.77 × 10^8 m
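If you want to check the arithmetic, here is a minimal Python sketch of the same calculation (the variable names are my own, not part of the problem; it assumes 2.51 s is the round-trip time and the signal moves at the given speed of light):

```python
# Minimal sketch: distance from the earth's surface to the lunar surface,
# given the round-trip travel time of a laser signal.

SPEED_OF_LIGHT = 3.00e8   # m/s, as given in the problem
round_trip_time = 2.51    # s, earth -> moon -> earth

# Total path length covered by the signal (there and back)
total_distance = SPEED_OF_LIGHT * round_trip_time   # 7.53e8 m

# One-way distance is half the round-trip path
one_way_distance = total_distance / 2               # 3.765e8 m

print(f"One-way distance: {one_way_distance:.3e} m")
```

Running this prints about 3.765 × 10^8 m, which is close to the accepted average earth–moon distance of roughly 3.84 × 10^8 m.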
