How did Olaus Roemer measure the speed of light in 1676?
Answers
Answered by
It was the Danish astronomer Olaus Roemer who, in 1676, first successfully measured the speed of light. His method was based on observations of the eclipses of Jupiter's moons, that is, the moments when each moon passed into Jupiter's shadow.
Roemer noted that the observed time interval between successive eclipses of a given moon was about seven minutes greater when the Earth, in its orbit, was moving away from Jupiter than when it was moving toward Jupiter. He reasoned that, when the Earth was moving away from Jupiter, the observed time between eclipses was increased above the true value (by about 3.5 minutes) because of the extra distance that the light from each successive eclipse had to travel to reach the Earth. Conversely, when the Earth was moving toward Jupiter, the observed interval between eclipses was decreased (by about 3.5 minutes) because of the shorter distance the light had to travel for each successive eclipse.
Had the Earth not been moving, the light from successive eclipses would have had to travel the same distance to the Earth, so the true interval between eclipses would have been observed. However, when the Earth was moving away from Jupiter, the light had to travel a greater distance to reach the Earth from each successive eclipse, and conversely a smaller distance when the Earth was moving toward Jupiter. Since the speed of the Earth in its orbit was known, the distance the Earth had moved between eclipses could be calculated. The speed of light was then estimated as the value that accounted for the seven-minute overall variation in the observed interval between successive eclipses.
Roemer's estimate for the speed of light was 140,000 miles/second (about 225,000 km/s), which is remarkably good considering the method employed.
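The core arithmetic of the method is just distance over time: the extra light-travel time equals the extra distance divided by the speed of light. A minimal sketch, using modern values rather than Roemer's own data (the accumulated delay across the full diameter of Earth's orbit is about 16.6 minutes by modern measurement):

```python
# Roemer-style estimate of the speed of light, using modern values
# (an illustration of the principle, not his actual numbers).
AU_KM = 1.496e8        # astronomical unit in km (modern value)
DELAY_S = 998.0        # light time across 2 AU, about 16.6 min (modern value)

# Extra distance light travels when Earth is on the far side of its orbit
# from Jupiter, compared with the near side: one orbital diameter.
extra_distance_km = 2 * AU_KM

# Speed of light = extra distance / extra travel time.
c_estimate = extra_distance_km / DELAY_S
print(f"c ≈ {c_estimate:,.0f} km/s")
```

Roemer's own delay figure was larger than the modern one, which is why his result of roughly 225,000 km/s fell short of the true value of about 300,000 km/s.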
Answered by
Answer:
Roemer measured the speed of light by timing eclipses of Jupiter's moon Io. In the figure that accompanied this answer (not reproduced here), S is the Sun, E1 is the Earth when closest to Jupiter (J1), and E2 is the Earth about six months later, on the opposite side of the Sun from Jupiter (J2).
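The figure's geometry can be checked numerically. Assuming circular, coplanar orbits (a simplification), with Earth at 1 AU and Jupiter at about 5.2 AU, the difference between the far (E2-J2) and near (E1-J1) distances is one diameter of Earth's orbit, and the corresponding difference in light-travel time is:

```python
# Light-time difference between the two configurations in the figure,
# assuming circular, coplanar orbits (a simplification).
AU_KM = 1.496e8                 # astronomical unit in km
C_KM_S = 299_792.458            # modern speed of light in km/s

d_near = (5.2 - 1.0) * AU_KM    # E1 to J1: Earth between Sun and Jupiter
d_far = (5.2 + 1.0) * AU_KM     # E2 to J2: Earth on far side of the Sun

delay_s = (d_far - d_near) / C_KM_S
print(f"extra light time ≈ {delay_s / 60:.1f} minutes")
```

The difference comes out to roughly 16.6 minutes, which is the accumulated eclipse delay Roemer's reasoning attributes to the finite speed of light.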
Explanation:
HOPE IT HELPS YOU.