Question 2.26 It is claimed that two cesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard cesium clock in measuring a time-interval of 1 s?
Class XI Physics, Units and Measurements (Page 37)
Answers
Answered by
Error in 100 years = 0.02 s
Error in 1 s = 0.02 / (100 × (1461/4) × 24 × 60 × 60)
= (2 × 10^-2 × 4) / (1461 × 24 × 36 × 10^4)
≈ 6.3 × 10^-12 ≈ 10^-11
(Here 1 year is taken as 1461/4 = 365.25 days to account for leap years.) So the clock drifts by only about 10^-11 s for every second it runs, i.e., it is accurate to about 1 part in 10^11.
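As a quick sanity check, here is a small Python snippet (not from the original answer) that reproduces this arithmetic:

```python
# Numeric check of the calculation above (illustrative, not part of
# the original answer). 1 year is taken as 1461/4 = 365.25 days,
# exactly as in the working shown.
seconds_per_year = (1461 / 4) * 24 * 60 * 60   # 365.25 days in seconds
total_seconds = 100 * seconds_per_year         # 100 years ≈ 3.156e9 s
error_per_second = 0.02 / total_seconds        # fractional drift per second

print(f"100 years         = {total_seconds:.4e} s")
print(f"error in 1 second = {error_per_second:.2e}")   # ≈ 6.34e-12
```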
Answered by
Difference in time of caesium clocks = 0.02 s
Time required for this difference
= 100 years
= 100 × 365 × 24 × 60 × 60
= 3.15 × 10^9 s
In 3.15 × 10^9 s, the caesium clock shows a time difference of 0.02 s. In 1 s, the clock will show a time difference of 0.02 / (3.15 × 10^9) s ≈ 6.35 × 10^-12 s. Hence, the accuracy of a standard caesium clock in measuring a time interval of 1 s is 3.15 × 10^9 / 0.02 = 1.575 × 10^11, i.e., about 1 part in 10^11.
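The same figures can be verified with a short Python snippet (illustrative, not from the original answer; it takes 1 year = 365 days, as this answer does):

```python
# Reproducing this answer's numbers (illustrative only).
total_seconds = 100 * 365 * 24 * 60 * 60   # ≈ 3.15e9 s in 100 years
drift = 0.02                               # difference between clocks, in s

error_in_1s = drift / total_seconds        # ≈ 6.34e-12 s per second
accuracy = total_seconds / drift           # ≈ 1.58e11 -> 1 part in ~10^11

print(f"error in 1 s = {error_in_1s:.2e} s")
print(f"accuracy     = 1 part in {accuracy:.3e}")
```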
I hope this will help you.