Physics, asked by PragyaTbia, 1 year ago

It is claimed that two cesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard cesium clock in measuring a time interval of 1 s?

Answers

Answered by gadakhsanket

Hi dear,

# Answer- Degree of accuracy = approx 10^11.

# Given-
Time interval = 100 years = 3.154×10^9 s
Observed error = 0.02 s

# Solution-
Fractional error = observed error/time interval
Fractional error = 0.02/(3.154×10^9)
Fractional error = 6.34×10^-12

Degree of accuracy = 1/fractional error = 1/(6.34×10^-12)
Degree of accuracy ≈ 1.6×10^11

So the degree of accuracy is about 1.6×10^11, i.e. roughly 10^11 (1 part in 10^11).
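
If you want to check the arithmetic yourself, here is a minimal sketch in Python; the variable names are my own and the numbers are taken from the problem statement.

```python
# Quick check of the arithmetic above (plain Python, no external libraries).
# Variable names are illustrative; the numbers come from the problem statement.

seconds_per_year = 365 * 24 * 60 * 60        # 3.1536e7 s, ignoring leap days
elapsed_time = 100 * seconds_per_year        # 100 years ≈ 3.154e9 s
observed_error = 0.02                        # maximum drift between the two clocks, in s

fractional_error = observed_error / elapsed_time   # error per second of running time
degree_of_accuracy = 1 / fractional_error          # ≈ 1 part in 10^11

print(f"elapsed time       = {elapsed_time:.3e} s")       # 3.154e+09 s
print(f"fractional error   = {fractional_error:.2e}")     # 6.34e-12
print(f"degree of accuracy ≈ {degree_of_accuracy:.1e}")   # 1.6e+11
```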

Hope that cleared your doubt...
Answered by Anonymous

Difference in time of caesium clocks = 0.02 s

Time required for this difference

= 100 years

= 100 × 365 × 24 × 60 × 60

= 3.15 × 10^9 s

In 3.15 × 10^9 s, the caesium clock shows a time difference of 0.02 s. In 1 s, the clock will show a time difference of 0.02/(3.15 × 10^9) s ≈ 6.3 × 10^-12 s.

Hence, the accuracy of a standard caesium clock in measuring a time interval of 1 s is

3.15 × 10^9 / 0.02 = 1.575 × 10^11 ≈ 1.6 × 10^11, i.e. about 1 part in 10^11
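
Written out as a single relation (the symbols Δt for the drift and T for the elapsed time are introduced here only to summarise the steps above):

```latex
\frac{\Delta t}{T} = \frac{0.02\ \text{s}}{3.15\times10^{9}\ \text{s}} \approx 6.3\times10^{-12},
\qquad
\text{accuracy} = \frac{T}{\Delta t} \approx 1.6\times10^{11}\ \text{(about 1 part in }10^{11}\text{)}.
```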

I hope this will help you.
