Physics, asked by rajat62651, 10 months ago

In an experiment to determine the acceleration due to gravity g, the formula used for the time period of a
periodic motion is T = 2π √(7(R − r) / 5g). The values of R and r are measured to be (60 ± 1) mm and (10 ± 1)
mm, respectively. In five successive measurements, the time period is found to be 0.52 s, 0.56 s, 0.57 s,
0.54 s and 0.59 s. The least count of the watch used for the measurement of time period is 0.01 s. Which of
the following statement(s) is(are) true?
(A) The error in the measurement of r is 10%
(B) The error in the measurement of T is 3.57%
(C) The error in the measurement of T is 2%
(D) The error in the determined value of g is 11%

Answers

Answered by rameshsisodia79

Answer:

The observed values of the time period are
T_i = 0.52 s, 0.56 s, 0.57 s, 0.54 s and 0.59 s

Mean value of the time period:
T = (∑T_i) / 5 = 2.78 / 5 = 0.56 s

Magnitude of the absolute error in each observation:
∣ΔT_1∣ = ∣0.56 − 0.52∣ = 0.04 s
Similarly, ∣ΔT_2∣ = 0.00 s, ∣ΔT_3∣ = 0.01 s, ∣ΔT_4∣ = 0.02 s, ∣ΔT_5∣ = 0.03 s

Mean absolute error in the time period:
ΔT_m = (0.04 + 0.00 + 0.01 + 0.02 + 0.03) / 5 = 0.02 s

∴ Percentage error in T:
(ΔT_m / T) × 100 = (0.02 / 0.56) × 100 = 3.57 %
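The period statistics above can be checked numerically with a short script (it uses only the five readings quoted in the question):

```python
# Check of the time-period statistics from the five quoted readings.
readings = [0.52, 0.56, 0.57, 0.54, 0.59]  # measured periods, in s

mean_T = round(sum(readings) / len(readings), 2)        # 2.78 / 5 -> 0.56 s
abs_errs = [abs(mean_T - t) for t in readings]          # 0.04, 0.00, 0.01, 0.02, 0.03
mean_abs_err = round(sum(abs_errs) / len(abs_errs), 2)  # 0.02 s

pct_err_T = mean_abs_err / mean_T * 100                 # percentage error in T
print(round(pct_err_T, 2))                              # -> 3.57
```

This reproduces the 3.57 % figure, confirming option (B) over option (C).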

Percentage error in the measurement of r:
(Δr / r) × 100 = (1 / 10) × 100 = 10 %

From the equation given, squaring and rearranging for g:
g = 28π²(R − r) / (5T²)
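A quick sanity check that this rearrangement inverts the given formula, using an assumed trial value g = 9.8 m/s² (the actual value of g is what the experiment determines):

```python
# Verify g = 28*pi^2*(R - r)/(5*T^2) inverts T = 2*pi*sqrt(7*(R - r)/(5*g)).
import math

g = 9.8                # m/s^2, hypothetical trial value for the check
R, r = 0.060, 0.010    # m (the measured 60 mm and 10 mm)

T = 2 * math.pi * math.sqrt(7 * (R - r) / (5 * g))  # forward formula
g_back = 28 * math.pi ** 2 * (R - r) / (5 * T ** 2) # rearranged formula
print(abs(g_back - g))                              # ~0: round trip recovers g
```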

∴ Percentage error in the determined value of g:
(Δg / g) × 100 = [(ΔR + Δr) / (R − r)] × 100 + 2 (ΔT_m / T) × 100
(Δg / g) × 100 = [(1 + 1) / (60 − 10)] × 100 + 2(3.57) % = 4 % + 7.14 % = 11.14 % ≈ 11 %

Hence the error in r is 10 %, the error in T is 3.57 %, and the error in g is about 11 %, so statements (A), (B) and (D) are true.
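The error propagation can be checked with the same numbers: since g ∝ (R − r)/T², the relative errors of (R − r) and T add, with T counted twice.

```python
# Error propagation for g = 28*pi^2*(R - r)/(5*T^2):
# relative error of g = relative error of (R - r) + 2 * relative error of T.
R, dR = 60.0, 1.0      # mm
r, dr = 10.0, 1.0      # mm
pct_err_T = 3.57       # % (from the five period readings)

pct_err_g = (dR + dr) / (R - r) * 100 + 2 * pct_err_T
print(round(pct_err_g, 2))  # -> 11.14, i.e. about 11 %
```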
