Physics, asked by raabeasamreen786, 11 months ago

The potential difference across the 100 Ω resistor in the following circuit is measured by a voltmeter of 900 Ω resistance.
The percent error in reading the potential difference is:

Attachments: (circuit diagram: source V in series with a 10 Ω resistor and the 100 Ω resistor)

Answers

Answered by handgunmaine

Answer:

The percent error in reading the potential difference is 1 %.

Explanation:

Let the source voltage be V.

Therefore, by the voltage divider rule, the voltage across the 100 Ω resistor before connecting the voltmeter is:

V_1 = \dfrac{100}{100+10}\,V = \dfrac{10}{11}\,V
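As a quick numeric check of the divider fraction (a sketch, not part of the original answer; resistor values taken from the problem):

```python
from fractions import Fraction

# Voltage divider: fraction of the source V that appears across the
# 100-ohm resistor when it is in series with the 10-ohm resistor.
R_load, R_series = 100, 10
v1_fraction = Fraction(R_load, R_load + R_series)
print(v1_fraction)  # -> 10/11
```

Using exact fractions avoids the rounding noise a float divider calculation would introduce.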

Now, the voltage across the 100 Ω resistor after connecting the voltmeter (900 Ω in parallel with 100 Ω) is:

V_2 = \dfrac{R_{eq}}{R_{eq}+10}\,V       ........ (1)

Here,

\dfrac{1}{R_{eq}} = \dfrac{1}{100} + \dfrac{1}{900}

R_{eq} = \dfrac{900\times 100}{1000}\ \Omega = 90\ \Omega
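The parallel combination can be verified the same way (a sketch with the problem's values, using the product-over-sum form):

```python
from fractions import Fraction

# Equivalent resistance of the 100-ohm resistor in parallel with the
# 900-ohm voltmeter: R_eq = (R1 * R2) / (R1 + R2).
R1, R2 = 100, 900
R_eq = Fraction(R1 * R2, R1 + R2)
print(R_eq)  # -> 90
```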

Putting the value of R_{eq} in equation (1), we get:

V_2 = \dfrac{90}{90+10}\,V = 0.9\,V

Therefore, the percentage error is:

Percentage\ Error = \dfrac{V_1-V_2}{V_1}\times 100 = \dfrac{\dfrac{10}{11}V - 0.9V}{\dfrac{10}{11}V}\times 100 = \dfrac{\dfrac{10}{11} - 0.9}{\dfrac{10}{11}}\times 100 = 1\ \%

Therefore, the percent error in reading the potential difference is 1 %.
