
A problem about electricity: a resistance is to be connected in series with a bulb. Using the current in the circuit to find the resistance, I get 20 Ω, which is the right answer. But using the formula P = V²/(R + r), I get r = 60 Ω. What is wrong with the second method?


Answers

Answered by Λყυѕн

Given:

In Case 1 (rated operation of the bulb):

Power (P) = 500 W

Potential difference (V) = 100 V

In Case 2:

Power (P) = 500 W

Supply voltage (V) = 200 V

To Find:

Extra resistance that must be put in series with the bulb.

Solution:

In Case 1

Resistance of the bulb (R):

R = \dfrac{V^2}{P}

R = \dfrac{100^2}{500}

R = 20 Ω

In Case 2

Total current in the circuit (the bulb must carry its rated current so that it dissipates its rated 500 W):

V = 200 V

P = 500 W

R = 20 Ω

I = \sqrt{\dfrac{P}{R}}

I = \sqrt{\dfrac{500}{20}}

I = 5 A
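
As a quick cross-check (not part of the original working), the same current follows directly from the bulb's rated values:

I = \dfrac{P}{V_{rated}} = \dfrac{500}{100} = 5 A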

Total resistance in the circuit:

From Ohm's Law:

V = I R_{total}

\therefore R_{total} = \dfrac{V}{I}

R_{total} = \dfrac{200}{5}

R_{total} = 40 Ω

Therefore, the extra resistance that must be put in series with the bulb is R_{total} - R = 40 Ω - 20 Ω = 20 Ω.

About the second method: P = \dfrac{V^2}{R + r} with V = 200 V gives the total power drawn from the supply, not the power dissipated in the bulb. Here that total is 200 V × 5 A = 1000 W (500 W in the bulb and 500 W in the series resistor), so setting it equal to the bulb's 500 W leads to the incorrect r = 60 Ω.
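
For anyone who wants to verify the arithmetic, here is a minimal Python sketch (not part of the original answer; it only re-uses the given values 100 V, 500 W, 200 V):

V_rated, P_rated, V_supply = 100.0, 500.0, 200.0

R_bulb = V_rated ** 2 / P_rated   # resistance of the bulb: 20 ohm
I = P_rated / V_rated             # rated current of the bulb: 5 A
R_total = V_supply / I            # total series resistance needed: 40 ohm
r_extra = R_total - R_bulb        # extra series resistance: 20 ohm

# V_supply**2 / (R_bulb + r_extra) is the TOTAL power drawn from the supply,
# not the bulb's power, which is why equating it to 500 W gives r = 60 ohm.
P_total = V_supply ** 2 / (R_bulb + r_extra)   # 1000 W
P_bulb = I ** 2 * R_bulb                       # 500 W

print(R_bulb, I, R_total, r_extra, P_total, P_bulb)
# 20.0 5.0 40.0 20.0 1000.0 500.0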
