An electric bulb is rated 220 V and 100 W. When it is operated at 110 V, what will the power consumed be?
Solution A:
P = VI
100 = 220 × I ⇒ I = 5/11 A
P = VI = 110 × (5/11) = 50 W
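A quick numeric check of Solution A's steps in Python (the variable names are mine; this just reproduces the arithmetic, without judging whether the physics is right):

    I = 100 / 220    # rated current at 220 V: 5/11 A, about 0.4545 A
    P = 110 * I      # assumes the same current still flows at 110 V
    print(P)         # 50.0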
Solution B:
The power consumed by the bulb is P = V²/R, where V is the potential difference across it and R is its electrical resistance.
At 220 V the bulb consumes 100 W:
100 = 220²/R ⇒ R = 220²/100 = 484 Ω
So, when it is operated at 110 V, the power consumed is
P = 110²/R = 110²/484 = 25 W
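The same kind of check for Solution B (again a minimal sketch, variable names mine):

    R = 220**2 / 100   # resistance implied by the rating: 484 ohms
    P = 110**2 / R     # power at 110 V, assuming R is unchanged
    print(P)           # 25.0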
The first solution gives 50 W and the second gives 25 W. Which of them is correct? Please explain, I'm confused.
Answers
- The second solution is correct: the power consumed is 25 W.
- When the supply voltage changes, what stays fixed is the bulb's resistance (R = 484 Ω from the rating), not the current. Solution A wrongly reuses the rated current I = 5/11 A at 110 V; the actual current at 110 V is 110/484 = 5/22 A.
- Since P = V²/R, power is proportional to the square of the voltage. Halving V reduces P to one quarter, so P = 100/4 = 25 W.
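A short sketch tying this together, assuming the filament resistance keeps its rated value (the function name and defaults are mine, not from the problem):

    def power_at(V, V_rated=220.0, P_rated=100.0):
        R = V_rated**2 / P_rated   # fixed filament resistance, 484 ohms here
        return V**2 / R            # P = V^2 / R at the new voltage

    print(power_at(220))   # 100.0 (the rated power)
    print(power_at(110))   # 25.0  (half the voltage, a quarter of the power)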