Physics, asked by tripathiseema3pdrlk1, 1 month ago

Two bulbs rated 40 W, 220 V and 100 W, 220 V are connected in series with a 220 V source. What is the ratio of the power consumed in them?

Answers

Answered by souradipdas42

Answer:

Resistance of the 100 W bulb = 220²/100 = 484 ohms

Resistance of the 40 W bulb = 220²/40 = 1210 ohms

Therefore, total resistance in series = (484 + 1210) = 1694 ohms

Current in the circuit = 220/1694 ≈ 0.13 A

Therefore, actual power consumed by the "40 W" bulb = I² × 1210 ≈ 20.4 W (much less than its rating)

Actual power consumed by the "100 W" bulb = I² × 484 ≈ 8.2 W

Since the same current flows through both bulbs, the ratio of the powers consumed equals the ratio of the resistances:

P(40 W) : P(100 W) = 1210 : 484 = 5 : 2

The 40 W bulb glows the brighter of the two, because the same current flows through both and it has the larger resistance. But it consumes much less than 40 W, since the bulbs are connected in series and the 220 V is divided across the two filaments in proportion to their resistances.
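For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same calculation. It assumes, as above, that each filament keeps the resistance V²/P computed from its rating (the change of resistance with temperature is ignored); the variable and function names are purely illustrative.

```python
# Series-bulb power check (illustrative sketch, not from the original post).
# Each bulb's resistance is estimated from its rating: R = V_rated^2 / P_rated.

V_RATED = 220.0   # rated voltage of both bulbs (volts)
V_SUPPLY = 220.0  # supply voltage of the series circuit (volts)

def resistance(p_rated, v_rated=V_RATED):
    """Filament resistance estimated from the bulb's rating."""
    return v_rated ** 2 / p_rated

r40 = resistance(40.0)    # ~1210 ohms
r100 = resistance(100.0)  # ~484 ohms

current = V_SUPPLY / (r40 + r100)   # same current through both bulbs in series

p40 = current ** 2 * r40    # actual power dissipated in the "40 W" bulb
p100 = current ** 2 * r100  # actual power dissipated in the "100 W" bulb

print(f"R_40  = {r40:.0f} ohm, R_100 = {r100:.0f} ohm")
print(f"I     = {current:.3f} A")
print(f"P_40  = {p40:.1f} W, P_100 = {p100:.1f} W")
print(f"P_40 : P_100 = {p40 / p100:.2f}  (i.e. 5 : 2)")
```

Running it prints roughly 20.4 W and 8.2 W, confirming the 5 : 2 ratio found above.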
