Physics, asked by banitasinha1984, 17 days ago

If a 220V – 100W bulb is connected to a 110V circuit, will the brightness of the light
produced by the bulb be the same or different? Explain.

Answers

Answered by trijalmuralidhar123

Answer:

A 100W, 220V-rated bulb run at 110V will operate with a filament temperature of around 2150 K instead of the normal 2800 K, and will consume around 35W. See Paul Grimshaw's Quora answer to "A light bulb is marked 120 V, 60W. The actual outlet voltage is 95V due to a heavy load. What is the actual power dissipated in the light bulb?" for the calculations involved.
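The ~35W figure can be sanity-checked with a short calculation. A minimal sketch, assuming tungsten's resistance scales roughly as T^1.2 with absolute temperature (a common textbook approximation, not stated in the answer above):

```python
# Sanity check of the ~35W figure quoted in the answer.
# ASSUMPTION: tungsten filament resistance varies roughly as T^1.2
# over this temperature range; this exponent is an approximation.

V_RATED, P_RATED = 220.0, 100.0   # nameplate voltage and power
V_ACTUAL = 110.0                  # actual supply voltage
T_NORMAL, T_COOL = 2800.0, 2150.0  # filament temperatures (K) from the answer

# Hot resistance at rated operation, from P = V^2 / R
r_rated = V_RATED**2 / P_RATED            # 484 ohms

# Naive estimate, pretending resistance stays the same at 110V
p_const = V_ACTUAL**2 / r_rated           # 25.0 W

# Better estimate: the cooler filament has lower resistance,
# so it draws more current and dissipates more power
r_cool = r_rated * (T_COOL / T_NORMAL) ** 1.2
p_cool = V_ACTUAL**2 / r_cool             # roughly 34 W, close to the quoted 35W

print(f"constant-resistance estimate: {p_const:.1f} W")
print(f"temperature-corrected estimate: {p_cool:.1f} W")
```

The constant-resistance estimate gives exactly a quarter of rated power (half the voltage, squared); accounting for the cooler, lower-resistance filament raises the result toward the quoted 35W.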

The much lower filament temperature means that the light produced is shifted much further toward the red end of the spectrum. This has two consequences. First, the resulting orange-red light is not very useful for normal lighting. Second, a much greater proportion of the bulb's output lies beyond red, in the invisible infrared region of the EM spectrum, and our eyes are much less sensitive to the red light it does manage to produce than to the usual yellowy-white light of a fully lit incandescent bulb. This means the bulb will be very much less efficient than normal, i.e. far fewer lumens per watt than a 35W bulb operating at normal filament temperature. So the effective brightness is going to be negligible, certainly nothing approaching a 25W bulb.
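The shift toward the red and infrared can be illustrated with Wien's displacement law, which gives the wavelength at which a hot body's thermal radiation peaks. A minimal sketch (the displacement constant is a standard physical value; the temperatures are the ones quoted above):

```python
# Wien's displacement law: peak wavelength = b / T.
# Visible light spans roughly 380-750 nm, so both peaks below
# are in the infrared; cooling pushes the peak deeper into it.

WIEN_B = 2.898e-3  # m*K, Wien's displacement constant

def peak_wavelength_nm(temp_k: float) -> float:
    """Wavelength (in nm) at which blackbody emission peaks."""
    return WIEN_B / temp_k * 1e9

print(f"2800 K: peak near {peak_wavelength_nm(2800):.0f} nm (near infrared)")
print(f"2150 K: peak near {peak_wavelength_nm(2150):.0f} nm (deeper infrared)")
```

Even at the normal 2800 K the emission peak sits in the near infrared; at 2150 K it moves roughly 300 nm deeper into the infrared, so an even smaller fraction of the radiated power falls in the visible band.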

Explanation:

So the bottom line is that all you get from the bulb is a dull red-orange glow, very inefficiently produced from about 35W of electricity, and the light is neither suitable nor bright enough to illuminate anything effectively.

On the bright side (pun intended), the result is quite a nice ornamental light if you use a clear bulb to display the glowing filament, and the low operating temperature means the filament will last many times as long as it would at 220V.
