Science, asked by airtravel9823, 11 months ago

If supply frequency to the transformer is increased, the iron losses

Answers

Answered by No1Brandedkamina

Answer:

As the frequency increases, nothing will happen at first. The turns ratio will remain constant and the transformer will work fine up to its rating.

As the frequency departs significantly from the design frequency, losses will increase faster than basic transformer behaviour (which favours higher frequency) would suggest. The transformer will run hotter under load, and the voltage drop will grow. It may even run hot under simple excitation (no load), and under load the voltage will drop faster than expected.

Beyond this point the load-carrying capability will be compromised, although the transformer will still follow the no-load turns ratio as always. Under load, the voltage will drop off considerably, and the transformer will run hot at all practical loads (as limited by temperature).

Why does all this happen? Others have explained the governing equations.

Transformers favour higher frequency. They do not work at all on DC, and if you bump up the frequency a little, they work even better.
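The loss trend can be sketched with the standard Steinmetz relations: hysteresis loss scales as f·B^n and eddy-current loss as f²·B², while at constant supply voltage the peak flux density B is proportional to V/f (from the EMF equation V = 4.44 f N A B). The coefficients k_h, k_e and the exponent n below are illustrative placeholders, not measured values for any real core:

```python
def iron_losses(f, V, k_h=1.0, k_e=1.0, n=1.6):
    """Return (hysteresis, eddy) iron-loss components at frequency f,
    supply voltage V, for illustrative Steinmetz coefficients.

    With B proportional to V/f (constant-voltage operation):
      hysteresis: P_h = k_h * f * B**n  ~ f**(1 - n) -> falls as f rises
      eddy:       P_e = k_e * f**2 * B**2 ~ constant in f
    """
    B = V / f                  # peak flux density ~ V/f (arbitrary units)
    P_h = k_h * f * B**n       # Steinmetz hysteresis loss
    P_e = k_e * f**2 * B**2    # classical eddy-current loss
    return P_h, P_e

if __name__ == "__main__":
    V = 230.0
    for f in (50.0, 60.0, 100.0):
        P_h, P_e = iron_losses(f, V)
        print(f"f = {f:5.1f} Hz   P_h = {P_h:12.2f}   P_e = {P_e:12.2f}")
```

Running this shows why a modest frequency increase at constant voltage is benign for the core: the hysteresis term falls and the eddy term stays flat. The extra heating at much higher frequencies comes from effects this first-order model ignores (winding skin/proximity effects and growing leakage-reactance drop).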
