The efficiency of a transformer is maximum when the no-load (iron) loss is equal to the load (copper) loss.
The "all-day" efficiency of a transformer is defined as the ratio of energy output to energy input over a full 24-hour load cycle.
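As a rough sketch, all-day efficiency can be computed from a load cycle. The rating, power factor, loss figures, and cycle below are all assumed, hypothetical numbers, not values from any particular transformer:

```python
# All-day efficiency: energy delivered / energy drawn over a 24-hour cycle.
# All figures below are hypothetical, for illustration only.
rated_kva = 100.0
pf = 0.8                 # assumed load power factor
iron_loss_kw = 1.0       # constant core loss, assumed
full_load_cu_kw = 1.5    # copper loss at full load, assumed

# (load fraction, hours) pairs covering 24 hours
cycle = [(1.0, 6), (0.5, 10), (0.25, 8)]

energy_out = sum(frac * rated_kva * pf * hours for frac, hours in cycle)
# Copper loss scales with the square of the load fraction; iron loss is constant.
energy_loss = sum((iron_loss_kw + full_load_cu_kw * frac**2) * hours
                  for frac, hours in cycle)
all_day_eff = energy_out / (energy_out + energy_loss)
print(f"All-day efficiency: {all_day_eff:.3f}")
```

Note that the constant iron loss runs up for all 24 hours, which is why a transformer that sits lightly loaded much of the day can have an all-day efficiency noticeably below its peak efficiency.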
It depends on the load. A good transformer is over 90% efficient (some as high as 99%), so the power it draws is the power delivered to the load plus a small amount lost in the transformer itself.
100 MVA
The short-circuit test and the open-circuit test are widely used to determine a transformer's losses and hence its efficiency.
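As a minimal sketch of how the two test readings combine into an efficiency figure (the wattmeter readings, rating, and power factor below are assumed, hypothetical values):

```python
# Open-circuit test (rated voltage, secondary open): the wattmeter reading
# approximates the constant iron (core) loss.
oc_watts = 1000.0   # assumed reading, W

# Short-circuit test (rated current, reduced voltage): the wattmeter reading
# approximates the full-load copper loss.
sc_watts = 1500.0   # assumed reading, W

rated_va = 100_000.0  # assumed 100 kVA transformer
pf = 0.8              # assumed load power factor

out = rated_va * pf
full_load_eff = out / (out + oc_watts + sc_watts)
print(f"Full-load efficiency at pf {pf}: {full_load_eff:.3%}")
```

The key point is that the tests measure the two loss components separately, so efficiency at any load and power factor can then be predicted without actually loading the transformer.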
A variable transformer can change its output voltage from zero to maximum, or over a specific range. It is also called an adjustable transformer.
The maximum-efficiency condition in a distribution transformer is said to occur when iron loss equals copper loss. That is, maximum efficiency occurs when the copper losses are equal to the core losses; under that condition the transformer operates at its highest efficiency.
It is always desirable to run any equipment or device at maximum efficiency, not only the power transformer. A power transformer's maximum efficiency occurs when copper loss equals iron loss (or, equivalently, when load loss equals no-load loss). This does not necessarily mean that maximum efficiency occurs at full load; generally it occurs at somewhat less than full load.
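Because copper loss grows with the square of the load while iron loss is constant, the load fraction at which they are equal (and efficiency peaks) is sqrt(iron loss / full-load copper loss). A small sketch with assumed, hypothetical loss figures:

```python
import math

# Hypothetical loss figures for illustration only.
iron_loss_kw = 1.0       # constant core loss
full_load_cu_kw = 1.8    # copper loss at full load
rated_kva, pf = 100.0, 0.8

def efficiency(x):
    """Efficiency at load fraction x; copper loss scales with x**2."""
    out = x * rated_kva * pf
    return out / (out + iron_loss_kw + full_load_cu_kw * x**2)

# Peak efficiency occurs where copper loss equals iron loss:
x_max = math.sqrt(iron_loss_kw / full_load_cu_kw)
print(f"Peak efficiency at {x_max:.0%} load: {efficiency(x_max):.4f}")
print(f"Efficiency at full load:        {efficiency(1.0):.4f}")
```

With these figures the peak lands at roughly three-quarters of full load, illustrating the point above that maximum efficiency is generally reached below full load.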
The efficiency of a simple transformer is limited by resistive losses in the windings and by hysteresis (magnetic) losses in the core. You can reduce the resistive loss by using superconductors at very low temperatures, but that is not practical for most situations. (Yet!)
nope
There are several losses in a transformer that prevent it from reaching 100% efficiency. One is core loss, which can be divided into hysteresis losses, eddy-current losses, and magnetostriction losses. For more details, see http://en.wikipedia.org/wiki/Transformer#Energy_losses
Its efficiency will decrease.
Maximum efficiency of a power transformer occurs when copper loss equals iron loss. A decrease in current does not increase efficiency unless the copper loss was greater than the iron loss, in which case the reduced current lowers the copper loss until it equals the iron loss at some point.