Cooling will affect load losses, since the resistance of copper increases as its temperature increases. Core losses are a characteristic of the transformer, so there is nothing you can do about them after it has been made. Ensuring you aren't overexciting the transformer (applying excessive voltage) may help if you periodically experience high voltages.
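To put a number on the cooling point above, here is a minimal sketch using the standard linear approximation for copper's resistance, with a temperature coefficient of 0.00393 per degree C referenced to 20 C; the winding resistance and current are made-up illustrative values.

```python
# Sketch: copper winding resistance rises with temperature, so the I^2*R
# load losses rise too. Uses the linear approximation for copper with
# alpha = 0.00393 / degC, referenced to 20 C. Figures are illustrative.

ALPHA_CU = 0.00393  # temperature coefficient of copper, 1/degC

def winding_resistance(r20, temp_c):
    """Resistance at temp_c, given resistance r20 measured at 20 C."""
    return r20 * (1 + ALPHA_CU * (temp_c - 20.0))

def load_loss(current, r20, temp_c):
    """I^2 * R copper loss at the given winding temperature."""
    return current ** 2 * winding_resistance(r20, temp_c)

# A hypothetical winding of 0.05 ohm (at 20 C) carrying 100 A:
cool = load_loss(100.0, 0.05, 20.0)  # 500 W
hot = load_loss(100.0, 0.05, 75.0)   # ~608 W; cooling saves the difference
```

Running the winding 55 degrees hotter costs roughly 20% more copper loss in this sketch, which is why better cooling directly reduces load losses.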
A: Remove any iron mounting hardware and replace it with brass, make sure the windings are closely wound, and choose the iron core carefully.
By decreasing power losses such as eddy-current and magnetic hysteresis losses.
No! A transformer changes voltage levels, not power levels. In fact, the output power of a transformer is a little lower than its input power, because its efficiency is less than 100%.
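The distinction above can be sketched in a few lines: voltage scales with the turns ratio, while output power can only be the input power reduced by the efficiency. The turns counts and the 97% efficiency below are illustrative assumptions, not data for any real unit.

```python
# Sketch: an ideal transformer scales voltage by the turns ratio, while a
# real one delivers slightly less power than it takes in. All figures here
# are illustrative assumptions.

def secondary_voltage(v_primary, n_primary, n_secondary):
    """Ideal voltage transformation: V_s = V_p * (N_s / N_p)."""
    return v_primary * n_secondary / n_primary

def output_power(input_power, efficiency):
    """Real output power is input power scaled by an efficiency < 1."""
    return input_power * efficiency

v_s = secondary_voltage(230.0, 1000, 100)  # 23.0 V: voltage stepped down 10:1
p_out = output_power(1000.0, 0.97)         # ~970 W: power only ever shrinks
```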
There are several losses in a transformer that prevent it from attaining 100% efficiency. One is core loss, which can be divided into hysteresis losses, eddy-current losses, and magnetostriction losses. See http://en.wikipedia.org/wiki/Transformer#Energy_losses for more details.
Its efficiency will decrease.
Eddy currents raise the temperature of a transformer's core above ambient, and that energy is lost as heat, thus reducing the transformer's efficiency.
67%
nope
The "all day" efficiency of a transformer is defined as the ratio of energy out/energy in for a given all day cycle.
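The definition above can be sketched numerically: sum the energy delivered over 24 hours and divide by the energy drawn, remembering that core loss runs around the clock while copper loss scales with the square of the load. The load profile and loss figures below are made-up illustrative values, not real transformer data.

```python
# Sketch of all-day efficiency: energy out / energy in over a 24 h cycle,
# assuming a fixed core (iron) loss and copper loss scaling with load^2.
# The load profile and loss figures are illustrative assumptions.

def all_day_efficiency(load_profile_kw, core_loss_kw, cu_loss_full_kw, rated_kw):
    """Energy out / energy in; load_profile_kw has one entry per hour."""
    energy_out = sum(load_profile_kw)  # kWh delivered
    energy_in = 0.0
    for load in load_profile_kw:
        cu_loss = cu_loss_full_kw * (load / rated_kw) ** 2  # copper loss ~ load^2
        energy_in += load + core_loss_kw + cu_loss          # core loss runs 24 h
    return energy_out / energy_in

# Hypothetical daily cycle for a 100 kW unit: 8 h heavy, 8 h medium, 8 h light.
profile = [80.0] * 8 + [40.0] * 8 + [10.0] * 8
eta = all_day_efficiency(profile, core_loss_kw=1.0, cu_loss_full_kw=2.0,
                         rated_kw=100.0)  # ~0.966
```

Note that a lightly loaded transformer still pays its core loss every hour, which is why all-day efficiency can be noticeably lower than the full-load efficiency figure.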
The efficiency of a transformer is maximum when the no-load loss (core loss) equals the load loss (copper loss).
The maximum-efficiency condition in a distribution transformer is said to occur when iron loss = copper loss.
That is, maximum efficiency occurs when the copper losses equal the core losses of the transformer.
In certain power amplifiers (class A, class C, etc.) the output is sometimes not taken directly; rather, it is coupled through a transformer, and the transferred power is applied to the load. This is transformer coupling. There are two main reasons to do so: 1. to increase efficiency (for example, the maximum efficiency of a class A power amplifier roughly doubles, from 25% to 50%); 2. to get a smooth, i.e., less distorted output. by Jishnu Mukherjee (Kolkata, India)
The transformer will have the maximum efficiency.
Maximum efficiency of a power transformer occurs when copper loss equals iron loss. A decrease in current does not by itself increase efficiency; it only helps if the copper loss was greater than the iron loss, so that the reduced current brings the copper loss down until, at some point, it equals the iron loss.
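The condition stated above can be checked numerically: sweep the load fraction with a fixed iron loss and a copper loss proportional to the square of the load, and the efficiency peak lands where the two losses are equal. The loss figures below are illustrative assumptions, not data for a real unit.

```python
# Numeric check of the maximum-efficiency condition: sweep the load and
# confirm efficiency peaks where copper loss (~ load^2) equals the fixed
# iron loss. Loss figures are illustrative assumptions.

IRON_LOSS = 1.0     # kW, constant at all loads
CU_LOSS_FULL = 4.0  # kW at full load; scales with (load fraction)^2
RATED_KW = 100.0

def efficiency(load_fraction):
    p_out = RATED_KW * load_fraction
    cu_loss = CU_LOSS_FULL * load_fraction ** 2
    return p_out / (p_out + IRON_LOSS + cu_loss)

# The peak should sit at load_fraction = sqrt(IRON_LOSS / CU_LOSS_FULL) = 0.5,
# where copper loss (4 * 0.5^2 = 1 kW) equals the 1 kW iron loss.
fractions = [i / 100 for i in range(1, 101)]
best = max(fractions, key=efficiency)  # 0.5
```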