The electrical impedance of the windings would be so different running at 400 Hz instead of 50 Hz that the transformer's output voltage and current-carrying capacity would be very different from what it was originally designed to handle.
The only safe way to experiment with a transformer that was designed to operate at standard mains voltage and frequency would be in an electronics laboratory.
In a laboratory, a safe method of operation and the right equipment and test instruments could be used. The method would probably be to vary the transformer's:
* load, starting from a high resistance value,
* supply voltage, starting from zero, and
* supply frequency, starting from 50 Hz,
so that the resulting output voltage and current could be measured. The test results could then be compared with theoretical calculations of the likely effects of using a supply frequency of 400 Hz instead of 50 Hz.
The correct symbol for 'hertz' is 'Hz', not 'hz'.
No, it cannot. The reason is that the no-load current of a transformer is determined by the impedance of the primary winding, which is mainly inductive reactance. Inductive reactance is directly proportional to the supply frequency, so at 50 Hz the winding's reactance falls to one-eighth of its 400 Hz value and the no-load current rises roughly eight-fold. This will cause the primary winding to overheat and, possibly, its insulation to fail.
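The frequency scaling described above can be sketched numerically. This is an illustrative calculation only; the 10 H primary inductance and 115 V supply are arbitrary assumed figures, not values from the question.

```python
from math import pi

def inductive_reactance(frequency_hz, inductance_h):
    """X_L = 2 * pi * f * L, in ohms."""
    return 2 * pi * frequency_hz * inductance_h

L = 10.0  # assumed primary inductance, henries (illustrative)
x_400 = inductive_reactance(400, L)  # reactance at the 400 Hz design frequency
x_50 = inductive_reactance(50, L)    # reactance at a 50 Hz mains supply

print(x_400 / x_50)  # 8.0 -- reactance falls eight-fold at 50 Hz

# For a fixed supply voltage, no-load current is roughly V / X_L,
# so the current rises by the same factor of eight.
V = 115.0  # assumed supply voltage (illustrative)
print((V / x_50) / (V / x_400))  # 8.0
```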
Used in this way, the transformer will tend to overheat. But it is quite okay to operate a 50-Hz transformer at 60 Hz.
A transformer builds and collapses a magnetic field 60 times a second on North American household power, and 50 times a second in Europe. The building and collapsing of the magnetic field induces a voltage in the other transformer winding. DC creates only a steady magnetic field, which will not induce a voltage in the other winding. This is normally covered in grade 5 science where I live.
Watts are a measure of power. If the lights were mostly or totally switched off, you'd have a circuit generating 600 W of heat somewhere if the transformer still drew 600 W. Not only that, but when you switched the lights on, the 600 W the transformer was consuming would not disappear, so the total drain would be 1.2 kW. ---- I don't understand the above answer. The 600 watts on the transformer nameplate is the maximum wattage the transformer can deliver while staying within its safety limits. It doesn't draw that wattage all the time. If you had two 50-watt lamps connected to the transformer, then the transformer has 500 watts of capacity left. The transformer will only supply the wattage that the load demands. The transformer can supply twelve 50-watt bulbs: 12 x 50 = 600. Any more bulbs than 12 and the transformer is in an overload condition.
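The load-budget arithmetic above can be checked with a short sketch, using the 600 W nameplate rating and 50 W bulbs from the answer (and treating watts and VA as interchangeable, as the answer does):

```python
def remaining_capacity(rating_w, loads_w):
    """Capacity left after subtracting the connected loads, in watts."""
    return rating_w - sum(loads_w)

rating = 600  # nameplate rating from the answer, watts

# Two 50 W lamps connected leaves 500 W of capacity.
print(remaining_capacity(rating, [50, 50]))  # 500

# Maximum number of 50 W bulbs before overload.
bulb_w = 50
max_bulbs = rating // bulb_w
print(max_bulbs)  # 12 -- a 13th bulb would overload the transformer
```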
50 VA means about 50 watts. Transformers are usually rated in VA rather than watts because a transformer wastes very little power, and watts measure real power. A 50 VA transformer with a 120 V primary will draw about 0.42 A at 120 volts. On the secondary side, if it's 24 volts, it will support about 2.08 amps.
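The currents quoted above are just the VA rating divided by the winding voltage. A minimal sketch using the 50 VA, 120 V and 24 V figures from the answer:

```python
def full_load_current(rating_va, voltage_v):
    """Current at full rated load: I = S / V, in amperes."""
    return rating_va / voltage_v

print(round(full_load_current(50, 120), 2))  # 0.42 A on the 120 V primary
print(round(full_load_current(50, 24), 2))   # 2.08 A on the 24 V secondary
```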
50 amp
There is no inherent disadvantage of 50 Hz compared with 60 Hz, bearing in mind that systems that run at 50 Hz are designed to run at 50 Hz.
If your device uses 900 watts at 7.5 amps, then it requires 120 volts (900 / 7.5 = 120). If you want to use it where the supply voltage is 220 volts, then you'll need a transformer - but only if the device can operate on 50 Hz. Most places that use 220 volts supply it at 50 Hz. If your device says it can operate on 50 Hz, you can use a transformer.
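The voltage figure above follows from power divided by current. A minimal check, using the 900 W and 7.5 A figures from the answer:

```python
def required_voltage(power_w, current_a):
    """V = P / I for a simple load, in volts."""
    return power_w / current_a

v = required_voltage(900, 7.5)
print(v)  # 120.0 -- so a 220 V supply would need a step-down transformer
```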
Not unless you have a 110-volt supply to plug it into. The standard General Power Outlet in Australia is 240 volts AC at 50 hertz.
A microwave oven is always switched on, even when it isn't cooking anything. It needs to be in order to recognize your commands and to run its clock. This means that there is a power transformer, and if the transformer laminations are slightly loose, they will vibrate in response to the 60 Hz magnetic field. That means you will hear a 60 Hz or 120 Hz hum. If your power supply is some other frequency, like 50 Hz, you will hear a 50 Hz or 100 Hz hum. An expensive encapsulated transformer would fix this.
Wave trappers (line traps) are used to block the high-frequency carrier signals superimposed on transmission lines for power-line carrier communication, while letting the 50 Hz power pass through; they do not change the supply frequency itself. They are located in sub-stations or switch yards between the lightning arrester and the current transformer on the primary side, as shown below: power supply - lightning arrester - wave trapper - current transformer - ......- primary of the transformer
Your stereo system has an internal transformer in its power supply that is designed to operate at a given frequency. If it is designed to operate at your stereo system's load at 50 Hz, then yes. Otherwise this transformer will overheat when connected at the wrong frequency, and will burn out. You might be lucky and the power supply may be labelled to operate at 50 or 60 Hz; if not, I recommend contacting the manufacturer. A voltage transformer will convert 220 V at 50 Hz to 110 V at 50 Hz. To change the frequency, you need special equipment.
The current and voltage reverse twice during each cycle of the supply, that is, 100 times per second on a 50 Hz supply. AC is used because it allows power to be converted from one voltage to another easily and efficiently by means of a transformer.
Hi - it should be OK, but the transformer might overheat a little if it is used intensively, so it needs to be used carefully. The reason is that, for a given voltage, the magnetic flux in the core is about 20% greater at 50 Hz than at 60 Hz, and that increases the heat produced in the transformer. Going in reverse, a 50 Hz transformer will work fine on 60 Hz.
You can apply a lower-than-rated voltage to the primary winding of a transformer, and the secondary voltage will then fall in the same proportion. So, for example, for a step-down transformer in which a 230 V primary voltage produces, say, a 115 V secondary voltage, applying a 50 V primary voltage will produce a 25 V secondary voltage.
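The proportional scaling above is just the ideal-transformer turns-ratio relationship. A minimal sketch using the 230 V / 115 V example from the answer:

```python
def secondary_voltage(v_primary, turns_ratio):
    """Ideal transformer: V_secondary = V_primary / turns ratio."""
    return v_primary / turns_ratio

ratio = 230 / 115  # 2:1 step-down, as in the example
print(secondary_voltage(230, ratio))  # 115.0 at rated input
print(secondary_voltage(50, ratio))   # 25.0 at the reduced input
```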
The ratio would be a 50:1 current transformer.
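For a 50:1 current transformer, the secondary carries one-fiftieth of the primary current. A minimal sketch; the 250 A primary current here is an arbitrary illustrative figure, not a value from the question:

```python
def ct_secondary_current(primary_a, ratio=50):
    """Secondary current of an ideal current transformer, in amperes."""
    return primary_a / ratio

print(ct_secondary_current(250))  # 5.0 -- a common 5 A secondary
```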