The amp rating of a 100 VA transformer depends on the winding voltage in question. Transformers have both a primary and a secondary voltage.
100 VA / 12 V = 8.33 A on the 12-volt side.
50 amp
Divide the VA rating by the voltage to get the amp rating. So say you have a transformer that delivers 26 V at 80 VA: that means a maximum current of 80 / 26 = 3.08 A for a resistive load.
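That division can be written as a small helper. This is just an illustrative sketch; the function and parameter names are made up for this example.

```python
def max_current_amps(va_rating, voltage):
    """Maximum continuous current a winding can supply: VA / V.

    Applies to a resistive load; va_rating in volt-amperes,
    voltage in volts. Names are illustrative, not from any library.
    """
    return va_rating / voltage

# 80 VA transformer with a 26 V secondary:
print(round(max_current_amps(80, 26), 2))  # 3.08
# 100 VA transformer on its 12 V side:
print(round(max_current_amps(100, 12), 2))  # 8.33
```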
100MVA
Knowing the power rating of a transformer will help an operator use the transformer within its design limitations with regard to heating of the windings and their insulation.
No; drawing more than the rated amperage from a transformer will cause it to overheat.
No, because the current rating of a transformer is a maximum allowable current. If the computer still draws 3.42 amps it will be OK, provided the new transformer supplies the correct voltage.
Your transformer should have a nameplate on it that states how many amps, or fractions of an amp, it can produce. You would then multiply that number by your secondary voltage to get your VA rating. For example: secondary voltage 12 V × 0.05 A = 0.6 VA.
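The nameplate calculation above (volts × amps) can be sketched as follows; the function name is hypothetical, chosen just for this example.

```python
def va_rating(secondary_voltage, max_current):
    """VA rating from nameplate values: volts × amps."""
    return secondary_voltage * max_current

# 12 V secondary rated for 0.05 A:
print(round(va_rating(12, 0.05), 2))  # 0.6
```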
Depends on the kVA rating of the devices to be tested using the transformer.
If the load you are connecting to the transformer uses 1.5 amps or less, yes.
What limits the use of a transformer is its operating temperature, as excessively-high temperature will act to break down its insulation. The temperature reached by a transformer is a function of its rating (expressed in volt amperes), so operating a transformer below its rating is perfectly okay.
Is 120 V the primary or secondary voltage? If the primary (input) voltage is 120 V, then at full load the transformer will draw about 0.42 A from the line, and the current delivered to the load depends on the secondary (output) voltage. If the secondary (output) voltage is 120 V, then at full load the transformer will deliver about 0.42 A to the load, and the current drawn from the line depends on the primary (input) voltage. Amps = watts / volts. So, for instance, if your transformer has a 120 V primary and a 24 V secondary, as you might find in your AC unit or furnace: primary current (at full load) = 50 / 120 = 0.42 A; secondary current = 50 / 24 = 2.08 A.
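The two full-load currents above can be computed together, since both come from the same VA rating. This is a minimal sketch assuming an ideal transformer (losses ignored); the function and parameter names are invented for illustration.

```python
def winding_currents(va_rating, primary_v, secondary_v):
    """Full-load current on each winding of an ideal transformer.

    Current = power / voltage on each side; losses are ignored,
    so the same VA figure applies to both windings.
    """
    return va_rating / primary_v, va_rating / secondary_v

# 50 VA transformer, 120 V primary, 24 V secondary:
primary_a, secondary_a = winding_currents(50, 120, 24)
print(round(primary_a, 2), round(secondary_a, 2))  # 0.42 2.08
```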
A transformer can be used to change the voltage to an appliance. The voltage rating of the transformer should be right for the voltages used, and the current rating of the transformer should not be less than the current drawn by the equipment.