To answer this question, a voltage must be given.
The number of amps a transformer can carry on its secondary side depends on its power rating (in watts or VA) and the voltage of the secondary winding. You can calculate the current (in amps) using the formula: Amps = Watts / Volts. For example, if you have a 1000 VA transformer with a 10V secondary, it can carry 100 amps (1000 VA / 10V = 100A). Always ensure the transformer is rated for the desired load to avoid overheating or damage.
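Here is a minimal Python sketch of that single-phase calculation (the function name is just for illustration):

```python
# Full-load secondary current from a transformer's VA rating (single phase):
# I = VA / V.
def secondary_current(va_rating: float, secondary_volts: float) -> float:
    return va_rating / secondary_volts

print(secondary_current(1000, 10))  # 1000 VA / 10 V -> 100.0 A
```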
To calculate the amperage on the secondary side of a three-phase transformer, you can use the formula: Amps = (kVA × 1000) / (Volts × √3). For a 250 kVA transformer with a 220-volt secondary, the amperage works out to approximately 656 amps.
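A quick Python sketch of that three-phase version, assuming a balanced load and line-to-line voltage (the function name is illustrative):

```python
import math

# Three-phase full-load current: I = (kVA * 1000) / (V_line * sqrt(3)).
def three_phase_current(kva_rating: float, line_volts: float) -> float:
    return (kva_rating * 1000) / (line_volts * math.sqrt(3))

print(round(three_phase_current(250, 220), 1))  # 250 kVA at 220 V -> ~656.1 A
```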
To calculate the amperage drawn by a 240V 12VA transformer, use the formula: Amperage = Power (VA) / Voltage (V). In this case, it would be 12VA / 240V = 0.05A. Therefore, the transformer would draw 0.05 amps.
The purpose of a transformer is to transform one voltage to another voltage. This can be configured as stepping the voltage up or stepping it down. The load is what establishes what current the transformer will deliver.
A transformer does not use current itself; it transforms voltage from one value to another. The output amperage is governed by the connected load. If the load wattage is higher than the wattage rating of the transformer, then either the primary or secondary fuse will blow, or the transformer will burn up if the fuses are sized incorrectly. The maximum primary amperage can be found with the equation Amps = Watts/Volts: A = W/E = 600/120 = 5 amps. The same equation is used to calculate the maximum secondary amperage: A = W/E = 600/12 = 50 amps.
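A short Python sketch of that 600 VA example, using the 120 V primary and 12 V secondary from the answer above:

```python
# 600 VA transformer, 120 V primary, 12 V secondary (values from the answer above).
VA_RATING = 600
PRIMARY_VOLTS = 120
SECONDARY_VOLTS = 12

max_primary_amps = VA_RATING / PRIMARY_VOLTS      # 600 / 120 = 5 A
max_secondary_amps = VA_RATING / SECONDARY_VOLTS  # 600 / 12 = 50 A

print(max_primary_amps, max_secondary_amps)  # 5.0 50.0
```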
To determine the amps for a 500 kVA transformer, you can use the formula: Amps = (kVA × 1000) / Volts. For example, at a standard voltage of 480 V, the calculation would be 500,000 VA / 480 V, which equals approximately 1041.67 amps. The specific current will vary based on the voltage level used with the transformer.
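To show how the answer changes with voltage, here is a rough Python sketch using that single-phase form (the voltage list is just an example):

```python
# Full-load current for a 500 kVA transformer at a few example voltages,
# using the single-phase form I = kVA * 1000 / V.
KVA = 500
for volts in (240, 480, 600):
    amps = KVA * 1000 / volts
    print(f"{volts} V -> {amps:.2f} A")
# 480 V -> 1041.67 A, which matches the figure above.
```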
500 kVA is how many amps? Almost 650 amps, according to the formula.
This typically has to do with how many amps you can safely pull from the secondary of the transformer.
2.083 amps
The transformer itself does not pull current; whatever you connect to the transformer pulls current. Whatever the output voltage of the transformer is, divide it into the 600 VA rating and you get the maximum current possible without burning up the transformer. At 24 V that's 25 amps.
It depends on how many amps it was designed for. A 12.5 kV/600 V, 10 kVA three-phase transformer can handle ~0.5 A on the primary and ~10 A on the secondary. A 600/120 V, 10 kVA three-phase transformer can handle ~10 A on the primary and ~50 A on the secondary.
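For those two three-phase examples, a brief Python sketch applying the same √3 formula to each winding:

```python
import math

# Full-load amps on one winding of a three-phase transformer:
# I = (kVA * 1000) / (line_volts * sqrt(3))
def full_load_amps(kva: float, line_volts: float) -> float:
    return kva * 1000 / (line_volts * math.sqrt(3))

# 12.5 kV / 600 V, 10 kVA transformer
print(round(full_load_amps(10, 12500), 2))  # primary:   ~0.46 A
print(round(full_load_amps(10, 600), 1))    # secondary: ~9.6 A

# 600 V / 120 V, 10 kVA transformer
print(round(full_load_amps(10, 600), 1))    # primary:   ~9.6 A
print(round(full_load_amps(10, 120), 1))    # secondary: ~48.1 A
```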
It depends on the rated voltage of its secondary.
The formula you are looking for is I = W/E. Amps = Watts/Volts.
I = (kVA × 1000) / (V × 1.732) = (500 × 1000) / (415 × 1.732) = 500,000 / 718.78 ≈ 695.62 amps. So the maximum load of a 500 kVA DG is about 695.62 amps.
Rephrase your question, as it doesn't make any sense. If the primary side of the transformer is 480 volts three-phase, this transformer can be supplied from a breaker as big as 180 amps. If 480 volts three-phase is your secondary, then you can supply up to 180 amps to your loads.