This typically has to do with how many amps you can safely pull from the secondary of the transformer.
It depends on how many amps it was designed for. A 12.5 kV/600 V, 10 kVA three-phase transformer can handle ~0.5 A on the primary and ~10 A on the secondary. A 600/120 V, 10 kVA three-phase transformer can handle ~10 A on the primary and ~50 A on the secondary.
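Those figures follow from the three-phase relation I = kVA × 1000 / (√3 × V). A quick Python sketch to check them (the function name is my own, and it uses √3 rather than the 1.73 shorthand):

```python
import math

def three_phase_current(kva, volts):
    """Line current (A) for a three-phase transformer: I = S / (sqrt(3) * V)."""
    return kva * 1000 / (math.sqrt(3) * volts)

# 12.5 kV / 600 V, 10 kVA unit
print(round(three_phase_current(10, 12500), 2))  # primary: 0.46 A, i.e. ~0.5 A
print(round(three_phase_current(10, 600), 1))    # secondary: 9.6 A, i.e. ~10 A

# 600 V / 120 V, 10 kVA unit
print(round(three_phase_current(10, 600), 1))    # primary: 9.6 A, i.e. ~10 A
print(round(three_phase_current(10, 120), 1))    # secondary: 48.1 A, i.e. ~50 A
```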
The formula you are looking for is I = W/E. Amps = Watts/Volts.
There are actually three pieces to this puzzle: power, voltage, and current. And you're missing one of them.
For three phase the calculation is 30,000 = 1.73 × V × I; for single phase it is 30,000 = V × I. It is important to note that the voltage in the three-phase formula is line-to-line (typically how it is specified in three-phase power systems), while in the single-phase formula it is line-to-neutral.

A 30 kVA transformer is the same as 30,000 VA. To find the amps, you divide by the voltage. If the transformer is single phase, for example: 30,000 VA / 480 V = 62.5 A. For a three-phase transformer the calculation is VA / voltage / 1.73, for example: 30,000 VA / 480 V / 1.73 = 36.12 A.
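The worked example above can be sketched in Python; note this uses √3 ≈ 1.732 rather than the 1.73 shorthand, so the three-phase result comes out a shade lower:

```python
import math

def single_phase_amps(kva, volts):
    """Single-phase current: I = VA / V."""
    return kva * 1000 / volts

def three_phase_amps(kva, volts_ll):
    """Three-phase line current: I = VA / (sqrt(3) * V_line-to-line)."""
    return kva * 1000 / (math.sqrt(3) * volts_ll)

print(round(single_phase_amps(30, 480), 1))  # 62.5 A
print(round(three_phase_amps(30, 480), 1))   # 36.1 A
```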
A 100 kVA transformer is rated for... 100 kVA. That is its power rating, and it is based on the current flowing through the transformer (the I²R losses are the limiting factor). This can be 80 kW and 60 kVAR, or 100 kW and 0 kVAR, or 100 kVAR, or anywhere in between.

Another answer: The 'power' rating of a transformer is the product of its secondary voltage and its secondary current, expressed in volt amperes or multiples thereof. It's not expressed in watts because, to know the 'true power' of the transformer, the manufacturer would need to know the power factor of the load, and that could vary considerably. Incidentally, the symbol for kilovolt ampere is 'kV.A', not 'kVa'.
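The kW/kVAR combinations quoted above all satisfy S = √(P² + Q²) = 100 kVA, which a short Python check confirms (the helper name is my own):

```python
import math

def apparent_power_kva(kw, kvar):
    """Apparent power: S = sqrt(P^2 + Q^2)."""
    return math.hypot(kw, kvar)

print(apparent_power_kva(80, 60))   # 100.0 kVA
print(apparent_power_kva(100, 0))   # 100.0 kVA
print(apparent_power_kva(0, 100))   # 100.0 kVA
```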
2.083 amps
To answer this question a voltage must be given.
The transformer itself does not pull current; whatever you connect to it does. Whatever the output voltage of the transformer is, divide it into 600 (the VA rating) and you get the maximum current possible without burning up the transformer. At 24 V, that's 25 amps.
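A minimal sketch of that arithmetic, assuming the 600 VA rating from the answer (the function name is my own):

```python
def max_secondary_amps(va_rating, secondary_volts):
    """Maximum continuous secondary current: I = VA / V."""
    return va_rating / secondary_volts

print(max_secondary_amps(600, 24))  # 25.0 A
```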
It depends on the rated voltage of its secondary.
Rephrase your question, as it doesn't make sense as written. If the primary side of the transformer is 480 volts three phase, this transformer can be supplied from a breaker as big as 180 amps. If 480 volts three phase is your secondary, then you can supply up to 180 amps to your loads.
The primary current of a transformer depends upon the secondary current which, in turn, depends upon the load supplied by the transformer. There is not enough information in the question to determine the rated primary and secondary currents of the transformer.
Take the kVA and divide it by the voltage in kV: 25 / 0.230 ≈ 109 amps. The transformer can put out up to 50% more than it's rated for short durations, so you could get around 163 amps out of a 25 kVA transformer in a worst-case situation.