1 kVA = 1000 VA. Therefore 1000 VA ÷ 220 V ≈ 4.54 A.
A transformer does not 'use' current; it transforms voltage from one value to another. The output amperage is governed by the connected load. If the load wattage is higher than the wattage rating of the transformer, then either the primary or secondary fuse will blow, or the transformer will burn up if the fuses are wrongly sized. The maximum primary amperage can be found with the equation Amps = Watts/Volts: A = W/E = 600/120 = 5 amps. The same equation gives the maximum secondary amperage: A = W/E = 600/12 = 50 amps.
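A minimal sketch of that arithmetic in Python, assuming the 600 W, 120 V / 12 V figures from the answer above (the names are illustrative, not from any standard library):

```python
def max_current(watts, volts):
    """Maximum current in amps from the power rating and voltage: I = W / E."""
    return watts / volts

rating_w = 600    # transformer power rating in watts
primary_v = 120   # primary voltage
secondary_v = 12  # secondary voltage

print(max_current(rating_w, primary_v))    # 5.0 A maximum on the primary
print(max_current(rating_w, secondary_v))  # 50.0 A maximum on the secondary
```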
To answer this question, a voltage must be given.
The 220-volt secondary of a single-phase transformer rated 2 kVA should be able to deliver about 2000/220, or 9.1 amperes, assuming unity power factor.
70 amps.
It depends on the rated voltage of its secondary.
This typically has to do with how many amps you can safely pull from the secondary of the transformer.
It depends on how many amps it was designed for. A 12.5 kV/600 V, 10 kVA three-phase transformer can handle about 0.5 A on the primary and about 10 A on the secondary. A 600 V/120 V, 10 kVA three-phase transformer can handle about 10 A on the primary and about 50 A on the secondary.
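Those three-phase figures follow from I = S / (√3 × V). A quick sketch, using the values from the answer above:

```python
import math

def three_phase_current(kva, volts):
    """Line current in amps of a three-phase transformer: I = S / (sqrt(3) * V)."""
    return kva * 1000 / (math.sqrt(3) * volts)

# 12.5 kV / 600 V, 10 kVA transformer
print(round(three_phase_current(10, 12_500), 2))  # ~0.46 A on the primary
print(round(three_phase_current(10, 600), 1))     # ~9.6 A on the secondary

# 600 V / 120 V, 10 kVA transformer
print(round(three_phase_current(10, 600), 1))     # ~9.6 A on the primary
print(round(three_phase_current(10, 120), 1))     # ~48.1 A on the secondary
```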
The primary current of a transformer depends upon the secondary current which, in turn, depends upon the load supplied by the transformer. There is not enough information in the question to determine the rated primary and secondary currents of the transformer.
Rephrase your question, as it doesn't make sense as written. If the primary side of the transformer is 480 volts three-phase, the transformer can be supplied from a breaker as large as 180 amps. If 480 volts three-phase is your secondary, then you can supply up to 180 amps to your loads.
This 480-V three-phase transformer probably has a 208-V three-phase secondary, which has 120 V from each line to neutral. In that case the primary current is 0.433 times the secondary current, so 100 amps in the secondary means 43.3 amps in the primary.
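The same result follows from reflecting the secondary current through the voltage ratio; a one-line check using the values from that answer:

```python
primary_v, secondary_v = 480, 208
secondary_a = 100

# Primary current = secondary current * (V_secondary / V_primary)
primary_a = secondary_a * secondary_v / primary_v
print(round(primary_a, 1))  # 43.3 A, since 208/480 ≈ 0.433
```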
You will need a 3:1 ratio transformer. An output current of 20 amps at a secondary voltage of 47 volts results in a transformer rated at 940 VA.
A transformer is a power source: it provides voltage to a device. Find the voltage rating on the device, say 24 V. Then 250/24 ≈ 10 A.
A 100 kVA transformer is rated for... 100 kVA. That is its power rating, and it is based on the current flowing through the transformer (the I²R losses are the limiting factor). This can be 80 kW and 60 kVAR, or 100 kW and 0 kVAR, or 100 kVAR, or anywhere in between.
Another answer: The 'power' rating of a transformer is the product of its secondary voltage and its secondary current, expressed in volt amperes or multiples thereof. It's not expressed in watts because, to know the 'true power' of the transformer, the manufacturer would need to know the power factor of the load, and that could vary considerably. Incidentally, the symbol for kilovolt ampere is 'kV.A', not 'kVa'.
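The kW/kVAR split described in the first answer is just the power triangle, S² = P² + Q². A short sketch assuming the 100 kVA rating from the answer:

```python
import math

S_KVA = 100  # transformer rating in kVA

def kvar_for(kw):
    """Reactive power in kVAR that fills out the rating: Q = sqrt(S^2 - P^2)."""
    return math.sqrt(S_KVA**2 - kw**2)

print(round(kvar_for(80), 1))   # 60.0 kVAR at 80 kW (0.8 power factor)
print(round(kvar_for(100), 1))  # 0.0 kVAR at 100 kW (unity power factor)
```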
Take the kVA and divide it by the voltage: 25 kVA ÷ 0.230 kV ≈ 109 amps. The transformer can put out up to 50% more than its rating for short durations, so you could get around 150 amps out of a 25 kVA transformer in a worst-case situation.
Your transformer should have a nameplate on it that states how many amps, or fractions of an amp, it can produce. You would then multiply that number by your secondary voltage to get your VA rating: secondary voltage 12 V × 0.05 A = 0.6 VA.