

Best Answer

VA, or volt-amperes, is a measure of the output of the transformer. VA is also called apparent power. Watts, on the other hand, are called true power. If the transformer feeds a resistive load, watts equal VA, because the voltage and current are in phase. If the transformer feeds a reactive load, such as a motor, the voltage and current are no longer in phase, and the true power (watts) is less than the apparent power.

Since the true power (watts) delivered can change depending on the load, it is not very useful as a transformer rating. The VA remains constant regardless of the load characteristics, and so is a much better indicator of transformer capacity.
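The relationship described above can be sketched in a few lines of Python (the 230 V / 10 A / 30-degree figures are illustrative, not from the answer):

```python
import math

def apparent_power(v_rms, i_rms):
    """Apparent power in volt-amperes: RMS voltage times RMS current."""
    return v_rms * i_rms

def true_power(v_rms, i_rms, phase_deg):
    """True power in watts: apparent power scaled by the power factor cos(phi)."""
    return v_rms * i_rms * math.cos(math.radians(phase_deg))

# Resistive load: voltage and current in phase, so watts equal VA.
print(apparent_power(230, 10))   # 2300 VA
print(true_power(230, 10, 0))    # 2300.0 W

# Motor-like load lagging by 30 degrees: same VA, fewer watts.
print(true_power(230, 10, 30))   # ~1992 W
```

Whatever the phase angle does to the watts, the VA figure, and hence the winding current, stays the same.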


Wiki User

13y ago
More answers

Wiki User

10y ago

The transformer has a maximum rated voltage, determined by the size of the magnetic core and the frequency, and, quite separately, a maximum rated current determined by the size of the copper wire in the coils. Multiply the two together to find the VA or kVA rating. The kilowatts are equal to the kVA times the power factor of the load that is connected to the transformer.
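That calculation is simple enough to sketch (the 240 V / 50 A rating below is a made-up example):

```python
def kva_rating(v_rated, i_rated):
    """Transformer rating in kVA: rated voltage times rated current."""
    return v_rated * i_rated / 1000

def kw_delivered(kva, power_factor):
    """True power drawn by the load: kVA scaled by the load's power factor."""
    return kva * power_factor

kva = kva_rating(240, 50)
print(kva)                     # 12.0 kVA
print(kw_delivered(kva, 0.8))  # 9.6 kW into a 0.8 power-factor load
```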


Wiki User

12y ago

I assume this question is asking why transformers are rated in VA as opposed to watts. The reason is that the transformer does not care about the phase-angle relationship between the voltage applied to it and the current flowing through it. All that matters is the magnitude of the current flow, which dictates the level of heating the transformer will experience.

Alternative Answer

A transformer's iron losses depend on the magnitude of the flux which, in turn, is proportional to voltage, while its copper losses depend on the winding currents. As both iron and copper losses contribute to the maximum operating temperature of the transformer, it follows that a transformer must be rated in terms of voltage and current. In alternating current systems, the product of voltage and current is apparent power, expressed in volt amperes.

As a transformer's secondary voltage is kept approximately constant, it is its 'volt ampere' rating that determines its maximum (secondary) load current.

Expressing a transformer's rating in watts (i.e. true power) would be completely meaningless because, with a highly-reactive load, it will be supplying practically zero watts while still having to supply its rated current.
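A quick numerical sketch of that last point (the 240 V / 50 A rating and the 89-degree phase angle are invented for illustration):

```python
import math

def watts(v_rms, i_rms, phase_deg):
    """True power for a given phase angle between voltage and current."""
    return v_rms * i_rms * math.cos(math.radians(phase_deg))

# A transformer supplying its full rated 240 V and 50 A...
print(240 * 50 / 1000)     # 12.0 kVA of apparent power, whatever the load

# ...delivers almost no true power into a nearly pure reactance,
# yet the windings carry rated current and heat up just the same.
print(watts(240, 50, 89))  # ~209 W
```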


Wiki User

10y ago

Because a transformer is rated separately for maximum voltage and maximum current. Multiply the two together to find the rated kVA.

Excessive current causes overheating in the copper windings because of the resistance of the wire, while excessive voltage causes overheating in the iron core because of eddy-current and hysteresis losses.


Wiki User

9y ago

Transformers have two types of losses: 1) copper loss, or variable loss, and 2) iron loss, also called core loss or constant loss. Copper loss depends on current; iron loss depends on voltage. So the transformer's rating is given in volt-amperes.

<<>>

As the answer says, there are separate limits on the voltage and the current a transformer can use, so you just multiply them together to get the volt-amp limit.

Also, this covers cases where a load with a poor power-factor is connected to the transformer.


Wiki User

11y ago

Transformers generally power AC loads, which have some reactance, so the load is expressed in VA (volt-amperes). Heaters and light bulbs are an exception, as they are nearly pure resistance, and their loads would be expressed in watts. Transformers are rated according to output voltage and current: for a given unit, the volt-ampere (VA) capacity is often specified, and this is simply the product of the two. A transformer with a 12 V output, capable of delivering 10 A, would be rated 120 VA (12 V × 10 A).

==============================

And now to try and answer the question, i.e., why the 'VA' specification of a transformer is more important than its 'watts' rating:

The current and voltage exert separate stresses on the transformer, and impose separate limits on it:

-- The voltage applied to the windings determines the magnetic flux in the core. When the core approaches saturation, the losses and heating in the core rise rapidly. The voltage also determines at what point there could be arcing between the windings, with a breakdown of the whole operation of the transformer. This is regardless of what the current might be.

-- The current in the windings determines the resistive (I²R) heating of the copper. This is regardless of what the voltage might be.

If the transformer is operating in a low-power-factor environment (a large phase angle between the voltage and current), then the power it's handling may be relatively very low ... deceptively so. Either the voltage or the current may be approaching the transformer's limit, but you wouldn't know it if you're only monitoring the watts. If you're monitoring the VA, then a high value of either voltage or current immediately shows up.


Q: Why is a transformer rated in kVA instead of kWh?
Continue Learning about Engineering

Why transformer rating is called kva not kw?

Transformers are rated in kVA because that is a more accurate way to express their capacity. kVA is apparent power, while kW is true power, and the ratio between them is the power factor. (kWh, by contrast, is a unit of energy, not power.) The power factor is a function of the load, not the transformer, so a poor power factor would make the kW look low while the current, and hence the kVA, could still overload the transformer.


How do we Calculate Kwh to Kvah?

kWh (kilowatt-hours) and kVAh (kilovolt-ampere-hours) are related by the power factor: kVAh = kWh ÷ power factor. So divide the energy in kWh by the average power factor of the load over the same period.


How do you convert kwh to amps?

There is no direct conversion from kWh to amps without knowing the voltage and the time. kVA stands for kilovolt-amperes: kilo = 1000, volts = the working voltage, amps = the maximum current draw at that voltage. E.g. 1000 VA = 100 V at 10 A, or 200 V at 5 A. If you had a transformer with a 480 V primary and needed a 240 V secondary with a maximum load of 5 A, the transformer rating should be 1.2 kVA (1200 VA). On the 480 V primary the current will be 2.5 A; on the 240 V secondary, 5 A. 480 × 2.5 = 1200 VA and 240 × 5 = 1200 VA, so the transformer size is 1200 VA, or 1.2 kVA.
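The arithmetic in that worked example can be checked with a short sketch (an ideal, lossless transformer is assumed):

```python
def va(voltage, current):
    """Apparent power in volt-amperes on one side of an ideal transformer."""
    return voltage * current

# 480 V primary, 240 V secondary, 5 A maximum secondary load:
secondary_va = va(240, 5)                # 1200 VA
primary_current = secondary_va / 480     # 2.5 A
print(secondary_va, primary_current)
print(va(480, primary_current))          # 1200.0 VA -- both sides agree
```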


How to convert kva-kwh?

To convert from kVA (kilovolt-amperes) to kWh (kilowatt-hours), first convert to kW (kilowatts) by multiplying by the power factor. The power factor is the cosine of the phase angle between voltage and current. Then multiply by the number of hours that the load runs.
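As a minimal sketch of that conversion (the 10 kVA / 0.9 power factor / 8 hour figures are illustrative):

```python
def kva_to_kwh(kva, power_factor, hours):
    """Energy in kWh: apparent power x power factor x running time."""
    return kva * power_factor * hours

print(kva_to_kwh(10, 0.9, 8))  # 72.0 kWh
```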


How do you calculate 3 phase kWh from amps measured on each phase?

The voltage and current will give the kVA, but the kW depends on the power factor of whatever load is connected to the supply. For (let's say) an 11 kV supply, the voltage from line to neutral is 11,000/sqrt(3), which is 6,351 V. The kVA on each phase is 6.351 times the current, and you add up the three kVA values to find the total. Multiply the resulting kW by the running time in hours to get kWh. At higher voltages, like 11 kV, the three line currents are usually very nearly equal.
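That per-phase sum can be sketched as follows (the slightly unbalanced currents are invented for illustration):

```python
import math

def three_phase_kva(line_kv, amps_per_phase):
    """Total kVA from line-to-line kV and the current on each of the three lines."""
    phase_kv = line_kv / math.sqrt(3)  # line-to-neutral voltage in kV
    return sum(phase_kv * i for i in amps_per_phase)

# 11 kV supply; currents of 100, 98 and 102 A on the three phases:
print(round(three_phase_kva(11, [100, 98, 102]), 1))  # ~1905.3 kVA
# kW would then be this total times the load's power factor.
```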

Related questions


What ratio ct to use for a 250 to 5 rated kwh meter?

The ratio would be a 50:1 current transformer.



Is there a difference between kWh and KWH?

There is no difference in the meaning of kWh or KWH. Both forms mean "kilowatt-hours," and the format acceptable to most in the technical community is "kWh."

Answer: The correct symbol for the kilowatt hour is kW·h.



Why you used different Current Transformer ratios?

CT sizing is selected based on the total connected load. If, for example, the main circuit breaker of a 2000 kVA transformer is rated 4000 A, then the best selection would be a 4000:5 CT -- this also matches the energy (kWh) meter, with 800 as the multiplier (4000 ÷ 5).



What is the fuel consumption of 5 kva generator?

If your generator's specific fuel consumption is 275 g/kWh, then multiply: 5 kVA × 0.8 (power factor) = 4 kW, and (4 × 275)/1000 = 1.1 kg of fuel per hour (roughly 1.3 litres, taking diesel at about 0.84 kg/L).
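That arithmetic, with the same 275 g/kWh and 0.8 power-factor assumptions, can be written out as:

```python
def fuel_kg_per_hour(kva, power_factor, grams_per_kwh):
    """Hourly fuel burn in kg from a generator's specific fuel consumption."""
    kw = kva * power_factor
    return kw * grams_per_kwh / 1000

print(fuel_kg_per_hour(5, 0.8, 275))  # 1.1 kg of fuel per hour
```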


Working principle of trivector meter?

Trivector meters are used to measure kVAh and also the kVA of maximum demand. A trivector meter has a kWh meter and a reactive kVAh meter in one case, with a special summator mounted between them.


Difference between kw and kwh?

kW is the unit of power; kWh is the unit of electrical energy. kW (or watts) defines the rating, or power, of electrical equipment, for example motors, heaters, etc. kWh measures how much energy has been consumed by the equipment: generally, operating a 1 kW rated piece of equipment for one hour consumes one kWh. Energy meters are the instruments used for this purpose.


What is fullform of kwh?

The full form of kWh is kilowatt-hour.


What is the btu value of 341744380 kwh divided by 508068160 kwh?

341744380 ÷ 508068160 = 0.6726. The kWh units cancel, so the result is a dimensionless ratio, not a BTU value.