Transformers are rated in kVA and MVA because the load, not the transformer, determines the power factor. Multiplying the volt amperes by the cosine of the phase angle (the power factor) gives the power in watts or kilowatts, but when a transformer is manufactured or installed, the load power factor is not yet known. Hence the rating cannot be stated in watts, kilowatts, or megawatts; it is simply stated in VA or kVA.
S.Dhanabal
Most a.c. machines, both transformers and generators, are rated in volt amperes. D.c. generators, on the other hand, are indeed rated in watts.
The reason for this is that, in a.c. circuits, the product of voltage and current is called 'apparent power' and is expressed in volt amperes. So the product of a transformer's or generator's rated voltage and rated current will be its rated apparent power in volt amperes.
To determine an a.c. machine's 'true power' in watts, it's necessary to know the type of load it will be supplying and, of course, the manufacturer has no way of knowing what this will be.
A 132 kV substation is normally called a grid substation. It would normally use two or more 132/33 kV transformers rated at 90 MVA, or two or more 132/11 kV transformers rated at 30 MVA.
By definition, MVA is the magnitude of the vector sum of MW and MVAR: MVA² = MW² + MVAR². For example, 30 MW and 40 MVAR give MVA² = 30² + 40² = 2500, so MVA = 50.
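As a quick check of the power-triangle relation above, a short Python sketch (the 30 MW / 40 MVAR figures are the worked example):

```python
import math

# Power triangle: apparent power (MVA) is the magnitude of the
# vector sum of real power (MW) and reactive power (MVAR).
mw = 30.0    # real power, MW
mvar = 40.0  # reactive power, MVAR

mva = math.sqrt(mw**2 + mvar**2)
print(mva)  # → 50.0
```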
You would have to know the power factor, normally designated PF. MVA × PF = MW. If the PF is unity, then MVA = MW. A PF of unity suggests the load is purely resistive, with neither capacitive nor inductive components in the load or source. Of course, it can also mean such components have been artificially balanced.
The power in a 15 MVA (15,000 kVA) transformer depends on the power factor. You did not specify the power factor, so I will assume a power factor of 0.92. Simply multiply MVA by PF and you get 13.8 MW.
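The conversion above can be sketched in a few lines of Python; the 0.92 power factor is the assumption stated in the answer:

```python
mva = 15.0   # transformer rating, MVA
pf = 0.92    # assumed load power factor

mw = mva * pf  # real power delivered at that power factor
print(round(mw, 2))  # → 13.8
```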
Transformers are rated in kVA or VA (volt amperes). They transform voltage from one value to another; because power is (ideally) conserved, the current changes in inverse proportion to the voltage. This is why transformers are rated in kVA, and smaller ones in VA.
Large transformers are filled with oil, which circulates through a radiator to get rid of excess heat. A 100 MVA transformer will typically dissipate about 1 MW of losses at full load and about 0.5 MW at no load.
There are two concerns here regarding loading on transformers of this size. First is the difference between MVA and MW. MW is just real power, in watts. MVA is total (apparent) power, which includes both real power (MW) and reactive power (MVAR). — http://en.allexperts.com/q/Electric-Power-Utilities-2405/operation-limit-oof-power.htm
MVA is the apparent power: MVA = (MW² + MVAr²)^(1/2)
MVA = √(MW² + MVAR²)
In electrical engineering it can mean millivolt amperes; if it's MVA, then it is megavolt amperes, such as in the ratings of large transformers.
MW (megawatts) is the cosine (in-phase) component of MVA (megavolt amperes): MW = MVA × cos φ. So one should know the power factor of the system for conversion from MVA to MW.
MW / MVA = power factor. Reactive power Q = I²X_L or E²/X_L, where X_L = reactance. Apparent power = √(MW² + MVAR²).
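A minimal Python sketch of those formulas; the current, voltage, and reactance values are assumed for illustration (chosen so that E = I × X_L, making the two reactive-power formulas agree):

```python
import math

x_l = 5.0    # inductive reactance X_L, ohms (assumed)
i = 100.0    # load current, A (assumed)
e = 500.0    # voltage across the reactance, V (assumed, = i * x_l)

q_from_current = i**2 * x_l   # Q = I^2 * X_L, in VAR
q_from_voltage = e**2 / x_l   # Q = E^2 / X_L, in VAR

mw, mvar = 3.0, 4.0           # example real and reactive power
apparent = math.sqrt(mw**2 + mvar**2)  # apparent power, MVA
print(q_from_current, q_from_voltage, apparent)  # → 50000.0 50000.0 5.0
```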
Transmission transformers are rated in mva. Distribution transformers are rated in kva. Power transformers are measured in va. There are, of course, exceptions, but this is the normal nomenclature.
Answer (for UK terminology):
In the electricity supply industry, the name 'power transformer' is used to describe those transformers used in the transmission system (400/275/132-kV levels), while 'distribution transformers' are those used in the distribution system (33 & 11-kV and 400/230-V levels). Power transformers and primary-distribution transformers are rated in megavolt amperes (MV.A), while secondary-distribution transformers are rated in kilovolt amperes (kV.A).
Note that 'mva', 'kva', and 'va' are incorrect symbols for 'megavolt ampere', 'kilovolt ampere', and 'volt ampere'. The correct symbols are shown in the above paragraph (except that the period, or full stop, should be placed above the line).
However, I suspect your question is really asking why transformers are rated in (mega)volt amperes rather than in (mega)watts. The answer is simply that the load a transformer can supply is determined by the product of the transformer's rated secondary voltage and rated secondary current, the product of which (for alternating current) is the volt ampere, not the watt.
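The closing point, that a volt-ampere rating is just rated voltage times rated current, can be sketched as follows; the 230 V / 100 A secondary figures are assumed for illustration:

```python
v_secondary = 230.0  # rated secondary voltage, V (assumed)
i_secondary = 100.0  # rated secondary current, A (assumed)

# Apparent power in volt amperes: the transformer's rating,
# independent of whatever power factor the eventual load has.
s = v_secondary * i_secondary
print(s / 1000)  # rating in kVA → 23.0
```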