Load factor and current are not directly related
In a.c. circuits, the watt is used to measure the true power of a load, and is determined by multiplying the supply voltage by the load current by the power factor of the load. The volt ampere is used to measure the apparent power of a load, and is determined by multiplying the supply voltage by the load current. So the relationship between the watt and the volt ampere depends on the power factor of the load. For example, a 100 VA load with a power factor of 0.8 (leading or lagging) will have a true power of 80 W.
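The arithmetic above can be sketched in a few lines of Python (the function names are just illustrative):

```python
def true_power_watts(volts, amps, power_factor):
    """True power in watts: W = V x I x PF."""
    return volts * amps * power_factor

def apparent_power_va(volts, amps):
    """Apparent power in volt amperes: VA = V x I."""
    return volts * amps

# A 100 VA load at 0.8 power factor has a true power of 80 W.
va = 100
pf = 0.8
print(va * pf)  # 80.0
```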
At 120 volts with a power factor of 1, a current of 1 amp corresponds to 120 watts.
Watts = Amps x Volts x Power Factor. Power factor varies from 0 to 1, with 1 being a pure resistive load like a light bulb; a motor would have a lower value. So if your load is resistive, just use 1 x 440.
Four of them
Yes, because you can divide it into many factors that equal the same number.
Watts = Amps x Voltage x Power Factor. Hence, to compute watts, you need to know the voltage and the power factor. If you have a pure resistive load like a light bulb, the power factor is 1 and can be ignored. If you are asking about residential power, the voltage is 120 V a.c., so the computation is trivial.
It depends on the power factor. For a typical power factor of 0.92, 75 kVA would be equal to 69 kW, which would be equal to about 92.5 horsepower (electric motor).
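A quick sketch of that conversion, assuming the common convention that 1 electrical horsepower is 746 W:

```python
kva = 75
pf = 0.92

kw = kva * pf            # true power: 75 kVA x 0.92 = 69.0 kW
hp = kw * 1000 / 746     # assumed: 1 HP (electric) = 746 W

print(kw)                # 69.0
print(round(hp, 1))      # 92.5
```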
130.5 miles; the conversion factor (kilometres per mile) is about 1.6.
At 80% load factor you can support 60 fixtures.
If the power factor is 1, i.e. a resistive load, then 1 kVA is 1 kW. If the power factor is less than 1, i.e. a reactive load, then multiply the power factor by the kVA to get kW. For example, if the power factor is 0.92 and the load is 1 kVA, then the true power is 0.92 kW.
2
Watts equals volts x current x power factor. The maximum value of power factor is 1, for a resistive load; for motors and other inductive devices the power factor is less than 1. Your maximum wattage is 10,000 watts, and it decreases as the power factor decreases.