I am assuming that you are talking about single phase. In 45 kVA, k = 1000, V = volts, A = amps, so 45 kVA is 45,000 volt-amperes. Input: 45,000 divided by 208 volts = 216 amps. Output: 45,000 divided by 120 volts = 375 amps. There are other losses in the transformer, but as a general rule of thumb this is the calculation you would use.
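The rule-of-thumb arithmetic above can be sketched as a short helper (a sketch in Python; the function name is my own, and, as the answer notes, transformer losses are ignored):

```python
def single_phase_amps(kva: float, volts: float) -> float:
    """Full-load current for a single-phase rating: I = (kVA * 1000) / V."""
    return kva * 1000 / volts

# 45 kVA transformer, 208 V in / 120 V out
print(round(single_phase_amps(45, 208), 1))  # 216.3 A
print(round(single_phase_amps(45, 120), 1))  # 375.0 A
```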
120 watts of power flows through a circuit carrying 1 amp at 120 volts.
100 amp-hours x 12 volts = 1200 watt-hours. 1200 watt-hours divided by 120 volts = 10 amp-hours at 120 volts. Answer is 10 amp-hours.
W = VI (so long as voltage and current are in phase, i.e. a non-inductive load)
There is some equipment that will operate on 208 volts even though it is rated for 240 volts.
Assuming this could be done with no conversion loss, a 20 watt load at 120 volts would draw about 1/6 of an amp. On the 12 volt side, though, the battery must supply 20 W / 12 V, or about 1.7 amps, so a 7 ampere-hour battery would run the load for roughly 7 / 1.7 = 4.2 hours. Equivalently, the battery stores 12 V x 7 Ah = 84 watt-hours, and 84 / 20 = 4.2 hours. In practice, a circuit that up-converts 12 volts DC to 120 volts AC has significant conversion losses, so the real runtime would be shorter.
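The runtime calculation can be sketched like this (the function name and the 85% inverter efficiency are my own illustrative assumptions):

```python
def runtime_hours(battery_volts: float, battery_ah: float,
                  load_watts: float, efficiency: float = 1.0) -> float:
    """Runtime = battery energy (Wh) * converter efficiency / load power (W)."""
    return battery_volts * battery_ah * efficiency / load_watts

# Ideal (lossless) conversion: 12 V * 7 Ah = 84 Wh of stored energy
print(round(runtime_hours(12, 7, 20), 2))        # 4.2 hours
# With an assumed ~85% efficient inverter, the runtime shrinks
print(round(runtime_hours(12, 7, 20, 0.85), 2))  # 3.57 hours
```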
The voltage 208 is a three-phase voltage. Single phase is classed as the voltage obtained from any two legs of the three-phase system: L1 to L2 = 208 volts, L2 to L3 = 208 volts, and L3 to L1 = 208 volts. To measure the load of the 208 volt device, just clamp an ammeter around one of the legs coming from the load. This will give you the amperage that the load draws.
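The 208 V figure comes from the line-to-neutral voltage of a standard 120/208 V wye system: line-to-line voltage is line-to-neutral voltage times the square root of 3 (a sketch, assuming such a wye service):

```python
import math

line_to_neutral = 120.0
line_to_line = line_to_neutral * math.sqrt(3)
print(round(line_to_line, 1))  # 207.8, i.e. the nominal "208 V"
```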
It's not that simple; it depends on the resistance of the load. The basic formula is Volts / Ohms = Amps. With a 60-ohm load, for example: at 30 volts you'd get 0.5 amps, at 60 volts 1 amp, and at 120 volts 2 amps.
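Those three figures all imply the same fixed resistance, 60 ohms (30 V / 0.5 A). A quick sketch of the I = V / R relationship:

```python
resistance_ohms = 60  # implied by 30 V -> 0.5 A in the answer above
for volts in (30, 60, 120):
    amps = volts / resistance_ohms
    print(f"{volts} V -> {amps} A")
# 30 V -> 0.5 A, 60 V -> 1.0 A, 120 V -> 2.0 A
```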
The formula to use is I = W/E. Assuming the single breaker is delivering 120 volts, the current drawn by a 2000-watt load is 2000 / 120 = 16.7 amps. According to the electrical code, a continuous load on a 20 amp breaker must be limited to 80% of the rating: 20 x 0.8 = 16 amps. So to answer the question: yes, a 20 amp breaker will support a 2000-watt load at 120 volts, but not as a continuous load (one lasting 3 hours or more), since 16.7 amps exceeds the 16 amp continuous limit.
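The 80% continuous-load rule can be sketched as follows (function names are my own; the 80% factor follows standard code practice for continuous loads):

```python
def load_amps(watts: float, volts: float) -> float:
    """I = W / E"""
    return watts / volts

def breaker_ok(breaker_amps: float, watts: float, volts: float,
               continuous: bool = False) -> bool:
    """Continuous loads are limited to 80% of the breaker rating."""
    limit = breaker_amps * 0.8 if continuous else breaker_amps
    return load_amps(watts, volts) <= limit

print(round(load_amps(2000, 120), 1))              # 16.7 A
print(breaker_ok(20, 2000, 120))                   # True  (non-continuous)
print(breaker_ok(20, 2000, 120, continuous=True))  # False (16.7 A > 16 A)
```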
If you had a simple series circuit of one hot wire (10 ohms), a load (100 ohms), and a neutral wire (10 ohms) at 120 volts, the total resistance would be 120 ohms. Dividing 120 volts by 120 ohms gives 1 amp. Current stays the same in a series circuit, so 1 amp flows through each part of the circuit. That 1 amp times 10 ohms drops 10 volts on each wire, leaving 120 - (10 + 10) = 100 volts for the load; 1 amp through the 100-ohm load confirms this.
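The series-circuit arithmetic can be checked with a short sketch (values taken from the example above):

```python
# Series circuit: hot wire, load, neutral wire
supply_volts = 120.0
resistances = {"hot wire": 10, "load": 100, "neutral wire": 10}

total_ohms = sum(resistances.values())  # 120 ohms in series
current = supply_volts / total_ohms     # 1.0 A through every element
for name, ohms in resistances.items():
    print(f"{name}: {current * ohms:.0f} V dropped")
# The drops (10 + 100 + 10) sum back to the 120 V supply
```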
Watts = Amps x Volts x Power Factor. The maximum value of PF is 1, for a resistive load. For a 1200-watt load at 120 volts with PF = 1, amps = 1200 / 120 = 10.
It depends on the voltage. A load of 32 amps at 120 volts will be 3.84 kW. A load of 32 amps at 240 volts will be 7.68 kW. For any other voltage, multiply the voltage by 0.032 to get kilowatts. All these calculations assume a resistive (non-reactive) load.
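The same arithmetic as a sketch (assumes a resistive load, as the answer states; the function name is my own):

```python
def kilowatts(amps: float, volts: float) -> float:
    """kW = V * I / 1000 for a resistive (PF = 1) load."""
    return volts * amps / 1000

print(round(kilowatts(32, 120), 2))  # 3.84
print(round(kilowatts(32, 240), 2))  # 7.68
```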
Watts = Volts x Amps x Power Factor. An incandescent light bulb is a resistive load, so PF = 1. For a 60-watt bulb at 120 volts: 60 / 120 = 1/2 amp. ANSWER: 1/2 amp.
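The 1/2 amp answer implies a 60-watt bulb; rearranging the formula gives amps = watts / (volts x PF) (a sketch; the 60 W figure is inferred from the stated answer):

```python
def amps(watts: float, volts: float, power_factor: float = 1.0) -> float:
    """I = W / (V * PF); PF = 1 for a resistive load such as an incandescent bulb."""
    return watts / (volts * power_factor)

print(amps(60, 120))  # 0.5 A
```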