A transformer does not consume power; it transforms voltage from one value to another. The output amperage is governed by the connected load. If the load wattage is higher than the wattage rating of the transformer, then either the primary or secondary fuse will blow, or the transformer will burn up if the fusing is incorrectly sized. The maximum primary amperage can be found with the equation Amps = Watts/Volts: A = W/E = 600/120 = 5 amps. The same equation is used to calculate the maximum secondary amperage: A = W/E = 600/12 = 50 amps.
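The calculation above can be sketched in Python. This is a minimal illustration, not an electrical tool; the 600 watt rating and the 120 V / 12 V windings are the figures quoted in the answer:

```python
def max_amps(watts, volts):
    """Maximum current for a given power rating: Amps = Watts / Volts."""
    return watts / volts

# A 600 VA transformer with a 120 V primary and a 12 V secondary:
primary_amps = max_amps(600, 120)    # 5.0 A maximum on the primary
secondary_amps = max_amps(600, 12)   # 50.0 A maximum on the secondary
print(primary_amps, secondary_amps)
```

Note that the same wattage appears on both sides; only the voltage (and therefore the current) differs between the windings.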
Amps * Volts = Watts
1 Amp * 12 Volts = 12 Watts.
5 amps.
This is determined by using the power formula, Watts = Volts * Amps (often taught alongside Ohm's law).
Amps = Watts / Volts
or
5 Amps = 600 Watts / 120 Volts
600 watts @ 12 volts is 50 amperes.
12 volts at 180 watts converted to amps:
5 amps
180 watts / 12 volts = 15 amps
Watts = Volts * Amps, so Amps = Watts / Volts; therefore 2000 / 220 = 9.09 amps.
To find amps when watts and volts are known, use the formula Watts / Volts = Amps: 5000 / 240 = 20.83 amps.
Different controllers have different outputs depending on how many valves are on each zone. Inside the device there is a control transformer. Look for the VA output on the transformer's secondary side; mine states 20 VA at 24 volts. To find the amperage, use the equation I = W/V, that is, Amps = Watts (or VA) / Volts. Mine can output 20 / 24 = 0.83 amps. This amperage is the maximum output the controller can produce to operate the zone valves. To find the current draw on the primary side of the transformer, divide the transformer's VA by 120 volts.
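As a quick sketch of both sides of that calculation, assuming the 20 VA / 24 V nameplate quoted above and a 120 V supply:

```python
VA = 20.0              # transformer rating from the nameplate in the answer
SECONDARY_VOLTS = 24.0 # valve-circuit voltage
PRIMARY_VOLTS = 120.0  # supply voltage

secondary_amps = VA / SECONDARY_VOLTS  # current available to the zone valves
primary_amps = VA / PRIMARY_VOLTS      # current drawn from the 120 V supply
print(round(secondary_amps, 2), round(primary_amps, 2))
```

The same VA figure is used for both sides; only the voltage changes between the windings.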
Only when the load is purely resistive.
It's the amps that are controlled by the breaker, not the volts. You can have a 600 volt 15 amp breaker, or a 347 volt 15 amp breaker. The breaker will trip when you exceed 15 amps.
2.083 amps
The formula you are looking for is I = W/E. Amps = Watts/Volts.
You get a transformer...
On a 1 kVA transformer you have 1000 watts of capacity. To find the current, the formula is I = W/E. The secondary side of the transformer has a capacity of 1000/120 = 8.33 amps. In your question, you do not put amps across the secondary; you draw amps from it. Using the transformer to its maximum without overloading it, the primary will carry 4.17 amps at 240 volts and the secondary 8.33 amps at 120 volts. Voltage times amps equals wattage.
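Working the 1 kVA example through in Python (illustrative only; the 240 V primary and 120 V secondary are the voltages from the answer above):

```python
kva = 1.0
va = kva * 1000      # 1 kVA = 1000 VA

primary_amps = va / 240    # full-load primary current at 240 V
secondary_amps = va / 120  # full-load secondary current at 120 V
print(round(primary_amps, 2), round(secondary_amps, 2))
```

Halving the voltage from primary to secondary doubles the current, which is the inverse relationship the answers here keep pointing to.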
Amps, volts and watts are interrelated, but you need to do a little math. Amps * Volts = Watts
I (Amps) = VA / E (Volts); I = 50 / 36; I = 1.39 A. Do the math!
It stands for 40 volt-amperes (Volts times amps) and is a measure of power. It is equivalent to watts for a resistive load.
Amps * Volts = Watts, so Amps * 12 = 600 and 600/12 = 50 amps. You would need some reserve capacity, so I'd go somewhere between a 60 and 100 amp rated transformer. Transformers are rated in volt-amps, which is usually calculated the same as watts, but the term "watts" technically does not apply to transformers. So you need a 600 volt-amp transformer or, as Redbeard has suggested, an 800 or 1000 volt-amp transformer. That's a lot of amps for a 12 volt system, so I recommend you double-check your requirements. You will need #2 gauge wire if your requirements are correct.
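The sizing logic above can be sketched as follows. The list of standard ratings and the "strictly greater than the load" reserve rule are assumptions for illustration, not a code requirement:

```python
def recommended_va(load_watts, standard_sizes=(600, 800, 1000)):
    """Pick the smallest standard transformer rating that leaves
    some reserve capacity above the load (hypothetical sizing rule)."""
    for size in standard_sizes:
        if size > load_watts:  # strictly greater, so there is headroom
            return size
    return None  # load exceeds every standard size considered

load = 50 * 12  # 50 amps at 12 volts = 600 watts
print(recommended_va(load))
```

With a 600 watt load, the 600 VA unit leaves no headroom, so the sketch returns the next size up, matching the 800 VA suggestion in the answer.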
Transformers are rated in kVA or VA (volt-amps). They transform voltages from one value to another. The current in a transformer is inversely proportional to the voltage. This is why transformers are rated in kVA, and smaller ones in VA.
Yes, this can be done. The adapter will handle 120 volts x 15 amps = 1800 watts. The adapter is just a step-up transformer. The maximum 220 volt device that can be connected can have an amperage rating of no more than about 8 amps at 220 volts (1800 / 220 = 8.18 amps).
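The adapter limit above works out like this in Python (a quick check of the arithmetic, using the 15 amp / 120 volt figures from the answer):

```python
adapter_watts = 120 * 15           # power the 15 A, 120 V side can pass: 1800 W
max_220v_amps = adapter_watts / 220  # current available on the 220 V side

print(adapter_watts, int(max_220v_amps))
```

Rounding the 8.18 amp result down to a whole 8 amps keeps the connected device safely inside the adapter's 1800 watt limit.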