The rating of 3 A and 220 V is the power rating. It's a bit odd that it isn't expressed in watts, but it's easy to work out. Power is the rate at which energy passes through the circuit. Without getting into the particulars, you can express power as P = VI, where P is power in watts, V is voltage in volts, and I is current in amps. So in your case, P = (220 V)(3 A) = 660 W. Holding power constant while doubling the voltage gives (220 V)(3 A) = (440 V)(I), where I is the current rating you're looking for. Solving, I = (660 W)/(440 V) = 1.5 A. So your answer is 1.5 A. All you really need to remember is that as long as power is constant, voltage and current are inversely related: if you double the volts, you halve the amps.
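The constant-power arithmetic above can be sketched in a few lines of Python (the helper name is my own, purely illustrative):

```python
def current_at_voltage(rated_volts, rated_amps, new_volts):
    """Current drawn at new_volts, assuming the same power (P = V * I)."""
    power = rated_volts * rated_amps  # watts
    return power / new_volts          # amps

# The example from the answer: 220 V at 3 A, re-rated for 440 V.
print(current_at_voltage(220, 3, 440))  # 1.5
```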
While it's true that a fan designed for 440 volts would use half the current of a 220-volt fan of the same power, you can't actually take a 220-volt fan and plug it into 440 volts. Doing so would definitely cause overheating, and possibly a blown fuse or even a fire.
The terminal strip's rating is 15 amps at 600 volts. Up to 600 volts, the voltage doesn't matter: the maximum current allowed on the strip is 15 amps. It could be 15 amps at 12 volts, 15 amps at 600 volts, or 15 amps at any voltage in between.
A #8 wire with an insulation rating of 75 or 90 degrees C is rated at 45 amps.
When you multiply amps by volts, the product is watts. The formula W = amps x volts should give you your answer.
Breakers and other electrical equipment carry a voltage rating so you know what voltage they can withstand; a breaker could fail and start a fire if you apply a higher voltage than it is rated for. Breakers actually carry three ratings: amps (current in excess of this will trip the breaker), interrupting rating (how many amps the breaker can handle during a short circuit), and volts (the difference of potential, the pressure pushing the current). The breaker I am looking at is rated 15 amps, 10,000 amps interrupting rating, and 120/240 volts.
15 amps 120 volts AC
A #3 copper wire with an insulation rating of 90 degrees C has the capacity to carry 105 amps. This is the most common or standard insulation that most calculations are based on. It is the insulation that governs the voltage rating: house wiring cables have insulation rated at 300 volts, most other wiring insulation is rated at 600 volts, and special wires have an insulation rating of 1000 volts. The higher the insulation temperature rating, the higher the current the wire can carry: #3 is rated 55 amps at 60 C, 65 amps at 75 C, 105 amps at 90 C, 120 amps at 110 C, 130 amps at 125 C, and 145 amps at 200 C.
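The temperature-to-ampacity figures quoted above for #3 copper can be captured as a simple lookup. The values below are taken straight from this answer; check your local code tables before relying on them:

```python
# Ampacity (amps) of #3 copper wire by insulation temperature rating (deg C),
# using the figures quoted in the answer above.
AMPACITY_3AWG_CU = {60: 55, 75: 65, 90: 105, 110: 120, 125: 130, 200: 145}

def ampacity_3awg(temp_rating_c):
    """Look up the ampacity for a given insulation temperature rating."""
    return AMPACITY_3AWG_CU[temp_rating_c]

print(ampacity_3awg(90))  # 105
```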
If the load is rated 220, 230, or 240 volts (all the same thing) and specifies a wattage rating, that wattage rating applies at the double-pole voltage, so you wouldn't make any adjustments to it. The load should also indicate the input current in amps, which is used for sizing your breaker. If not, watts = amps x volts, so amps = watts / volts. For example, a 3800-watt heater at 240 volts draws 15.83 amps. The breaker for this circuit would be 125 percent of that (a safety factor), or 20 amps, 2-pole.
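The breaker-sizing step described above can be sketched as follows. The list of standard breaker sizes is a partial, illustrative one, not from any code table:

```python
# A partial list of common breaker sizes (amps), for illustration only.
STANDARD_BREAKERS = [15, 20, 25, 30, 40, 50]

def breaker_size(watts, volts, safety_factor=1.25):
    """Smallest standard breaker at or above 125% of the load current."""
    required = (watts / volts) * safety_factor
    return next(b for b in STANDARD_BREAKERS if b >= required)

# The heater example from the answer: 3800 W at 240 V.
print(breaker_size(3800, 240))  # 20
```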
30 amps.
No. Your power supply must be able to supply rated voltage (12 volts) and rated current (3 amps).
15 amps 120 volts AC
Watts divided by volts = amps
The formula you are looking for is watts = amps x volts, so amps = watts / volts. This comes to a 4-amp load; the minimum fuse size would be 5 amps.