It would be essentially undefined: the filament of the halogen bulb would fail immediately, and then there would be an open circuit with no current draw.
The formula for current is Amps = Watts/Volts. The lamp itself would draw 4.16 amps. Since the lamp voltage is 12 volts, there is an internal transformer in the fixture itself. The input (primary) voltage to the transformer doesn't matter, so long as it meets the manufacturer's specification for the proper voltage to operate the fixture.
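The arithmetic above can be sketched in a few lines; the 50 W / 12 V figures are assumed here purely to reproduce the ~4.16 A result mentioned in the answer.

```python
def current_amps(watts: float, volts: float) -> float:
    """I = P / V: load current in amps from power and voltage."""
    return watts / volts

# Assumed example: a 50 W lamp on a 12 V transformer secondary.
print(round(current_amps(50, 12), 2))  # about 4.17 A
```

The same function covers the 120 V / 240 V cases quoted elsewhere on this page, since only the voltage argument changes.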
At 120 volts it will pull 4.166 amps. At 240 volts it will pull 2.08 amps.
The CT standard output is 5 amps at the rated input amps. The CT will have a marking like 400:5, 100:5, or similar, where the larger number is the primary current that causes 5 amps to flow in the CT secondary. Divide the primary rating by 5 to get the multiplier. For instance, for a 400:5 CT: 400 / 5 = 80. So if you measure, say, 3 amps from the CT secondary, the primary current is 3 * 80 = 240 A.
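That multiplier calculation is simple enough to capture directly; this is just the ratio arithmetic from the answer above, with the 5 A secondary as a default.

```python
def ct_primary_current(ct_primary_rating: float,
                       measured_secondary_amps: float,
                       ct_secondary_rating: float = 5.0) -> float:
    """Scale a CT secondary reading back up to the primary current."""
    multiplier = ct_primary_rating / ct_secondary_rating
    return measured_secondary_amps * multiplier

# Worked example from the answer: a 400:5 CT reading 3 A on the secondary.
print(ct_primary_current(400, 3))  # 240.0 A
```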
The formula you are looking for is Amps = Watts divided by Volts. Once you find the amperage, you can decide on the size of fuse to use. Remember that fusing protects the conductors of the circuit, not the load. A #14 wire is rated at 15 amps and can legally be loaded to only 12 amps.
If the 2 amps is the output amperage of the power supply, the maximum that should be drawn from the unit is 2 amps. The amperage of the load connected to the power supply should govern the amperage of the fuse used. There is not much range there: the fusing could go from 0.25 to 2 amps. If the 2 amps is the input amperage, then the input and output voltages of the power supply would need to be stated.
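One way to act on the advice above is to pick the smallest fuse at or above the measured load current, capped at the supply's 2 A output rating. The list of available ratings here is illustrative only; real fuse families vary by manufacturer.

```python
# Assumed set of small fuse ratings, in amps, for illustration.
STANDARD_FUSES = [0.25, 0.5, 1.0, 1.5, 2.0]

def pick_fuse(load_amps: float, supply_max_amps: float = 2.0) -> float:
    """Return the smallest standard fuse rating >= the load current."""
    if load_amps > supply_max_amps:
        raise ValueError("load exceeds the supply's rated output")
    for rating in STANDARD_FUSES:
        if rating >= load_amps:
            return rating
    raise ValueError("no standard fuse covers this load")

print(pick_fuse(0.8))  # 1.0
```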
6.25 Amps

Without knowing the efficiency of the motor, it's impossible to tell. The horsepower rating of a motor describes its output power; you need to know its input power in order to calculate its current.
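To illustrate why efficiency matters, here is the input-current calculation with assumed efficiency and power-factor values; real figures come from the motor's nameplate, not from this sketch.

```python
HP_TO_WATTS = 746.0  # mechanical output watts per horsepower

def motor_input_current(hp: float, volts: float,
                        efficiency: float, power_factor: float = 1.0) -> float:
    """Input current = (output watts / efficiency) / (volts * power factor)."""
    input_watts = hp * HP_TO_WATTS / efficiency
    return input_watts / (volts * power_factor)

# Assumed example: 2 hp motor, 230 V, 85% efficient, 0.9 power factor.
print(round(motor_input_current(2, 230, efficiency=0.85, power_factor=0.9), 2))
```

Note how a lower efficiency raises the input power and therefore the current, which is exactly why the horsepower rating alone is not enough.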
At 230 V it will use 5 to 6 amps.
4.16 Amps
3 amps
17 amps
Typically, single-phase motors go up to 10 hp; a larger one wouldn't be very efficient at about 100 amps. A 20 hp three-phase motor at 230 V pulls 52 amps, while a 10 hp single-phase motor at 230 V pulls 50 amps.
You divide the volts by the ohms (Ohm's law: I = V / R).
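As a minimal sketch of that division, with example values chosen arbitrarily:

```python
def current_from_resistance(volts: float, ohms: float) -> float:
    """Ohm's law: I = V / R, in amps."""
    return volts / ohms

# Assumed example: 12 V across a 4-ohm load.
print(current_from_resistance(12, 4))  # 3.0 A
```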
It depends on the voltage. If the voltage is 230 V, the answer would be 6.95 amps. Try the Watts2Amps app on the iPhone; that's what I use.
1.67 Amps
UK mains is 230 V, therefore 6 kW is 6000 / 230 = 26 amps. Three-phase is slightly different: 6000 / 400 V = 15 amps, and 15 / √3 = 8.66 amps per phase.