One horsepower is equal to about 0.7457 kW, so a 1.5 hp motor consumes roughly 1.1186 kW (about 1118.6 W).
The current draw depends on the voltage; simply use P = U x I, or I = P/U, to figure out the current draw.
Some voltages:
12 V DC draws about 93.2 amps
24 V DC draws about 46.6 amps
110 V DC draws about 10.17 amps
230 V DC draws about 4.86 amps
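The arithmetic above can be sketched in a few lines of Python; this is a simplified model (I = P/U) that ignores motor efficiency and power factor:

```python
# Sketch: current draw of a motor from its horsepower rating and supply
# voltage, using I = P / U. Efficiency and power factor are ignored.
HP_TO_WATTS = 745.7  # 1 mechanical horsepower in watts

def current_draw(hp, volts):
    """Return the current in amps drawn by a motor of `hp` horsepower at `volts`."""
    return hp * HP_TO_WATTS / volts

for v in (12, 24, 110, 230):
    print(f"{v:>3} V DC: {current_draw(1.5, v):.2f} A")
```

Running it reproduces the table above (93.2 A at 12 V down to about 4.86 A at 230 V).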
A 1 hp motor draws around 750 watts; just divide that by 12 volts to get amps: 750 / 12 = 62.5 amps.
Motors do not put out amperage. A motor draws amperage as a load on an electrical system. A generator is used to produce voltage and amperage governed by the load connected to the generator.
40 amps
To answer this question the voltage of the motor must be stated.
The wiring should allow for 115 amps.
A 15 amp circuit breaker should trip at 15 amps regardless of the load voltage or impedance. If you have 277 volts across 7 ohms, the current would be about 39.6 amps, and a 15 amp circuit breaker should trip.
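That check is just Ohm's law; here is a minimal sketch, assuming a purely resistive load (the function name is illustrative, not from any real library):

```python
# Sketch: does a purely resistive load exceed a breaker's rating?
# Ohm's law gives I = V / R; the breaker trips when I exceeds its rating.
def breaker_trips(volts, ohms, breaker_amps):
    current = volts / ohms
    return current, current > breaker_amps

current, tripped = breaker_trips(277, 7, 15)
print(f"{current:.1f} A -> trips 15 A breaker: {tripped}")
```

With 277 V across 7 ohms the current is about 39.6 A, well over 15 A, so the breaker trips.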
The formula you are looking for is I = kW x 1000 / (1.73 x E x pf). Here kW x 1000 = 1.5 x 1000 = 1500, and 1500 / (1.73 x 400 x 0.86) = 1500 / 595 = 2.5 amps. A standard motor's efficiency between 5 and 100 HP is 0.84 to 0.91. A standard motor's power factor between 10 and 100 HP is 0.86 to 0.92. A feeder for a motor has to be rated at 125% of the motor's full load amps: 2.5 x 125% = 3.1 amps. A #14 copper conductor with an insulation rating of 90 degrees C is rated at 15 amps.
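The three-phase formula and the 125% feeder rule above can be sketched as follows (a simplified calculation using the numbers quoted in that answer):

```python
import math

# Sketch: three-phase motor current, I = kW * 1000 / (sqrt(3) * E * pf),
# then feeder sized at 125% of the full-load amps (FLA).
def three_phase_amps(kw, volts, pf):
    return kw * 1000 / (math.sqrt(3) * volts * pf)

fla = three_phase_amps(1.5, 400, 0.86)   # ~2.5 A
feeder = fla * 1.25                       # ~3.1 A minimum feeder rating
print(f"FLA = {fla:.1f} A, feeder rating >= {feeder:.1f} A")
```

Note that sqrt(3) is approximately 1.73, matching the constant in the written formula.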
At 600 volts the rule of thumb is one amp per horsepower, so a 20 HP motor would need 20 amp wire. The code book states that a 20 HP motor at 575 volts draws 22 amps. The conductor for a motor has to be rated at 125% of the motor's FLA (full load amps): 22 x 125% = 27.5 amps. A #10 copper conductor with an insulation rating of 60, 75 or 90 degrees C is rated at 30 amps.
It's the amps that are controlled by the breaker not the volts. You can have a 600 volt 15 amp breaker, you can have a 347 volt 15 amp breaker. The breaker will trip when you exceed 15 AMPS.
I have a single-phase induction motor. It draws 8 amps on start-up and climbs to 14-15 amps when I put a load on it. When I don't have a load it runs at 1 amp and climbs to 2-3 amps. Is it normal for this motor to draw the higher current under load, or is something wrong?
15 Amps
15 amps
It all depends upon what is "on" at the time. A running vehicle with only the ignition system operating uses about 10-15 amps. Turn on the headlights and it will draw an additional 10 amps. With everything on, it can draw from 30-100 amps. Do you mean "what is the alternator rated at?" Your Jeep alternator would normally be rated at about 100 amps output.
Depends on the amp draw of the items. Most home plugs in the U.S. can safely carry 15 amps.
Presuming single phase and 1500 rpm, normal running current would be nearly 10 amps, but it varies considerably. A really cheap one running at 3000/3600 rpm could exceed 12 amps. At 240 volts, all should be run off a 15 amp line.
For a single-phase 3 HP motor at 208 volts the amperage is 18.7 amps. For a three-phase 3 HP motor at 208 volts the amperage is 10.5 amps. This figure is arrived at by taking the full load amps at 230 volts and adding 10%. As the voltage goes down, the amperage goes up. For 200 volt motors, 15% is added to the FLA of a 230 volt motor.
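The voltage adjustment described above can be sketched like this; the 230 V full-load-amps starting values are the table figures implied by that answer (17 A single-phase, quoted here as an assumption):

```python
# Sketch: adjust a motor's 230 V full-load amps (FLA) for lower voltages,
# per the rule above: add 10% at 208 V, 15% at 200 V.
ADJUSTMENT = {230: 1.00, 208: 1.10, 200: 1.15}

def fla_at_voltage(fla_230, volts):
    """Scale the 230 V FLA for a lower supply voltage."""
    return fla_230 * ADJUSTMENT[volts]

# Assumed 230 V table value for a single-phase 3 HP motor: 17 A.
print(f"Single-phase 3 HP at 208 V: {fla_at_voltage(17.0, 208):.1f} A")
```

17 A plus 10% gives the 18.7 A figure quoted for 208 volts.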
15 amps at 80% = 12 amps continuous. Watts = Amps x Volts.
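The 80% continuous-load rule and the power formula above amount to two one-liners; a minimal sketch (the 120 V figure in the usage line is an assumed example voltage):

```python
# Sketch: continuous loads are limited to 80% of the breaker rating,
# and power follows Watts = Amps x Volts.
def continuous_amps(breaker_amps, derate=0.80):
    """Maximum continuous current allowed on a breaker of the given rating."""
    return breaker_amps * derate

def watts(amps, volts):
    """Power in watts from current and voltage."""
    return amps * volts

print(continuous_amps(15))    # 80% of a 15 A breaker: 12 A continuous
print(watts(12, 120))         # power at 12 A on an assumed 120 V circuit
```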