In a single-phase alternating-current circuit, the current (I), measured in amperes (A), equals the true power (P), measured in watts (W), divided by the system voltage (V), measured in volts (V).
I = P/V (power equation)... eq 1
1 hp = 745.69 W... eq 2
From your question the parameters given are:
V = 120 VAC
Horsepower = 1.5hp
Substituting in eq 2
1.5 hp * 745.69 = 1118.54 W
Substituting in eq 1
I = 1118.54 / 120
I = 9.32 A.
NB: among other factors that affect the actual current drawn by any electrical apparatus, all motors draw a start-up (inrush) current that is several times higher than the running current calculated above.
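The two-step calculation above (hp to watts, then watts to amps) can be sketched in a few lines of Python; the 745.69 W/hp conversion and the 1.5 hp / 120 V figures are taken directly from the answer:

```python
# Convert motor horsepower to watts, then to amps (single-phase AC).
HP_TO_WATTS = 745.69  # 1 hp in watts (eq 2)

def full_load_amps(horsepower, volts):
    """Current drawn at full load, ignoring efficiency, power factor,
    and start-up inrush."""
    watts = horsepower * HP_TO_WATTS   # eq 2: hp -> W
    return watts / volts               # eq 1: I = P / V

print(round(full_load_amps(1.5, 120), 2))  # → 9.32
```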
Hope this helps!
As the horsepower rating of a machine describes its output power, it will first be necessary to determine its input power. To do this, you must determine the motor's efficiency before applying the method described above.
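That efficiency correction can be sketched as follows (a sketch only; the 85% efficiency figure here is an assumed example, not a value from the answers above):

```python
HP_TO_WATTS = 745.69  # 1 hp in watts

def input_amps(horsepower_out, volts, efficiency):
    """Amps drawn, given the motor's OUTPUT rating in hp and its
    efficiency as a fraction (0-1). Input power exceeds output power."""
    output_watts = horsepower_out * HP_TO_WATTS
    input_watts = output_watts / efficiency  # losses mean more power in
    return input_watts / volts

# Assumed example: a 1.5 hp motor that is 85% efficient, on 120 V
print(round(input_amps(1.5, 120, 0.85), 2))  # → 10.97
```

Note that this gives a higher current than the 9.32 A calculated from output power alone, which is the point of the correction.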
Watts = Volts times Amps. Therefore, if the voltage was 220 volts, the motor would draw 500 amps. If the voltage was 4,000 volts, the motor would draw 27.5 amps. The voltages for large powerful motors tend to be relatively high, for example in the 380 Volts to 11,500 Volts range.
Depends on the voltage. If you are running off 120 VAC, a 0.5 horsepower motor would draw 3.1 A.
The electrical code states that a 30 HP induction motor at 460 volts three-phase will draw 40 amps. Calculated directly, I = 33.34 A if efficiency = 95% and power factor = 85%.
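For three-phase motors, the usual formula divides input power by √3 × line voltage × power factor. The sketch below applies that formula with the efficiency and power-factor figures quoted above; it comes out near 35 A rather than the 33.34 A quoted, since sources differ on the hp-to-watt constant and on where the efficiency term enters:

```python
import math

HP_TO_WATTS = 746  # common rounded value for 1 hp

def three_phase_amps(hp, volts, efficiency, power_factor):
    """Line current for a three-phase motor: input watts divided by
    sqrt(3) x line volts x power factor."""
    input_watts = hp * HP_TO_WATTS / efficiency
    return input_watts / (math.sqrt(3) * volts * power_factor)

print(round(three_phase_amps(30, 460, 0.95, 0.85), 1))  # → 34.8
```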
T430.247 of the NEC shows that a 1 hp motor operating at full load on 115 V will draw 16 amps, called the Full Load Current (FLC). Conductors supplying this motor are required to be rated at 125% of FLC, which is 20 amps. Motor circuits are complicated things and do not follow the rules of other circuits: this motor, while drawing a maximum of 16 amps at full load and supplied with #12 AWG copper conductors, can be protected by a breaker of 40 amps.
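The 125% conductor-sizing rule in that answer is a one-line calculation (a sketch of the arithmetic only; consult the NEC tables for the actual FLC values):

```python
def min_conductor_ampacity(full_load_current):
    """NEC requires motor branch-circuit conductors rated at 125% of
    the table Full Load Current (FLC)."""
    return full_load_current * 1.25

# 1 hp motor at 115 V: table FLC = 16 A, per the answer above
print(min_conductor_ampacity(16))  # → 20.0
```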
To answer this question the voltage of the motor must be stated.
Depends on how big the motor is. A stronger motor will draw more amps than a weaker or less efficient motor. For example, a wiper motor draws far less than a starter motor.
It depends on the size and type of the motor being started.
It depends on the voltage; I think at 110 V it's about 4 amps per hp.
The code book states that the motor will draw 1.8 amps; another answer puts it at about 1 amp.
Read the nameplate on the motor.
You asked the wrong question. You need to know how many amps the motor uses. Then you can multiply amps times volts and get watts. Then you can multiply watts by hours and get watt hours. (For house electricity you pay for kilowatt hours.) A kilowatt is 1,000 watts.
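That chain of multiplications (amps × volts = watts, watts × hours = watt-hours, ÷ 1000 = kWh) can be sketched as follows; the 5 A and 8-hour figures are made-up example inputs:

```python
def kilowatt_hours(volts, amps, hours):
    """Energy consumed, the quantity a utility actually bills for."""
    watts = volts * amps        # power drawn
    watt_hours = watts * hours  # energy over the run time
    return watt_hours / 1000    # kilowatt-hours

# Hypothetical example: a motor drawing 5 A at 120 V for 8 hours
print(kilowatt_hours(120, 5, 8))  # → 4.8
```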
Then you are trying to get more HP out of the motor than it can supply. Back off on the load that the motor is driving, or put a bigger motor onto the load.