In a single-phase alternating current circuit, the current (I), measured in amperes (A), is equal to the true power (P), measured in watts (W), divided by the system voltage (V), measured in volts (V).
I = P/V (from the power equation P = V × I)... eq 1
1 hp = 745.69 W... eq 2
From your question, the parameters given are:
V = 120 VAC
Horsepower = 1.5 hp
Substituting in eq 2:
1.5 hp × 745.69 = 1118.53 W
Substituting in eq 1:
I = 1118.53 / 120
I = 9.32 A
NB: Among other factors that affect the actual current drawn by any electrical apparatus, all motors draw a start-up (inrush) current that is several times higher than the running value calculated above.
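A minimal Python sketch of the calculation above, using the same assumed values (1.5 hp, 120 VAC); it ignores efficiency, power factor, and start-up current, so treat the result as a lower bound:

    WATTS_PER_HP = 745.69            # eq 2: 1 hp = 745.69 W

    def full_load_amps(hp, volts):
        # eq 1: I = P / V for a single-phase AC circuit
        watts = hp * WATTS_PER_HP
        return watts / volts

    print(round(full_load_amps(1.5, 120), 2))   # 9.32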
Hope this helps!
trinilink
As the horsepower rating of a machine describes its output power, it will first be necessary to determine its input power. To do this, you must determine its efficiency before embarking on the method described above.
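As a rough sketch of that adjustment (the 85% efficiency here is an assumed figure, not something given in the question; read the real value from the nameplate):

    hp_output = 1.5
    efficiency = 0.85                      # assumed; use the nameplate value
    watts_out = hp_output * 745.69         # output (shaft) power
    watts_in = watts_out / efficiency      # input power the supply must deliver
    print(round(watts_in / 120, 2))        # ~10.97 A at 120 VAC, up from 9.32 A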
Watts = Volts times Amps. Therefore, if the voltage was 220 volts, the motor would draw 500 amps. If the voltage was 4,000 volts, the motor would draw 27.5 amps. The voltages for large powerful motors tend to be relatively high, for example in the 380 Volts to 11,500 Volts range.
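Those two figures imply a motor of about 110 kW (220 V × 500 A = 110,000 W); a quick check of the arithmetic:

    power_watts = 220 * 500        # implied motor power, 110,000 W
    print(power_watts / 220)       # 500.0 A
    print(power_watts / 4000)      # 27.5 A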
Depends on the voltage. If you are running off 120 VAC, a 0.5 horsepower motor would draw 3.1 A.
The electrical code states that a 30 HP induction motor at 460 volts, three-phase, will draw 40 amps. Calculated with an assumed efficiency of 95% and power factor of 0.85, the actual running current works out to roughly 33-35 amps.
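The formula behind that calculated figure is I = P / (1.732 × V × PF × efficiency); a sketch with the assumptions stated above (1 hp taken as 746 W):

    import math

    hp, volts, eff, pf = 30, 460, 0.95, 0.85
    amps = hp * 746 / (math.sqrt(3) * volts * pf * eff)
    print(round(amps, 1))          # ~34.8 A running; the 40 A table value is used for circuit sizing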
Table 430.248 of the NEC shows that a 1 hp motor operating at full load on 115 V will draw 16 amps, called the Full-Load Current (FLC). Conductors supplying this motor are required to be rated at 125% of FLC, which is 20 amps. Motor circuits are complicated things and do not follow the rules of other circuits: this motor, while drawing a maximum of 16 amps at full load and supplied with #12 AWG copper conductors, can be protected by a breaker of 40 amps.
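A sketch of those sizing steps, taking the 16 A table value and the 125% conductor rule from the answer above (the 250% inverse-time breaker multiplier is my assumption; check the applicable code article for your installation):

    flc = 16                        # table full-load current, 1 hp at 115 V
    conductor_amps = flc * 1.25     # conductors sized at 125% of FLC -> 20 A
    breaker_amps = flc * 2.5        # assumed 250% inverse-time breaker allowance -> 40 A
    print(conductor_amps, breaker_amps)   # 20.0 40.0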
To answer this question the voltage of the motor must be stated.
Depends on how big the motor is. A stronger motor will draw more amps than a weaker or less efficient motor. For example, a wiper motor draws far less than a starter motor.
It depends on the voltage. I think at 110 V it's 4 amps per hp.
To calculate the amps drawn by an 18 kW motor, you can use the formula: Amps = Power (watts) / Voltage (volts). Assuming a standard voltage of 120 V, the motor would draw approximately 150 amps. Note that the actual amps will depend on the specific voltage of the motor.
For a 1 hp 3-phase motor, the current draw will depend on the supply voltage. Typically, at 230 V, a 1 hp 3-phase motor will draw around 3.6 amps. However, this value may vary based on the motor's efficiency and power factor.
Read the nameplate on the motor.
A typical starter motor draws around 50 to 150 amps while cranking an engine. If the current draw is significantly higher or lower, it may indicate a problem with the starter motor or the electrical system.
Assuming a power factor of 1 and ignoring losses, a 10 hp (7,460 W) motor operating at 600 volts in a three-phase system would draw approximately 7.2 amps: 7,460 / (1.732 × 600).
Then you are trying to get more HP out of the motor than it can supply. Back off on the load that the motor is driving, or put a bigger motor on the load.