For a resistive load like an incandescent light bulb, Watts = Volts x Amps. In your home the voltage is nominally fixed, though it may fluctuate somewhat due to a number of factors. The math is easy for a 60 W bulb at 120 volts, which yields a current of 1/2 (0.5) amps. The current for a 75 W bulb at 120 V would be 0.625 A; you would need a 150 VAC source for that 75 W bulb to draw only 0.5 A.
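If it helps, here is the same arithmetic as a minimal Python sketch, using only the bulb figures from above and nothing beyond I = P / V:

```python
# Current drawn by a resistive load: I = P / V
print(60 / 120)   # 0.5 A   (60 W bulb on 120 V)
print(75 / 120)   # 0.625 A (75 W bulb on 120 V)
print(75 / 150)   # 0.5 A   (75 W bulb on a 150 VAC source)
```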
Different light bulbs (or lamps) are designed to operate at different voltages. You need to read the spec of the lamp in question, because you can't accurately calculate the current drawn without first knowing the operating voltage.
45V
Power equals voltage times current in amps, so you also need to know the current to do the math.
Find out your supply voltage, and divide 65 by it: I(amps) = P(watts)/V(volts) = 65/V
Yes. The current rating listed is the _maximum_ current that the power supply can provide without a drop in voltage.
The bulb is marked with the power (watts) and the voltage. Divide the watts by the volts and you have the amps.
One megawatt = 1,000,000 watts. Watts = volts x amps (voltage x current). Hence, if you know the voltage, then amps = 1,000,000 watts / volts.
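As a sketch, assuming a simple resistive load and a few illustrative voltages (the 13,800 V figure is just an example distribution voltage, not from the question):

```python
MEGAWATT = 1_000_000  # watts

# Amps = watts / volts, at a few example voltages
for volts in (120, 240, 13_800):
    print(f"1 MW at {volts} V -> {MEGAWATT / volts:,.1f} A")
```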
Ohm's law: voltage = current times resistance. The voltage of a circuit with a resistance of 250 ohms and a current of 0.95 amps is therefore 0.95 x 250 = 237.5 volts.
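The same check in Python, using the figures above:

```python
# Ohm's law: V = I * R
current = 0.95      # amps
resistance = 250    # ohms
print(current * resistance)  # 237.5 volts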
Look on the light bulb for the voltage and the power in watts, then divide the watts by the voltage and that gives the amps. Some CFL bulbs also state the current as well as the voltage and power; that is because they can have a poor power factor, in which case the actual current is higher than watts divided by volts.
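To make the power-factor point concrete, here is a small sketch; the 13 W CFL and the 0.6 power factor are purely illustrative values, not figures from any particular bulb:

```python
def line_current(watts: float, volts: float, power_factor: float = 1.0) -> float:
    """Actual AC line current: I = P / (V * PF)."""
    return watts / (volts * power_factor)

print(line_current(13, 120))       # ~0.108 A if the power factor were 1
print(line_current(13, 120, 0.6))  # ~0.181 A with a poor power factor of 0.6
```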
16 volts
Since power = current x voltage, you would divide the power (watts) by the voltage. The answer would be 1/10 amp, or 0.10 amps.
Current is the other factor: power (wattage) is the product of current (amps) and voltage (volts).
Ohm's law: volts = amps times ohms. In the case of a 4-ohm resistor with 1.5 amps of current, the voltage is 6 volts.
It depends on what the voltage is. A watt is a unit of power described as "1 ampere of current pushed by 1 volt of electromotive force", therefore 1 W = 1 A x 1 V. 1 kW is a "kilowatt", or 1000 watts. Using the above formula and a little algebra, you can find the current required to deliver any amount of power depending on the applied voltage: Current = Power / Voltage, symbolically expressed as I = P / V. Examples:
* 9.5 kW x (1000 W/kW) / 120 V = 79.2 amps
* 9.5 kW x (1000 W/kW) / 240 V = 39.6 amps
* 9.5 kW x (1000 W/kW) / 480 V = 19.8 amps
(Notice how raising the voltage reduces the required current?)
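The three examples above, reproduced as a short Python loop:

```python
def amps_for_kw(kilowatts: float, volts: float) -> float:
    # I = P / V, with kilowatts converted to watts
    return kilowatts * 1000 / volts

for v in (120, 240, 480):
    print(f"9.5 kW at {v} V -> {amps_for_kw(9.5, v):.1f} A")
# 120 V -> 79.2 A, 240 V -> 39.6 A, 480 V -> 19.8 A
```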
Voltage is measured in volts; current is measured in amps.
Current (amps) = Power (watts) / Voltage (volts). Therefore a 4500 W heating element on a 240 V supply will draw 4500 W / 240 V = 18.75 A.
You would also need to know the current in amps. The formula you need is this: P = I x V, i.e. power (in watts) = current (in amps) x voltage (in volts).
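Since P = I x V can be rearranged for any one of the three quantities, here is a minimal sketch covering all three forms (the 4500 W / 240 V check reuses the heating-element figures above):

```python
def power(i: float, v: float) -> float:   # watts from amps and volts
    return i * v

def current(p: float, v: float) -> float: # amps from watts and volts
    return p / v

def voltage(p: float, i: float) -> float: # volts from watts and amps
    return p / i

print(current(4500, 240))  # 18.75 A, the heating-element example above
```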