I assume this is for an AC system. 110-120 V RMS (typical wall-socket voltage) corresponds to a sine-wave peak of about 155-170 V measured from zero. Similarly for the current you gave - I assume it is RMS?
In an AC system, 120 V RMS x 0.5 A RMS = 60 watts.
If the values you have are measured peak to zero volts, then Wattage = 1/2 x Voltage x Current.
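To make the RMS-versus-peak arithmetic concrete, here is a minimal Python sketch (Python and the function names are my own illustration; the 120 V / 0.5 A figures come from the answer above):

```python
import math

def power_from_rms(v_rms, i_rms):
    """Average power of a resistive AC load from RMS values: P = Vrms * Irms."""
    return v_rms * i_rms

def power_from_peak(v_peak, i_peak):
    """Average power from zero-to-peak values: P = 1/2 * Vp * Ip."""
    return 0.5 * v_peak * i_peak

v_rms, i_rms = 120.0, 0.5
v_peak = v_rms * math.sqrt(2)   # about 170 V peak for a 120 V RMS sine wave
i_peak = i_rms * math.sqrt(2)

print(power_from_rms(v_rms, i_rms))    # 60.0 W
print(power_from_peak(v_peak, i_peak)) # also ~60 W: the two forms agree
```

Both formulas give the same average power; the 1/2 factor exactly cancels the two factors of sqrt(2) between peak and RMS.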
100 W. V = IR, as I seem to remember from school days...
120 ohms of resistance, using Ohm's law:
*I = V/R
*Current (I) is equal to Voltage (V) divided by Resistance (R).
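As a quick sanity check of I = V/R, a throwaway Python sketch (the language and function name are my own; 120 V across 120 ohms is the case above):

```python
def current(voltage, resistance):
    """Ohm's law: I = V / R."""
    return voltage / resistance

# A 120-ohm load across 120 V draws exactly 1 A:
print(current(120, 120))  # 1.0
```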
If it is incandescent, the current will be 1 amp. I = W/E (current equals watts divided by volts).
The watts equal volts times amps, so 0.8 x 120 is 96 watts.
I assume you know this version of the formula for power.
Power = Potential difference x Current
So,
Power = 120 x 0.5 = 60 Watts
The answer is volts times amps: 120 x 0.5 = 60 watts. You may need a calculator (except for those who have skills in mental arithmetic).
If it is a bulb designed to work at 120 V, the current is 150 divided by 120, and the answer is in amps: 1.25 A.
for a 12 v bulb, 12 volts x 0.5 amps = 6 Watts
for a 240 v bulb, 240 volts x 0.5 amps = 120 Watts.
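The bulb cases above all come from the same P = V x I product; a small Python sketch of that arithmetic (the language and names are my own illustration):

```python
def wattage(volts, amps):
    """Power of a resistive load: P = V * I."""
    return volts * amps

# The same 0.5 A through bulbs rated for different supply voltages:
for v in (12, 120, 240):
    print(f"{v} V bulb at 0.5 A -> {wattage(v, 0.5)} W")
```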
The formula you are looking for is W = I x E. W = Amps x Volts.
240 ohms
Power = Potential Difference (Voltage) x Current So in this case, Power = 6 x 0.5 = 3 watts
The power used, assuming Unity Power Factor (resistive load), is the product of resistance and the square of the current -- or 1210 Watts.
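The I-squared-R form used in that answer can be sketched in Python; the 11 A and 10 ohm inputs are illustrative values of my own (the original question's numbers are not shown) chosen to reproduce the 1210 W figure:

```python
def power_i2r(current_a, resistance_ohm):
    """Resistive (unity-power-factor) load: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

print(power_i2r(11, 10))  # 1210
```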
12 amperes.
The formula you are looking for is W = A x V. Watts = Amps x Volts.
A standard 3 kW immersion heater will require a fuse rating of 13 A. This is because it draws a current of about 12.5 A at 240 V (3000 W / 240 V).
It all depends what voltage is applied across it. Assuming it's designed for operation at 120 VAC, it draws 33-1/3 mA when it's turned on: (120 V x 0.0333 A) ≈ 4 W.
Power is VI so 360 watts.
Power = Voltage * Current, so 120 V x 9.5 A = 1140 watts. What the hell VCR draws 1140 watts, though? Even early Sony U-Matic VCRs, with mechanical everything, drew only about 100 watts.
The amperage output on an adapter is the rating applied by the design manufacturer. Connecting a load that draws more than the design limit of the adapter will damage the adapter. As long as your connected load stays under the adapter's rating, there is no problem.
A UPS has batteries as its main component. Batteries store charge (charge = current x time), hence the amp-hour (Ah) rating. For example, if the rating is 6 Ah and the connected load draws 1 A, the battery lasts for 6 hours.
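The amp-hour arithmetic is just capacity divided by load current, so a 6 Ah battery feeding a 1 A load ideally lasts 6 hours. A minimal Python sketch (my own illustration; real runtime is shorter due to discharge losses and inverter efficiency):

```python
def runtime_hours(capacity_ah, load_a):
    """Ideal battery runtime: t = capacity (Ah) / load current (A)."""
    return capacity_ah / load_a

print(runtime_hours(6, 1))  # 6.0
```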
To find the resistance necessary, one would need to know how much current the bulb draws. Knowing that current, one would subtract the 14 volts from 120 volts and divide the difference by the bulb's current to find the resistance needed. Then, to get the required wattage rating, one would multiply the current drawn by the voltage drop. Another important detail is that the power dissipated by the resistor will be much greater than the power consumed by the bulb itself. Finally, if the bulb burns out, the full 120 V will appear across its contacts. I would not recommend using this method to drop the voltage for the bulb.
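The resistor calculation described above can be sketched in Python; note the 0.25 A bulb current is a hypothetical value of my own, since the question does not supply one:

```python
def dropping_resistor(supply_v, bulb_v, bulb_current_a):
    """Series dropping resistor for a low-voltage bulb on a higher supply.

    Returns (resistance in ohms, resistor power in watts), following the
    steps in the answer: the resistor absorbs the excess voltage.
    """
    v_drop = supply_v - bulb_v           # voltage the resistor must drop
    resistance = v_drop / bulb_current_a # Ohm's law: R = V / I
    power = v_drop * bulb_current_a      # heat dissipated in the resistor
    return resistance, power

r, p = dropping_resistor(120, 14, 0.25)  # hypothetical 0.25 A bulb
print(r, "ohms,", p, "W")  # 424.0 ohms, 26.5 W
```

At 0.25 A the bulb itself consumes only 14 x 0.25 = 3.5 W, while the resistor burns 26.5 W, which illustrates why the answer advises against this method.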
The voltage across a DC device that draws 2A and consumes 12Wh/h is 12/2 or six volts.
1.7 amps
Current equals power divided by voltage, so with 110 V across the load, a 900 W system draws about 8.18 A (if the voltage potential is double, 220 V, the current is half, 4.09 A).
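The halving relationship is easy to verify with a throwaway Python sketch (the language and function name are my own; the 900 W / 110 V / 220 V figures are from the answer above):

```python
def current_draw(power_w, volts):
    """Current from power and voltage: I = P / V."""
    return power_w / volts

print(round(current_draw(900, 110), 2))  # 8.18
print(round(current_draw(900, 220), 2))  # 4.09 -- double the volts, half the amps
```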
Use a generator with a high enough rating to power the house, of course. Trying to power a house that draws 60A of current with a 10A generator is just never going to work.
Power = Voltage x Current (P = V x I)
Power (in watts) = 110 V x 8.70 A = 957 W (approx. 1 kW)
- Neeraj Sharma