Assuming a resistive load, the continuous current flowing would be 600/220 ≈ 2.73 amps. The resistance of the load is 220/2.73 ≈ 81 ohms.
If you have a 200 ampere-hour battery that only supplies 24 volts, you can't run your 600 watt device that is designed to run at 220 volts.
For the sake of argument, say your load is an incandescent light bulb designed to work at 24 volts. If you attached the battery, it would try to draw 600/24 = 25 amps, and the resistance of the load would be about 1 ohm.
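The arithmetic in the two cases above (current from power and voltage, then resistance from voltage and current) can be sketched in a few lines of Python; the helper name is just for illustration:

```python
# Resistive-load arithmetic: I = P / V, then R = V / I (equivalently V**2 / P).
def load_current_and_resistance(power_w, volts):
    """Return (current in amps, resistance in ohms) for a purely resistive load."""
    amps = power_w / volts   # I = P / V
    ohms = volts / amps      # R = V / I
    return amps, ohms

# 600 W device designed for 220 V: roughly 2.73 A through roughly 81 ohms.
i_220, r_220 = load_current_and_resistance(600, 220)

# Hypothetical 600 W load built for 24 V instead: 25 A through about 1 ohm.
i_24, r_24 = load_current_and_resistance(600, 24)
```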
You need to match the voltage source to the load requirements.
CAVEAT - This example assumes that if a 24 volt battery was used that the 600 watt device was made to work for 24 volts. It is not the same load that would be for a 600 watt device at 220 volts. The problem is that the hypothetical question asked does not match reality.
Watts are the product of amps times volts. The amperage in a circuit is governed by the resistance of the load: a battery just supplies the potential as voltage, and the load determines how much current is drawn out of the battery. Batteries are rated in amp-hours, which indicates how long a battery can maintain a specific amperage.
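A short sketch of how the amp-hour idea plays out: the load sets the current, and the capacity divided by that current gives a rough (idealized) run time. The 12 V, 60 Ah, and 120 W figures below are made-up example values:

```python
# Made-up example values for illustration.
battery_volts = 12.0
battery_capacity_ah = 60.0   # amp-hour rating
load_watts = 120.0

load_amps = load_watts / battery_volts           # I = P / V -> 10 A
approx_hours = battery_capacity_ah / load_amps   # 60 Ah / 10 A -> 6 hours (idealized)
```

Real batteries deliver less than this at high discharge rates, so treat the result as an upper bound.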
Only when the load is purely resistive.
You need to know the amperage to determine how many volts you get out of 20 watts.
Electric power = Volts x Amps, so 7 volts at 1 amp will produce 7 watts, 7 volts at 5 amps will produce 35 watts, 7 volts at 15 amps will produce 105 watts, and so on. Technically, there is not enough information (just volts) to answer your question, but if you know the amps, you can figure the answer yourself.
To determine the number of 12-volt batteries needed to supply 1000 watts, first divide the power (1000 watts) by the voltage (12 volts) to find the current required, about 83 amps. The number of batteries then depends on how much continuous current each battery can safely deliver in parallel; the amp-hour rating tells you how long the bank can sustain that current, not how many batteries you need.
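The battery-count arithmetic above can be sketched as follows; the 20 A per-battery continuous-current rating is a made-up example value:

```python
import math

power_w = 1000.0
system_volts = 12.0
per_battery_max_amps = 20.0   # assumed continuous rating per battery (example value)

required_amps = power_w / system_volts   # about 83.3 A total
# Round up: you can't parallel a fraction of a battery.
batteries_in_parallel = math.ceil(required_amps / per_battery_max_amps)
```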
An ampere-hour rating is a relative indication of how long a battery can supply a specific current. It is not possible to determine the run time when you only give watts: watts are volts times amps, and you did not supply the volts.
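Once the voltage is known, the missing step is straightforward. A sketch with assumed example numbers (100 Ah battery, 12 V, 240 W load):

```python
# Assumed example values; the original question did not supply the volts.
capacity_ah = 100.0
volts = 12.0
watts = 240.0

amps = watts / volts         # 20 A drawn from the battery
hours = capacity_ah / amps   # 100 Ah / 20 A -> 5 hours (idealized)
```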
Depends on the voltage. Volts x Amps = Watts
Watts = Amps * Volts
Watts = 20 amps * 100 volts
Watts = 2000
2,000 watts, or 2 kW
If the wattage of a load is known, then the current can be calculated. Watts equals amps times volts, so you would use the formula Amps = Watts/Volts.
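The rearranged formula above as a tiny helper, with a made-up example (a 1200 W heater on a 120 V circuit):

```python
# Amps = Watts / Volts, as stated above.
def amps_from_watts(watts, volts):
    return watts / volts

# Example: a 1200 W heater on a 120 V circuit draws 10 A.
print(amps_from_watts(1200, 120))
```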
1000 milliamperes = 1 amp. Assuming a resistive load, amps = watts / volts = 0.125 amps, or 125 milliamperes.
Volts cause current to flow through the load. The current is measured in amps, and the volts multiplied by the amps gives the power in watts.