Wattage = voltage × amperage.
Amperage, also called current, is the rate at which electric charge flows through an appliance at any given moment. This measurement is expressed in units called amperes.
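As a quick illustrative sketch (the function and variable names are my own, just for illustration), the power formula comes down to one multiplication:

```python
def wattage(voltage_volts: float, amperage_amps: float) -> float:
    """Power (watts) is the product of voltage (volts) and current (amps)."""
    return voltage_volts * amperage_amps

# Example: a 120 V appliance drawing 0.5 A consumes 60 W.
print(wattage(120, 0.5))  # 60.0
```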
If it is rated at 1000 watts, then it produces 1000 watts. A watt is 1 joule per second.
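Since a watt is one joule per second, total energy is just power multiplied by time. A minimal sketch of that conversion (names assumed for illustration):

```python
def energy_joules(power_watts: float, seconds: float) -> float:
    """Energy (joules) = power (watts) x time (seconds), since 1 W = 1 J/s."""
    return power_watts * seconds

# A 1000 W device running for one hour uses 3,600,000 J (i.e. 1 kWh).
print(energy_joules(1000, 3600))  # 3600000.0
```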
If your light bulb's voltage rating is under 300 volts, then yes, it can use 300-volt wire. The voltage rating of the wire is the maximum voltage the wire can safely carry. The three common insulation groups are 300 volts, 600 volts, and 1000 volts.
If a 60 watt and a 100 watt bulb are wired in series, the 60 watt bulb will have the larger voltage drop. In parallel, they see the same voltage. Using a supply of 120 volts, the 60 watt lamp would have 75 volts across it and the 100 watt lamp would have 45 volts across it in a series circuit, bringing the total to 120 volts.
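Here is a sketch of that series calculation, assuming each filament's resistance stays at its rated value (in reality filament resistance changes with temperature, so treat this as an idealization):

```python
def rated_resistance(rated_watts: float, rated_volts: float = 120.0) -> float:
    """R = V^2 / P at the bulb's rated operating point."""
    return rated_volts**2 / rated_watts

r60 = rated_resistance(60)    # 240 ohms
r100 = rated_resistance(100)  # 144 ohms

supply = 120.0
total = r60 + r100
# Series voltage divider: each bulb's drop is proportional to its resistance.
v60 = supply * r60 / total    # voltage across the 60 W bulb
v100 = supply * r100 / total  # voltage across the 100 W bulb
print(v60, v100)  # 75.0 45.0
```

The lower-wattage bulb has the higher resistance, which is why it takes the larger share of the supply voltage in series.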
As 1 watt equals 1000 milliwatts, 25 milliwatts = 25/1000 = 0.025 watts.
The amps drawn by a 65 watt light bulb on a 120 volt supply should be 65/120, or about 0.5417 amperes. This fraction of an ampere may be restated as about 541.67 milliamps.
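The same rearrangement of the power formula (I = P / V) in a quick sketch, which also covers the 1000 watt case mentioned further down:

```python
def amps_drawn(power_watts: float, supply_volts: float) -> float:
    """Current (amps) = power (watts) / voltage (volts)."""
    return power_watts / supply_volts

print(amps_drawn(65, 120))    # ~0.5417 A for the 65 W bulb
print(amps_drawn(1000, 120))  # ~8.33 A for a 1000 W load
```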
You need at least 5 volts to power it? What's the real question?
Your question answers itself: 1000 watts, when operated on a 480-volt source.
About 7 cents an hour: a 1000 watt light uses 1 kilowatt-hour of energy per hour, so at a rate of roughly 7 cents per kWh, it costs about 7 cents per hour to run.
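A sketch of that cost arithmetic (the 7-cents-per-kWh rate is assumed from the answer above; substitute your local electricity rate):

```python
def hourly_cost_cents(power_watts: float, rate_cents_per_kwh: float) -> float:
    """Hourly cost = (power in kW) x (1 hour) x (rate per kWh)."""
    return (power_watts / 1000.0) * rate_cents_per_kwh

print(hourly_cost_cents(1000, 7.0))  # 7.0 cents per hour
```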
A customer can purchase a 1000 watt grow light from a website called Greners. The cost of a 1000 watt grow light from the Greners website averages around $500.00.
Unfortunately, the question as phrased is meaningless. A watt or kilowatt is a measure of power, the product of voltage and current: one kilovolt at one amp dissipates one kilowatt of power, but the same kilovolt at one tenth of an amp dissipates only 100 watts. Here's the formula: Watts = Volts × Amps.
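Plugging both cases from that answer into the formula as a quick check:

```python
# Watts = Volts * Amps, applied to both cases above.
print(1000 * 1.0)  # 1000.0 W, i.e. one kilowatt at 1 kV and 1 A
print(1000 * 0.1)  # 100.0 W at the same 1 kV but only 0.1 A
```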
If running at 120 volts, that is 8.33 amps (1000 W / 120 V ≈ 8.33 A).
"Watt volts" is not an electrical term. Watts are the product of amps times volts.