A 100 watt 220 volt light bulb (or anything consuming 100 watts on 220 volts) draws 100/220, or 0.45 amps. It will also have about 220²/100, or 484 ohms resistance. A 60 watt 220 volt light bulb (or anything consuming 60 watts on 220 volts) draws 60/220, or 0.27 amps. It will also have about 220²/60, or 807 ohms resistance.
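The arithmetic above (I = P/V and R = V²/P for a resistive load) can be sketched in a few lines of Python; the function name is just for illustration:

```python
def bulb_current_and_resistance(watts, volts):
    """Return (amps, ohms) for a resistive load: I = P / V and R = V**2 / P."""
    amps = watts / volts
    ohms = volts ** 2 / watts
    return amps, ohms

# 100 W at 220 V: about 0.45 A and 484 ohms
amps_100, ohms_100 = bulb_current_and_resistance(100, 220)
# 60 W at 220 V: about 0.27 A and about 807 ohms
amps_60, ohms_60 = bulb_current_and_resistance(60, 220)
```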
12 volts is enough for a 12-volt 100-watt light bulb. It would not be enough for a 120-volt or 240-volt bulb.
A 120 volt table lamp with a 75 watt bulb will pull 0.625 amps. With a 100 watt bulb it will pull 0.833 amps. And with a modern fluorescent 13 watt bulb it will pull 0.108 amps.
Because it consumes more power (more watts), so it gives off more light and heat.
16 AWG is plenty large enough for a 50 or 100 watt lamp.
800 watt-hours = 0.8 kWh, which over here costs roughly 10 to 20 eurocents.
Divide the power rating (in watts) by the voltage (in volts). So if you use a 100 watt light bulb in a typical 110 volt lamp then it will draw 100/110 = 0.91 amps of current. Or plug a 1500 watt electric heater into a 110 volt wall socket and it will draw 1500/110 = 13.6 amps of current.
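The same division shows why the heater example matters in practice; a short sketch (the 15 A breaker limit is a typical household figure, assumed here, not from the answer):

```python
def draws_amps(watts, volts):
    """Current drawn by a load: I = P / V."""
    return watts / volts

# A 1500 W heater on a 110 V circuit draws about 13.6 A --
# close to the limit of a typical 15 A household breaker.
heater_amps = draws_amps(1500, 110)
```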
No.
'Lighting', or the amount of light, is not measured in watts. A watt is a unit of power, the rate at which electrical energy is used, not a measure of light output. Light output is measured in lumens, the total "amount" of visible light emitted by a source; a typical 100 watt incandescent lamp initially produces about 1690 lumens. A 15 foot by 10 foot room is not very big, and one or two 100 watt incandescent lamps in a ceiling fixture on a 120 volt service can light it adequately. I mention the voltage because in parts of the world with 220 volt mains the current changes, but not the power: a 100 watt lamp designed for 220 volts draws half the current of a 100 watt 110 volt lamp, yet consumes the same power and produces about the same amount of light.
About 100 joules per second.
17
In a floor lamp or table lamp a #16 wire is what you need.