The answer to this is a bit more complicated than you might think due to the strong positive temperature coefficient of resistance of the filament material.
When the bulb is operating at its stable temperature (and voltage), this is easy to compute using Ohm's Law and the Power Formula, as follows:
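A minimal sketch of that computation in Python, using an assumed example bulb (24 V, 25 W are illustrative values, not taken from the question):

```python
# Assumed example bulb: 24 V, 25 W (illustrative values only).
V = 24.0  # rated (stable) operating voltage, volts
P = 25.0  # rated power at that voltage, watts

I_hot = P / V        # Power Formula: P = V * I  ->  I = P / V
R_hot = V ** 2 / P   # Ohm's Law: R = V / I, equivalently R = V**2 / P

print(f"operating current: {I_hot:.3f} A, hot resistance: {R_hot:.2f} ohms")
```

The same two lines work for any bulb once you know its rated voltage and wattage.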
At intermediate temperatures the resistance will be somewhere in between these values.
It makes no difference whether the bulb is powered by 24 VDC or 24 VAC, since AC voltage is specified as RMS (root mean square), which is equivalent to DC. It's worth pointing out that, because the resistance of a tungsten lamp varies with voltage, it is considered a 'non-linear' or 'non-ohmic' device: it does not obey Ohm's Law unless the temperature is constant. Every change in voltage changes the filament temperature, and the temperature takes time to settle after the voltage changes, which makes the actual resistance as a function of voltage very difficult to determine.
No, it will burn out.
A 100 watt 220 volt light bulb (or anything consuming 100 watts on 220 volts) draws 100/220, or .45 Amps. It will also have about 220²/100, or 484 ohms resistance. A 60 watt 220 volt light bulb (or anything consuming 60 watts on 220 volts) draws 60/220, or .27 Amps. It will also have about 220²/60, or 807 ohms resistance.
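Those figures can be checked with a couple of lines of Python (the helper name here is hypothetical; it just applies I = P/V and R = V²/P):

```python
def rated_current_and_resistance(watts, volts):
    """Current (I = P/V) and hot resistance (R = V^2/P) at rated voltage."""
    return watts / volts, volts ** 2 / watts

i100, r100 = rated_current_and_resistance(100, 220)  # ~0.45 A, 484 ohms
i60, r60 = rated_current_and_resistance(60, 220)     # ~0.27 A, ~807 ohms
print(i100, r100, i60, r60)
```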
The current flowing through a bulb is equal to the (voltage across the bulb) divided by the (bulb resistance), and can be expressed in Amperes. The rate at which the bulb dissipates energy is equal to (voltage across the bulb) times (current through the bulb), and can be expressed in watts.
V = I × R, so with 120 V across 240 ohms the current I = 1/2 amp. I bet the bulb is rated at 60 W, because watts = current × voltage. (Here V = voltage in volts, I = current in amperes, R = resistance in ohms.) Still, your question isn't easy to answer. A lamp has two 'resistances': a 'cold' resistance and a 'hot' resistance. Before it is energised it is cold, so its resistance is low; when it is energised it becomes very hot, and its resistance increases significantly. So the question is whether your '240 ohms' is the cold resistance or the hot resistance. If it is the cold resistance, then a current of 0.5 A will flow through it for a fraction of a second, then its resistance will increase significantly and the current will fall to a very much smaller value.
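A short sketch of that reasoning, assuming the 240 ohms is the hot resistance of a 120 V bulb (the 20-ohm cold resistance is an assumed illustrative figure, not from the question):

```python
V = 120.0      # supply voltage implied by 0.5 A through 240 ohms
R_hot = 240.0  # hot resistance from the question
R_cold = 20.0  # assumed cold resistance (illustrative only)

I_hot = V / R_hot    # Ohm's Law: steady-state operating current
P = V * I_hot        # Power Formula: implied rated wattage
I_cold = V / R_cold  # momentary inrush current at switch-on
print(I_hot, P, I_cold)
```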
The electrical resistance of a light bulb increases when it is turned on. As a resistor, the tungsten light bulb has a positive temperature coefficient of resistance. This means that the electrical resistance goes up when the filament becomes hot. For example, a 100 watt light bulb operated at 120 volts (it does not matter whether it is AC or DC for this calculation) will have a resistance of 144 ohms when hot and draw 0.833 ampere. When cold, the filament typically has a resistance of only 10 ohms, which increases as the filament heats up.
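The numbers in that answer imply an inrush current of roughly fourteen times the operating current; a quick check in Python:

```python
V = 120.0      # operating voltage, volts
P = 100.0      # rated power, watts
R_cold = 10.0  # typical cold filament resistance, per the answer

R_hot = V ** 2 / P        # 144 ohms when hot
I_hot = P / V             # ~0.833 A operating current
I_inrush = V / R_cold     # 12 A momentary inrush at switch-on
ratio = I_inrush / I_hot  # ~14.4x the steady-state current
print(R_hot, round(I_hot, 3), I_inrush, round(ratio, 1))
```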
Yes. A 60W bulb has a lower resistance than the 40W bulb. The lower resistance lets more current flow at the same voltage, so the filament dissipates more power and glows brighter.
If it is an 18 watt 12 volt bulb, then yes. But an 18 watt 120 volt bulb - then no.
Yes, the bulb will actually last longer.
On this calculation I am assuming that the light bulb is using a 120 volt source. Watts = Amps x Volts, so Amps = Watts/Volts: 40/120 ≈ 0.333 amps. R = Volts/Amps: 120/0.333 ≈ 360 ohms resistance in the 40 watt light bulb (equivalently, R = V²/P = 120²/40 = 360 ohms).
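The same arithmetic without any intermediate rounding (R = V²/P gives the hot resistance directly):

```python
V, P = 120.0, 40.0  # rated voltage and power from the answer
I = P / V           # 0.333... A operating current
R = V / I           # 360 ohms exactly; identical to V**2 / P
print(round(I, 3), R)
```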
A 50 watt bulb designed to run on 12 volts takes 4.17 amps. A 50 watt bulb designed to run on 230 volts takes 0.217 amps.
120 volts.
Mine has a 25 watt 120 volt bulb in it.
No, the highest wattage bulb will have the lowest resistance (for bulbs rated at the same voltage).
12 volts is enough for a 12-volt 100-watt light bulb. It would not be enough for a 120-volt or 240-volt bulb.
The 220 volt bulb, on a 220 volt AC supply.