
The answer to this is a bit more complicated than you might think due to the strong positive temperature coefficient of resistance of the filament material.

When operating at its stable temperature (and voltage) this is easy to compute using Ohm's Law and the Power Formula, as follows:

  • Power Formula, I = P ÷ E and solving I = 4W ÷ 24V = 167mA
  • Ohm's Law, R = E ÷ I and solving R = 24V ÷ 167mA = 144Ω
However, when cold at room temperature, the filament's resistance is much lower: a tungsten filament's cold resistance is typically around a tenth to a fifteenth of its hot value, so a cold resistance of roughly 10Ω is likely for this bulb. This results in a high initial current surge (until the filament heats up) when the voltage is first applied, as follows:
  • Ohm's Law, I = E ÷ R and solving I = 24V ÷ 10Ω = 2.4A
  • Power Formula, P = IE and solving P = 2.4A × 24V = 57.6W
This current surge, and the momentarily excessive power dissipation it causes, is why bulbs so frequently burn out at the moment they are switched on.

At intermediate temperatures the resistance will be somewhere in between these values.
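To make the arithmetic concrete, here is a minimal Python sketch of the calculations above; the 10-ohm cold resistance is the assumed illustrative figure used earlier, not a measured value for any particular bulb:

```python
# Steady-state ("hot") operating point of a 24 V, 4 W incandescent bulb.
RATED_VOLTS = 24.0
RATED_WATTS = 4.0

i_hot = RATED_WATTS / RATED_VOLTS   # I = P / E  -> ~0.167 A (167 mA)
r_hot = RATED_VOLTS / i_hot         # R = E / I  -> 144 ohms

# Assumed cold resistance at room temperature (tungsten is typically
# around a tenth to a fifteenth of its hot resistance when cold).
r_cold = 10.0

i_surge = RATED_VOLTS / r_cold      # inrush current at switch-on -> 2.4 A
p_surge = i_surge * RATED_VOLTS     # momentary dissipation -> ~58 W

print(f"hot:  I = {i_hot * 1000:.0f} mA, R = {r_hot:.0f} ohms")
print(f"cold: R = {r_cold:.0f} ohms, surge I = {i_surge:.1f} A, P = {p_surge:.0f} W")
```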

It makes no difference whether the bulb is powered by 24VDC or 24VAC, because AC voltages are specified as RMS (root mean square) values, which deliver the same power as the equivalent DC voltage. It's worth pointing out that, because the resistance of a tungsten lamp varies with the applied voltage, it is considered a 'non-linear' or 'non-ohmic' device, meaning that it does not obey Ohm's Law except at constant temperature. Every change in voltage changes the filament temperature, and the temperature takes time to settle after the voltage changes, which makes the actual resistance as a function of voltage very difficult to determine.
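To illustrate the RMS point numerically, this Python sketch compares the average power that a 24 V RMS sine wave and a 24 V DC supply deliver into a fixed 144-ohm resistance (treating the resistance as fixed is an idealisation, since a real filament's resistance varies slightly over the AC cycle):

```python
import math

R = 144.0                       # hot resistance from the calculation above
V_RMS = 24.0
V_PEAK = V_RMS * math.sqrt(2)   # ~33.9 V peak for a 24 V RMS sine wave

# Numerically average instantaneous power v(t)^2 / R over one full cycle.
N = 100_000
avg_ac_power = sum(
    (V_PEAK * math.sin(2 * math.pi * k / N)) ** 2 / R for k in range(N)
) / N

dc_power = V_RMS ** 2 / R       # power from a 24 V DC supply
print(f"AC average power: {avg_ac_power:.3f} W, DC power: {dc_power:.3f} W")
```

Both print 4.000 W, which is why the RMS value is the "DC-equivalent" voltage.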


Continue Learning about Engineering

Can you use a 120 volt - 500 watt bulb in a 220 volt lamp?

No, it will burn out. A 120-volt filament on a 220-volt supply would dissipate roughly (220/120)², or about 3.4 times, its rated power.


Two light bulbs, one 100 watt and one 60 watt, both operate in a 220V circuit. Which bulb has the higher resistance, and which bulb carries the greater current?

A 100 watt 220 volt light bulb (or anything consuming 100 watts on 220 volts) draws 100/220, or 0.45 amps, and has about 220²/100, or 484 ohms, of resistance. A 60 watt 220 volt light bulb draws 60/220, or 0.27 amps, and has about 220²/60, or 807 ohms, of resistance. So the 60 watt bulb has the higher resistance, and the 100 watt bulb carries the greater current.
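As a cross-check, a small Python helper (the function name is just for illustration) makes the comparison explicit:

```python
def rated_current_and_resistance(watts: float, volts: float) -> tuple[float, float]:
    """Return (current in amps, hot resistance in ohms) at rated voltage."""
    current = watts / volts          # I = P / E
    resistance = volts ** 2 / watts  # R = E^2 / P
    return current, resistance

for watts in (100, 60):
    i, r = rated_current_and_resistance(watts, 220)
    print(f"{watts} W @ 220 V: I = {i:.2f} A, R = {r:.0f} ohms")
```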


What is the current allowed to flow through a bulb?

The current flowing through a bulb is equal to the (voltage across the bulb) divided by the (bulb resistance), and can be expressed in amperes. The rate at which the bulb dissipates energy is equal to (voltage across the bulb) times (current through the bulb), and can be expressed in watts.


Why does the current in an electric circuit decrease when more bulbs are added?

The current decreases only when the extra bulbs are connected in series; in parallel it increases. If you add one extra bulb in parallel and the voltage remains constant, you double the current drawn from the supply: a 12 volt supply with one 12 watt bulb draws 1 ampere; with two 12 watt bulbs it draws 2 amperes. However, with a 24 volt power source and two 12 volt 12 watt bulbs in series, you still draw only 1 ampere, because the bulbs' resistances add and each bulb drops half the supply voltage (see the sketch below).

NOTE: The wattage and voltage of bulbs may differ even when the sockets are the same. A bulb rated for a lower voltage than the circuit will draw more current; if its rating is much lower, it may blow a fuse or burn out the circuit, and it can also burn out the bulb itself, sometimes in a fraction of a second. Fitting a bulb rated for a higher voltage than the circuit does little damage; you will simply get less light than you hoped for.

Total current (amperes) = combined wattage ÷ voltage. Total wattage = combined current (amperes) × voltage. In simpler words: if you double the bulbs in parallel, twice the current is drawn from the battery.
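A brief Python sketch, using the 12 V / 12 W bulbs from the answer above, contrasts the two cases:

```python
SUPPLY_V = 12.0
BULB_W = 12.0
BULB_R = SUPPLY_V ** 2 / BULB_W   # rated (hot) resistance of one bulb: 12 ohms

# Parallel: each added bulb is another current path, so currents add.
for n in (1, 2):
    print(f"{n} bulb(s) in parallel on 12 V: {n * BULB_W / SUPPLY_V:.0f} A")

# Series on 24 V: resistances add, so the current stays at 1 A.
series_i = 24.0 / (2 * BULB_R)
print(f"2 bulbs in series on 24 V: {series_i:.0f} A")
```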


How much current is needed to light a bulb with 180 ohms of resistance using a 120 volt power source?

V = I × R, so the current I = 120 V ÷ 180 Ω ≈ 0.67 amp. That suggests the bulb is rated at about 80 W, because watts = current × voltage. (V = voltage in volts, I = current in amperes, R = resistance in ohms.) Your question isn't easy to answer exactly, though. A lamp has two 'resistances': a 'cold' resistance and a 'hot' resistance. Before it is energised it is cold, so its resistance is low; when it is energised it becomes very hot, and its resistance increases significantly. So the question is whether your '180 ohms' is the cold resistance or the hot resistance. If it is the cold resistance, then a current of about 0.67 A will flow through it for a fraction of a second; then its resistance will increase significantly, and the current will fall to a very much smaller value.
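In Python, the steady-state case works out as follows (assuming the 180 ohms is the hot resistance):

```python
volts = 120.0
r_hot = 180.0                 # assuming 180 ohms is the hot resistance

current = volts / r_hot       # I = V / R  -> ~0.67 A
power = volts * current       # P = V * I  -> ~80 W

print(f"I = {current:.2f} A, implied rating = {power:.0f} W")
```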

Related Questions

A 60-watt 120 volt lamp requires more current than a 40 watt 120 volt lamp?

Yes. At the same voltage, a 60W bulb actually has a lower resistance than a 40W bulb (R = V²/P), and that lower resistance is what lets it draw more current. The greater current and power make the filament glow brighter.


Can you use a 130 volt 60 watt bulb in a socket that was made for a 110 volt 60 watt bulb?

Yes. Run below its rated voltage, the 130 volt bulb will actually last longer, though it will glow slightly less brightly.




Do a 12 volt 50 watt bulb and a 230 volt 50 watt bulb draw the same current?

No, they do not draw the same current. The current drawn by an electrical device is determined by the power (watts) and voltage (volts) using the formula: current (amps) = power (watts) ÷ voltage (volts). So the 12 volt 50 watt bulb will draw a much higher current than the 230 volt 50 watt bulb.
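A quick Python check shows how large the difference is:

```python
for volts in (12.0, 230.0):
    amps = 50.0 / volts       # I = P / V for a 50 W bulb
    print(f"50 W @ {volts:g} V draws {amps:.2f} A")
# 50 W @ 12 V draws 4.17 A; 50 W @ 230 V draws 0.22 A
```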




Is 12 volt is enough for a 100 watt light bulb?

12 volts is enough for a 12-volt 100-watt light bulb. It would not be enough for a 120-volt or 240-volt bulb.


Why does a 60 watt bulb glow on 220 volt AC?

Because it is a 220 volt bulb being run on a 220 volt AC supply. At its rated voltage the filament dissipates its rated 60 watts, which heats it until it glows.


What is the resistance of a 40 watt bulb?

This calculation assumes the light bulb runs from a 120 volt source. Watts = amps × volts, so amps = watts/volts: 40/120 ≈ 0.333 amps. R = volts/amps: 120/0.333 = 360 ohms of (hot) resistance in the 40 watt light bulb. Equivalently, R = volts²/watts = 14,400/40 = 360 ohms.
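A short Python check, which also shows where the sometimes-quoted 363.6-ohm figure comes from (rounding the current to 0.33 A); the 120 volt supply is the assumption stated above:

```python
watts, volts = 40.0, 120.0

amps = watts / volts                                      # 0.3333... A
print(f"R = V / I       = {volts / amps:.1f} ohms")       # 360.0
print(f"R = V^2 / P     = {volts ** 2 / watts:.1f} ohms") # 360.0
print(f"rounded I=0.33: = {volts / 0.33:.1f} ohms")       # 363.6 (rounding error)
```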


Can you replace a 110 volt 35 watt halogen bulb with a 120 volt 35 watt bulb?

Yes, you can replace a 110 volt 35 watt halogen bulb with a 120 volt 35 watt bulb, as the wattage is the same. The slight difference in voltage (110V vs. 120V) generally won't affect performance, as most bulbs can operate within a range of voltages. However, ensure the fixture is rated for the wattage to avoid overheating. Always check the specifications of your fixture for compatibility.


What is the resistance of a 28 watt bulb?

That depends on what voltage it's designed to operate from. Power = voltage² / R, so R = voltage² / power. If it's a 117-volt bulb, R = 117² / 28 = 489 ohms. If it's a 240-volt bulb, R = 240² / 28 = 2,057 ohms.


How much electricity is really used when a 220 volt 150 watt bulb is plugged into a 110 volt supply?

The current through a 220 volt 150 watt bulb is I = W/E = 0.68 amps. The resistance of that bulb is R = E/I = 324 ohms. The wattage used by the 220 volt bulb when 110 volts is applied is W = E²/R = 37 watts. Half the voltage across the same resistance gives one quarter of the wattage.
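A Python sketch of the same arithmetic (assuming, as the answer does, that the hot resistance stays constant at the lower voltage):

```python
rated_v, rated_w = 220.0, 150.0

i = rated_w / rated_v            # 0.68 A at rated voltage
r = rated_v / i                  # ~323 ohms hot resistance

w_at_110 = 110.0 ** 2 / r        # P = E^2 / R at half voltage
print(f"R = {r:.0f} ohms, power at 110 V = {w_at_110:.1f} W")  # ~37.5 W
```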


For bulbs connected in series, does the high watt bulb have the high resistance?

No, the highest wattage bulb will have the lowest resistance, because for bulbs of the same rated voltage, R = V²/P.
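A one-line Python check with two illustrative 120-volt bulbs (these ratings are examples, not from the question):

```python
for watts in (40, 100):
    print(f"{watts} W @ 120 V: R = {120 ** 2 / watts:.0f} ohms")
# 40 W -> 360 ohms; 100 W -> 144 ohms: higher wattage, lower resistance.
```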