Best Answer

The answer is I = P/V = (100 W)/(120 V) ≈ 0.833 A, assuming that, as a purely resistive load, the light bulb has a power factor close to 1.0.
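As a quick check, here is a minimal Python sketch of that P/V calculation (the function name is mine, for illustration):

```python
def bulb_current(power_w: float, voltage_v: float) -> float:
    """Current through a purely resistive load: I = P / V."""
    return power_w / voltage_v

print(round(bulb_current(100, 120), 4))  # → 0.8333
```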

Wiki User ∙ 10y ago
Q: The current in a 100-W bulb connected to a 120-V source is?
Continue Learning about Electrical Engineering

What is the total current if a 60 ohm 40 ohm and 20 ohm resistors are all three connected in series across a 120 VAC source?

I = E ÷ R = 120V ÷ (60Ω + 40Ω + 20Ω) = 120V ÷ 120Ω = 1 A
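A short Python sketch of the series calculation (the helper name is mine):

```python
def series_current(source_v: float, resistances_ohm: list) -> float:
    """Series resistances add, so I = E / sum(R)."""
    return source_v / sum(resistances_ohm)

print(series_current(120, [60, 40, 20]))  # → 1.0
```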


A 120 V power supply connected to a 30 ohm resistor will produce how many amps of current?

A 120V power supply connected to a 30 Ohm resistor will produce 120/30 or 4 amps of current.


What is the current when a 60W lamp is connected to 120v?

Since power is volts times amps, the current in a 60W lamp connected to 120V is 0.5A. Since a lamp is a resistive load, there is no need to consider power factor and phase angle, which simplifies the explanation.

Assuming this is an incandescent or halogen lamp (one that uses a filament to make light), there is a trick here: the resistance of a lamp filament varies with temperature, so it does not follow Ohm's law. The resistance is much lower, and thus the current much higher, when the filament is cold, as it is when the lamp is first connected. As the filament heats up, the resistance increases until the lamp reaches its steady operating point of 0.5A.

For a halogen lamp, the operating temperature is about 2800-3400K, so the resistance at room temperature is about 16 times lower than when hot. When first connected, the current is therefore about 8A, but it drops rapidly. The current could be even higher if the lamp is in a cold environment. Non-halogen lamps operate at a lower temperature and have a lower initial current, about 5A.

All of this assumes the lamp is rated for 120V. If it is a 12V/60W lamp, the filament will probably break and create an arc, which may draw a very large current.
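The inrush estimate above can be sketched in Python, assuming the ~16x hot/cold resistance ratio stated for halogen filaments:

```python
V = 120.0      # supply voltage
P_HOT = 60.0   # rated power at operating temperature

i_hot = P_HOT / V        # steady-state current: 0.5 A
r_hot = V / i_hot        # hot filament resistance: 240 ohms
r_cold = r_hot / 16      # ~16x lower when cold (assumption from the text)
i_inrush = V / r_cold    # initial current when first switched on

print(i_hot, r_hot, i_inrush)  # → 0.5 240.0 8.0
```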


Why are you not getting 240v between the two hot wires?

Because they are in phase. To get 240V, you need two 120V alternating-current lines that are 180° out of phase, that is, on opposite phases. Only when one line is at +120V and the other at -120V will you see 240V between the wires.


Why is the white wire used in a 120V circuit?

Because the white wire on a 120 volt circuit is the neutral wire, connected to the silver screw on outlets and switches and to the neutral bar in the service panel.

Related questions

Why would a high watt lamp draw more current than a low watt lamp?

Power is measured in watts: P (watts) = E (volts) × I (amps). Current is determined by the internal resistance R of the light bulb; the lower the resistance, the more current flows.

120V × 0.5A = 60W
120V × 0.83A = 100W

So the 100W light bulb draws more current. We also have Ohm's law: E (volts) = I (amps) × R (ohms). Household voltage stays the same at 120V, so:

For a 100W lamp: R = 120V / 0.83A ≈ 144.6 ohms
For a 60W lamp: R = 120V / 0.5A = 240 ohms

The higher-wattage lamp has the lower resistance.
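The shortcut implicit in the numbers above is R = V²/P, sketched here (function name is mine):

```python
def lamp_resistance(power_w: float, voltage_v: float) -> float:
    """From P = V*I and E = I*R: R = V**2 / P (hot-filament resistance)."""
    return voltage_v ** 2 / power_w

print(lamp_resistance(60, 120))   # → 240.0
print(lamp_resistance(100, 120))  # → 144.0
```

(The exact value for the 100W lamp is 144 ohms; the 144.6 figure above comes from rounding the current to 0.83A first.)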


How does the current in a light bulb connected to a 400 V source compare to the current when this light bulb is connected to a 60 V source?

Let's examine what it means when a bulb is 100W rather than 60W. I'm assuming you meant that they are 120V bulbs being connected to a 240V circuit.¹

With the same voltage on each, and because power is voltage times current, the current must be greater in a 100W bulb than in a 60W bulb. Since an incandescent bulb is a linear load, if you double the voltage, you double the current.² So the current through the 100W bulb is still greater than through the 60W bulb.

Or you may analyze it a bit further. With both on 120V, for more current to flow in the 100W bulb, its resistance must be less than that of the 60W bulb. So you may generalize that at any voltage (the same voltage applied to each), the 100W bulb will always carry more current than the 60W bulb.

¹ Actually, if they are 120V bulbs in a 240V circuit, there is a high probability that they will blow out. But before they do, this is what will happen.

² Well, slightly less than double, because the temperature coefficient of the filament is positive: the hotter it is, the greater the resistance. Although this may seem nonlinear, a light bulb or other temperature-sensitive resistive element is still defined as linear if, over the short term, it obeys Ohm's law at any instant of the waveform.

In short: the current in the 100 watt bulb will be greater. Power is current times voltage, so current is power divided by voltage. The voltage is the same in both cases, so current is proportional to power.
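The linear-load reasoning can be sketched as follows: derive each bulb's resistance from its 120V rating, then compare currents at any applied voltage (names and the 120V rated-voltage default are mine):

```python
def current_at(applied_v: float, rated_power_w: float, rated_v: float = 120.0) -> float:
    """Ideal linear bulb: R is fixed at rated_v**2 / P_rated, then I = V / R."""
    r = rated_v ** 2 / rated_power_w
    return applied_v / r

for v in (60, 120, 240):
    i100 = current_at(v, 100)
    i60 = current_at(v, 60)
    assert i100 > i60  # the 100 W bulb always draws more current
    print(v, round(i100, 3), round(i60, 3))
```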


What is the resistance of metal conductor with a current of 2a and connected to a source providing potential difference of 120v?

R = E / I = 120/2 = 60 ohms.


What is the current of 120v dc source?

Without knowing the load, this cannot be determined. The current drawn from a 120V DC source is set by the load: I = 120V ÷ R(load).


Why a transformer designed for a 120V AC input will often burn out if connected to a 120V DC source?

All transformers are designed to work on AC; they do not work on DC. If you connect an inductor to DC, the current rises until it is limited only by the capacity of the source or by the small resistance of the inductor and conductors. Often this condition overheats and destroys the inductor, or destroys the source. A transformer is no exception, since it is a form of inductor.




A kettle is connected to a 120v outlet with a resistance of 9.6 ohms what is the current required to operate the kettle?

The formula you are looking for is V = IR, where V = voltage, I = current, and R = resistance. Rearranging and plugging in the numbers: I = 120V / 9.6Ω = 12.5A. The kettle would have 12.5 amps of current running through it.
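The rearrangement in the answer, as a minimal sketch (function name is mine):

```python
def current_from_ohms_law(voltage_v: float, resistance_ohm: float) -> float:
    """V = I*R rearranged to I = V / R."""
    return voltage_v / resistance_ohm

print(current_from_ohms_law(120, 9.6))  # → 12.5
```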


How much current flows in a 12 watt radio connected to a 120 volt supply?

Assuming you mean connected to a 120 volt (V) supply: start with Power (P) = Current (I) × Voltage (V). If P = V × I, then I = P / V. So I = 12W / 120V = 0.1A, or 100mA, and that is your current.


An electric heating element has a resistance of 16ohm and is connected to a voltage of 120v How much current will flow in the circuit?

Ohm's law states that E = I × R, or voltage equals current times resistance. Therefore current equals voltage divided by resistance: 120V divided by 16 ohms equals 7.5 amps.


A switch is connected in a series with a 75-W bulb to a source of 120V. what is the potential difference across the switch when it is closed?

If the switch, light bulb, and source are all connected in series and the switch is ideal (has no resistance), then the switch acts as a short. There is no potential difference across the short.


If you have two light bulbs running on a 120V circuit, one 50W and the other 100W, which has the higher resistance?

Power = voltage times current, and Ohm's law states R = V / I. At the same voltage, less power means less current, and less current means more resistance, so the lower-wattage light bulb must have the higher resistance.


A 100-w light bulb is turned on it has an operating voltage of 120 V how much current flows through the bulb?

The formula you are looking for is I = W/E (amps = watts ÷ volts): I = 100W / 120V ≈ 0.83A.
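Plugging the question's numbers into that formula, as a minimal sketch:

```python
def bulb_current(watts: float, volts: float) -> float:
    """I = W / E, i.e. amps = watts / volts."""
    return watts / volts

print(round(bulb_current(100, 120), 2))  # → 0.83
```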