The apparent answer to the question would be I = P/V = (100 W)/(120 V) ≈ 0.833 A, assuming that, as a purely resistive load, the light bulb has a power factor close to 1.0.
I = E ÷ R = 120V ÷ (60Ω + 40Ω + 20Ω) = you figure it out now
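The series-circuit arithmetic the answer leaves as an exercise can be sketched in a few lines of Python (the voltage and resistor values are the ones given above):

```python
# Resistances in series simply add; then apply Ohm's law I = E / R.
E = 120.0                        # supply voltage, volts
resistors = [60.0, 40.0, 20.0]   # series resistances, ohms

R_total = sum(resistors)         # 120 ohms total
I = E / R_total                  # current through the series loop

print(R_total)  # 120.0
print(I)        # 1.0 (amps)
```

So the current the answer is hinting at works out to 1 A.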
A 120V power supply connected to a 30 Ohm resistor will produce 120/30 or 4 amps of current.
Since power is volts times amps, the current in a 60W lamp connected to 120V is 0.5A. Since a lamp is a resistive load, there is no need to consider power factor or phase angle, which simplifies the explanation.

========================

Assuming this is an incandescent or halogen lamp (one that uses a filament to make light), there is a trick here: the resistance of a lamp filament varies with temperature, so it does not behave as a fixed Ohm's-law resistance. The resistance is much lower, and the current therefore much higher, when the filament is cold, i.e. when the lamp is first connected. As the filament heats up, the resistance increases until the lamp reaches its steady operating point of 0.5A.

For a halogen lamp, the operating temperature is about 2800-3400K, so the resistance at room temperature is about 16 times lower than when hot. When first connected, the current is therefore about 8A, but it drops rapidly. The current could be even higher if the lamp is in a cold environment. Non-halogen lamps operate at a lower temperature and would have a lower initial current, about 5A.

And this all assumes the lamp is rated for 120V. If it is a 12V/60W lamp, the filament will probably break and create an arc, which may draw a very large current.
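The inrush estimate above can be checked with a short sketch (the 16:1 cold-to-hot resistance ratio is the figure quoted in the answer, not a universal constant):

```python
# Steady-state filament resistance from the power rating, then an
# inrush estimate using the answer's assumed 16x cold/hot ratio.
V = 120.0
P = 60.0

R_hot = V**2 / P          # steady-state resistance: 240 ohms
I_hot = V / R_hot         # steady-state current: 0.5 A

cold_ratio = 16           # assumption from the answer (halogen lamp)
R_cold = R_hot / cold_ratio
I_cold = V / R_cold       # current at the instant of switch-on

print(I_hot)   # 0.5
print(I_cold)  # 8.0
```

This reproduces the numbers in the answer: 0.5 A once hot, roughly 8 A at switch-on.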
Because they are "in-phase". In order to get 240v, you need two 120v Alternating Current lines that are 180° out of phase, that is, opposite phases. Only when one line is +120v and the other -120v will you see 240v between the wires.
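The phase relationship can be illustrated numerically; a minimal Python sketch, assuming ideal 60 Hz sine waves and the standard peak = RMS × √2 relation:

```python
import math

# Two 120 V RMS lines, 180 degrees out of phase: line B is always
# the negative of line A, so the line-to-line voltage is doubled.
Vrms = 120.0
Vpeak = Vrms * math.sqrt(2)

def line_a(t, f=60.0):
    return Vpeak * math.sin(2 * math.pi * f * t)

def line_b(t, f=60.0):
    return Vpeak * math.sin(2 * math.pi * f * t + math.pi)  # 180 deg shift

t = 1 / 240  # quarter cycle of 60 Hz: line A at its positive peak
print(round(line_a(t) - line_b(t), 1))  # ~339.4 V peak, i.e. 240 V RMS
```

When line A is at +170 V (its peak), line B is at -170 V, giving about 340 V peak between them, which is 240 V RMS.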
Because the white wire on a 120 volt circuit is the neutral wire that is connected to the silver screw on outlets and switches. It is connected to the neutral bar in the service panel.
Power is measured in watts: power (watts) = E (volts) x I (current, amps). The current is determined by the internal resistance (R) of the lightbulb; the lower the resistance, the more current will flow.

120v x 0.5a = 60W
120v x 0.833a = 100W

So the 100W lightbulb will draw more current.

We also have Ohm's law: E (volts) = I (amps) x R (ohms). Household voltage stays the same at 120v.

For a 100w lamp: R = 120v / 0.833 amps = 144 ohms
For a 60w lamp: R = 120v / 0.5 amps = 240 ohms

The higher-watt lamp has the lower resistance.
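The arithmetic above is just I = P/V followed by R = V/I (equivalently R = V²/P); a quick sketch for both bulbs:

```python
# For each lamp: current from the power rating, then resistance
# from Ohm's law. R = V/I is the same as R = V**2 / P.
V = 120.0

for P in (60.0, 100.0):
    I = P / V      # current drawn at rated voltage
    R = V / I      # filament resistance (hot)
    print(P, round(I, 3), round(R, 1))
# 60 W  -> 0.5 A,   240.0 ohms
# 100 W -> 0.833 A, 144.0 ohms
```

Note the exact figure for the 100 W lamp is 144 Ω; the 144.6 sometimes quoted comes from rounding the current to 0.83 A first.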
Let's examine what it means when a bulb is 100W rather than 60W. I'm assuming that you meant to say they are 120V bulbs being connected to a 240V circuit[1]. With the same voltage on each, and because power is voltage times current, the current must be greater in a 100W bulb than in a 60W bulb. Since an incandescent bulb is a linear load, if you double the voltage then you double the current[2]. So the current through the 100W bulb is still greater than through the 60W bulb.

Or you may analyze it a bit more. With both on 120V, for more current to flow in the 100W bulb, its resistance must be less than that of the 60W bulb. So you may generalize that under any voltage (the same voltage applied to each), the 100W bulb will always have more current through it than the 60W bulb.

[1] Actually, if they are 120V bulbs in a 240V circuit, there is a high probability that they will blow out. But before they do, this is what will happen.

[2] Well, slightly less than double, because the temperature coefficient of the filament is positive, so the hotter it is, the greater the resistance. Although this may seem nonlinear, a light bulb or other temperature-sensitive resistive element is still defined as linear if over the short term it obeys Ohm's law at any instant of the waveform.

The current in the 100 watt bulb will be greater. Power is current times voltage, so current is power divided by voltage. Voltage is the same in both cases of this question, so current is proportional to power at 240V.
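The linear-load argument can be sketched as follows, treating each bulb as the fixed resistance implied by its 120 V rating (an idealization; as the footnote notes, a real filament's resistance rises with temperature):

```python
# Resistance implied by a bulb's rating, then current at any applied voltage.
def bulb_current(P_rated, V_rated, V_applied):
    R = V_rated**2 / P_rated   # ideal fixed resistance from the rating
    return V_applied / R

for P in (60.0, 100.0):
    i120 = bulb_current(P, 120.0, 120.0)
    i240 = bulb_current(P, 120.0, 240.0)
    print(P, i120, i240)
# For each ideal bulb the 240 V current is exactly double the 120 V current,
# and the 100 W bulb draws more than the 60 W bulb at either voltage.
```

This confirms both claims in the answer: doubling the voltage doubles the current of a linear load, and the 100 W bulb carries more current at any common voltage.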
R = E / I = 120/2 = 60 ohms.
Without knowing the load, the current cannot be determined.
All transformers are designed to work on AC; they do not work on DC. If you connect an inductor to DC, the current will increase until it is limited by the capacity of the source or by the resistance of the inductor and conductors. Often, this condition will overheat and destroy the inductor, or destroy the source. A transformer is not an exception, as it is a form of inductor.
The formula you are looking for is V = IR, where V = voltage, I = current, and R = resistance. With some formula manipulation and the numbers plugged in you get I = 120V / 9.6Ω, so I = 12.5A. The kettle would have 12.5 amps of current running through it.
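The rearrangement above (V = IR solved for I) is a one-liner in code:

```python
# Rearrange V = I * R to I = V / R for the kettle element.
V = 120.0   # supply voltage, volts
R = 9.6     # element resistance, ohms

I = V / R   # current through the element, in amps (not volts)
print(I)    # 12.5
```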
Assuming that you mean connected to a 120 volt (V) supply, start with Power (P) = Current (I) x Voltage (V). If P = V x I, then I = P/V. I = 12W/120V = 0.1A, or 100mA, and that is your current.
Ohm's law states that E = I x R, or voltage equals current times resistance. Therefore current equals voltage divided by resistance: 120v divided by 16 ohms equals 7.5 amps.
If the switch, light bulb, and source are all connected in series and the switch is ideal (has no resistance), then the switch acts as a short. There is no potential difference across the short.
Power equals voltage times current, and Ohm's law states R = V / I, so the lower-wattage light bulb must have the higher resistance.
The formula you are looking for is I = W/E. Amps = Watts/Volts.