By Ohm's law, Voltage = Current × Resistance. Rearranging for resistance:
R = V / I = 120 / 12 = 10 Ohms
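As a minimal Python sketch of that same arithmetic (the 120 V and 12 A figures come from the answer above):

```python
# Ohm's law rearranged: R = V / I
voltage = 120.0            # volts, from the answer above
current = 12.0             # amps, from the answer above
print(voltage / current)   # -> 10.0 ohms
```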
Assuming DC and resistive loads, resistance equals voltage across the load, divided by the current through it. In this case 120/10 or 12 ohms.
By Ohm's law, resistance is voltage divided by current, so the resistance of a light bulb can be measured by observing the voltage across it simultaneously with the current through it. Interestingly, the hot resistance is significantly different from the cold resistance, so measuring resistance with an ohmmeter will not give a meaningful result. This is because the resistance of a light bulb filament has a positive temperature coefficient. Take a typical 60 W, 120 V light bulb, for instance: its cold resistance is about 16 ohms. Calculate current and power at 120 V from that figure and you get 7.5 A and 900 W. In reality, operating at 60 W, the bulb draws 0.5 A and has a resistance of 240 ohms.
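A small Python sketch contrasting the two cases, using the 16-ohm cold figure quoted in the answer above:

```python
V = 120.0          # supply voltage, volts
P_rated = 60.0     # rated power, watts

R_cold = 16.0                  # ohmmeter reading at room temperature
I_cold = V / R_cold            # current IF the filament stayed cold: 7.5 A
P_cold = V * I_cold            # implied power: 900 W (never actually reached)

I_hot = P_rated / V            # actual operating current: 0.5 A
R_hot = V / I_hot              # hot resistance: 240 ohms

print(I_cold, P_cold, I_hot, R_hot)  # 7.5 900.0 0.5 240.0
```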
Since power is volts times amps, the current in a 60 W lamp connected to 120 V is 0.5 A. Since a lamp is a resistive load, there is no need to consider power factor and phase angle, which simplifies the explanation.

Assuming this is an incandescent or halogen lamp (one that uses a filament to make the light), there is a trick here: the resistance of a lamp filament varies with temperature, so it does not behave as a single fixed Ohm's-law resistance. The resistance is much lower, and thus the current much higher, when the filament is cold, at the moment the lamp is first connected. As the filament heats up, the resistance increases until the lamp reaches its steady operating point of 0.5 A. A halogen filament operates at about 2800-3400 K, so its resistance at room temperature is roughly 16 times lower than when hot; at switch-on the current is about 8 A but drops rapidly. The current could be even higher if the lamp is in a cold environment. Non-halogen lamps operate at a lower temperature and would have a lower initial current, about 5 A. All of this assumes the lamp is rated for 120 V. If it is a 12 V/60 W lamp, the filament will probably break and create an arc, which may draw a very large current.
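A quick sketch of that inrush estimate, assuming the roughly 16:1 hot-to-cold resistance ratio mentioned above:

```python
V = 120.0
P = 60.0
I_steady = P / V               # 0.5 A at the steady operating point
R_hot = V / I_steady           # 240 ohms when fully warmed up
R_cold = R_hot / 16            # ~15 ohms at room temperature (assumed ratio)
I_inrush = V / R_cold          # current at the instant of switch-on
print(round(I_inrush, 1))      # -> 8.0 amps, dropping rapidly as it heats
```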
Because they are "in phase". In order to get 240 V, you need two 120 V alternating-current lines that are 180° out of phase, that is, opposite phases. Only when one line is at +120 V and the other at -120 V will you see 240 V between the wires.
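To illustrate, here is a minimal Python sketch of two 120 V RMS legs shifted 180° apart (60 Hz mains is assumed); their difference peaks at about 339.4 V, which is exactly the peak of a 240 V RMS waveform:

```python
import math

V_peak = 120.0 * math.sqrt(2)   # peak of a 120 V RMS leg, ~169.7 V
f = 60.0                        # Hz, typical North American mains (assumed)

for t in (0.0, 1/240, 1/120):   # a few instants across a half cycle
    leg_a = V_peak * math.sin(2 * math.pi * f * t)
    leg_b = V_peak * math.sin(2 * math.pi * f * t + math.pi)  # 180° shift
    print(round(leg_a, 1), round(leg_b, 1), round(leg_a - leg_b, 1))
```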
Ohm's law states that E = I × R, or voltage equals current times resistance. Therefore current equals voltage divided by resistance: 120 V divided by 16 ohms equals 7.5 amps.
The formula you are looking for is V = IR, where V = voltage, I = current, and R = resistance. With some formula manipulation and the numbers plugged in you get I = 120 V / 9.6 Ω = 12.5 A. The kettle would have 12.5 amps of current running through it.
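A one-line check of that arithmetic in Python (the 9.6 Ω figure comes from the answer above):

```python
# I = V / R for the kettle example
print(120 / 9.6)  # -> 12.5 amps
```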
Power is measured in watts: power (watts) = E (volts) × I (current, in amps). The current is determined by the internal resistance (R) of the light bulb; the lower the resistance, the more current will flow.

120 V × 0.5 A = 60 W
120 V × 0.83 A = 100 W

So the 100 W light bulb will draw more current. We also have Ohm's law: E (volts) = I (amps) × R (ohms). Household voltage stays the same at 120 V, so we have:

for a 100 W lamp: R = 120 V / 0.83 A ≈ 144.6 ohms
for a 60 W lamp: R = 120 V / 0.5 A = 240 ohms

The higher-wattage lamp has the lower resistance.
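The same comparison as a short Python sketch, using the 120 V supply and rated wattages from the answer above:

```python
# Compare operating current and resistance of 60 W and 100 W lamps at 120 V
V = 120.0
for watts in (60.0, 100.0):
    amps = watts / V            # I = P / V
    ohms = V / amps             # R = V / I (equivalently V**2 / watts)
    print(f"{watts:.0f} W lamp: {amps:.2f} A, {ohms:.0f} ohms")
# 60 W lamp: 0.50 A, 240 ohms
# 100 W lamp: 0.83 A, 144 ohms  -> higher wattage, lower resistance
```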
12 ohms.
To find the resistance necessary, one would need to know how much current the bulb draws. Knowing that current, subtract the 14 volts from 120 volts, then divide the result by the bulb's current to find the resistance needed. Once this has been done, multiply the current drawn by the voltage drop to get the wattage rating necessary for the resistor. Another important detail: the power dissipated by the resistor will be much greater than the power consumed by the bulb itself. Finally, if the bulb burns out, the voltage across its contacts will be 120 V. I would not recommend using this method to drop the voltage for the bulb.
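A sketch of that procedure in Python; the 0.25 A bulb current is purely an assumed example value, not something given in the answer:

```python
V_supply = 120.0
V_bulb = 14.0
I_bulb = 0.25                      # amps (assumption, for illustration only)

V_drop = V_supply - V_bulb         # 106 V across the dropping resistor
R = V_drop / I_bulb                # 424 ohms needed
P_resistor = V_drop * I_bulb       # 26.5 W dissipated in the resistor
P_bulb = V_bulb * I_bulb           # only 3.5 W in the bulb itself
print(R, P_resistor, P_bulb)       # note how lopsided the power split is
```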
R = E / I = 120/2 = 60 ohms.
As a check: V = I × R = 2 × 60 = 120 V.
Resistance = Voltage² / Power = 120² / 1100 = 13.1 ohms (3 s.f.)
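The same R = V²/P calculation in Python:

```python
# R = V**2 / P for an 1100 W load on a 120 V supply
V = 120.0
P = 1100.0
print(round(V**2 / P, 1))  # -> 13.1 ohms
```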
"Volts" is electrical pressure applied to a circuit; whereas, "ohms" is electrical resistance to that pressure. One cannot determine ohms from voltage without knowing either the current (in "amps") or power (in "watts"). A normal 120V household circuit can handle a maximum of 20 amps, so using ohm's law of resistance = voltage / current, the minimum resistance required in a 120V household circuit would be 6 ohms. Any less than 6 ohms will cause the circuit breaker to trip.