Best Answer

By Ohm's law, Voltage = Current × Resistance, so:

R = V / I = 120 / 12 = 10 Ohms
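The arithmetic above can be sketched in Python (the function name is illustrative):

```python
# Ohm's law solved for resistance, as in the answer above.
def resistance_ohms(volts, amps):
    """R = V / I."""
    return volts / amps

iron_ohms = resistance_ohms(120, 12)  # 10.0 ohms for the clothes iron
```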

Wiki User ∙ 11y ago
Q: What is the resistance of a clothes iron that draws a current of 12A at 120V?
Continue Learning about Electrical Engineering

What is the resistance of a circuit with 120V and 10A?

Assuming DC and resistive loads, resistance equals voltage across the load, divided by the current through it. In this case 120/10 or 12 ohms.


How much voltage does a line with resistance of 10 ohms and a current of 20 amps?

V = I × R = 20 A × 10 Ω = 200 V
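As a quick check of V = I × R with the values stated in the question:

```python
# V = I * R for a line with 10 ohms resistance carrying 20 A.
line_amps = 20.0
line_ohms = 10.0
line_volts = line_amps * line_ohms  # 200.0 V
```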


What is the resistance of a lightbulb using Ohm's law?

By Ohm's law, resistance is voltage divided by current, so the resistance of a light bulb can be measured by observing the voltage across it and the current through it at the same time. Interestingly, the hot resistance is significantly different from the cold resistance, so measuring a bulb with an ohmmeter will not give a meaningful operating resistance: the filament has a strong positive temperature coefficient. Take a typical 60 W, 120 V bulb. Its cold resistance is about 16 ohms; if that resistance held at 120 V, the bulb would draw 7.5 A and dissipate 900 W. In reality, at its rated 60 W the bulb draws 0.5 A and has a hot resistance of 240 ohms.
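The hot-versus-cold comparison above can be worked through numerically, using the figures given in the answer:

```python
# Hot vs. cold behavior of a 60 W / 120 V incandescent bulb,
# using the figures from the answer above.
VOLTS = 120.0
RATED_WATTS = 60.0
COLD_OHMS = 16.0  # typical room-temperature ohmmeter reading

# If the cold resistance persisted at full voltage (it doesn't):
cold_amps = VOLTS / COLD_OHMS   # 7.5 A
cold_watts = VOLTS * cold_amps  # 900 W

# Actual steady-state operating point at rated power:
hot_amps = RATED_WATTS / VOLTS  # 0.5 A
hot_ohms = VOLTS / hot_amps     # 240 ohms
```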


What is the current when a 60W lamp is connected to 120v?

Since power is volts times amps, the current in a 60 W lamp connected to 120 V is 60 / 120 = 0.5 A. A lamp is a resistive load, so there is no need to consider power factor or phase angle, which simplifies the explanation.

For an incandescent or halogen lamp (one that makes light with a filament) there is a catch: filament resistance varies strongly with temperature, so the lamp does not behave like a fixed Ohm's-law resistor. The resistance is much lower, and the current much higher, when the filament is cold at the moment the lamp is first connected. As the filament heats up, the resistance rises until the lamp settles at its steady operating point of 0.5 A. A halogen filament runs at roughly 2800-3400 K, so its room-temperature resistance is about 16 times lower than when hot: at switch-on the current is about 8 A, dropping rapidly. It can be higher still in a cold environment. Non-halogen lamps run at lower temperatures and have a smaller initial surge, about 5 A.

All of this assumes the lamp is rated for 120 V. If it is a 12 V / 60 W lamp, the filament will probably break and create an arc, which may draw a very large current.
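The switch-on surge estimate can be sketched as follows, treating the answer's ~16× cold-to-hot resistance ratio for halogen filaments as a rough approximation:

```python
# Rough switch-on surge for a 60 W / 120 V halogen lamp, using the
# answer's ~16x cold-to-hot resistance ratio (an approximation).
VOLTS = 120.0
WATTS = 60.0
HOT_TO_COLD_RATIO = 16.0  # halogen figure from the answer

hot_amps = WATTS / VOLTS                  # 0.5 A steady state
hot_ohms = VOLTS / hot_amps               # 240 ohms
cold_ohms = hot_ohms / HOT_TO_COLD_RATIO  # 15 ohms at room temperature
surge_amps = VOLTS / cold_ohms            # ~8 A at switch-on
```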


Why are you not getting 240v between the two hot wires?

Because they are "in-phase". In order to get 240v, you need two 120v Alternating Current lines that are 180° out of phase, that is, opposite phases. Only when one line is +120v and the other -120v will you see 240v between the wires.

Related questions


An electric heating element has a resistance of 16ohm and is connected to a voltage of 120v How much current will flow in the circuit?

Ohm's law states that E = I × R, or voltage equals current times resistance. Therefore current equals voltage divided by resistance: 120V divided by 16 ohms equals 7.5 amps.


A kettle is connected to a 120v outlet with a resistance of 9.6 ohms what is the current required to operate the kettle?

The formula you are looking for is V = IR, where V = voltage, I = current, and R = resistance. Rearranging and plugging in the numbers: I = 120V / 9.6Ω = 12.5A. The kettle draws 12.5 amps of current.
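The rearranged formula, worked in Python:

```python
# Kettle current from Ohm's law rearranged: I = V / R.
kettle_amps = 120.0 / 9.6  # 12.5 A
```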


Why would a high watt lamp draw more current than a low watt lamp?

Power is measured in watts: Power (watts) = E (volts) × I (amps). The current is determined by the internal resistance of the bulb; the lower the resistance, the more current flows.

120V × 0.5A = 60W
120V × 0.833A = 100W

so the 100W bulb draws more current. We also have Ohm's law, E (volts) = I (amps) × R (ohms), and household voltage stays fixed at 120V:

For the 100W lamp: R = 120V / 0.833A = 144 ohms
For the 60W lamp: R = 120V / 0.5A = 240 ohms

The higher-wattage lamp has the lower resistance.
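The comparison can be expressed as two small helper functions (the function names are illustrative):

```python
# Current and resistance of lamps on a 120 V supply.
VOLTS = 120.0

def lamp_amps(watts):
    return watts / VOLTS       # I = P / V

def lamp_ohms(watts):
    return VOLTS ** 2 / watts  # R = V^2 / P, equivalent to V / I

# 60 W lamp: 0.5 A, 240 ohms; 100 W lamp: ~0.833 A, 144 ohms
```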


If a toaster draws 6.2amp of current at 120v how many kilowatt-hours of energy will be used in 3.5 hours?

At 120V and 6.2A the toaster draws 120 × 6.2 = 744 W, or 0.744 kW. Over 3.5 hours that is 0.744 kW × 3.5 h ≈ 2.6 kilowatt-hours.
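The kilowatt-hour arithmetic for the stated toaster values can be sketched as:

```python
# Energy used by the toaster: P = V * I, then E = P * t in kilowatt-hours.
watts = 120.0 * 6.2       # 744 W
kwh = watts / 1000 * 3.5  # ~2.6 kWh over 3.5 hours
```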


what is the resistance of circuit with 120v and 10A?

12 ohms (R = V / I = 120V / 10A).


What size resistor do you need to power a 14 volt bulb from a 120V outlet?

To find the necessary resistance, one needs to know how much current the bulb draws. Subtract 14 volts from 120 volts, then divide that 106-volt difference by the bulb's current to get the resistance needed. Then multiply the current by the 106-volt drop to get the required wattage rating of the resistor. Note that the resistor will dissipate much more power than the bulb itself. Finally, if the bulb burns out, the full 120V will appear across its contacts. I would not recommend using this method to drop the voltage for the bulb.
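A worked sketch of the sizing procedure; the bulb's current is not given in the question, so the 0.25 A below is purely an illustrative assumption:

```python
# Sizing a dropping resistor for a 14 V bulb on a 120 V supply.
SUPPLY_VOLTS = 120.0
BULB_VOLTS = 14.0
bulb_amps = 0.25  # assumed for illustration; not from the question

drop_volts = SUPPLY_VOLTS - BULB_VOLTS   # 106 V across the resistor
resistor_ohms = drop_volts / bulb_amps   # 424 ohms
resistor_watts = drop_volts * bulb_amps  # 26.5 W dissipated in the resistor
bulb_watts = BULB_VOLTS * bulb_amps      # 3.5 W in the bulb
```

Note how the resistor burns 26.5 W to deliver 3.5 W to the bulb, which is why the answer advises against this approach.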


What is the resistance of metal conductor with a current of 2a and connected to a source providing potential difference of 120v?

R = E / I = 120/2 = 60 ohms.



What is the resistance of a 1100 W 120V hair dryer?

Resistance = Voltage² / Power = 120² / 1100 ≈ 13.1 ohms (3 s.f.)
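The R = V²/P calculation, checked in Python:

```python
# R = V^2 / P for the 1100 W, 120 V hair dryer.
dryer_ohms = 120.0 ** 2 / 1100.0  # ~13.1 ohms
```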


How many ohms is a 120 volts?

"Volts" is the electrical pressure applied to a circuit, whereas "ohms" is the electrical resistance to that pressure. One cannot determine ohms from voltage alone without knowing either the current (in amps) or the power (in watts). A normal 120V household circuit can handle a maximum of 20 amps, so using Ohm's law (resistance = voltage / current), the minimum load resistance on a 120V household circuit would be 120V / 20A = 6 ohms. Any less than 6 ohms will cause the circuit breaker to trip.

