Best Answer

"Volts" is electrical pressure applied to a circuit; whereas, "ohms" is electrical resistance to that pressure. One cannot determine ohms from voltage without knowing either the current (in "amps") or power (in "watts"). A normal 120V household circuit can handle a maximum of 20 amps, so using ohm's law of resistance = voltage / current, the minimum resistance required in a 120V household circuit would be 6 ohms. Any less than 6 ohms will cause the circuit breaker to trip.

Q: How many ohms is 120 volts?
Related questions

How many amps are in ohms?

It's not that simple. The basic formula is amps = volts / ohms, so the current depends on the resistance as well as the voltage. Assuming a fixed 60-ohm resistance (the value implied by these figures): 30 volts gives 0.5 amps, 60 volts gives 1 amp, and 120 volts gives 2 amps. A sketch of that calculation follows below.
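A minimal Python version of those figures, with the 60-ohm resistance stated explicitly as the assumption:

resistance = 60.0                    # ohms (assumed, to match the figures above)
for volts in (30.0, 60.0, 120.0):
    amps = volts / resistance        # Ohm's law: I = V / R
    print(volts, "V ->", amps, "A")  # 0.5 A, 1.0 A, 2.0 A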


When the voltage across a resistor is 120 volts and the current through the resistor is 0.5 amps, what is the value of the resistor in ohms?

Using Ohm's law, R = V / I = 120 / 0.5 = 240 ohms.


What ohm value or what combination of resistors is needed to draw 20 amps at 120 V?

Ohms = volts / amperes, so 120 volts / 20 amperes = 6 ohms.


What is the resistance of a 120-W incandescent lamp connected to a 120-V power supply?

Power (watts) = current (amperes) * voltage (volts), and current (amperes) = voltage (volts) / resistance (ohms). From 120 watts = current * 120 volts, the current is 1 ampere. Then 1 ampere = 120 volts / resistance, so the resistance is 120 ohms.
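The same two-step reasoning as a short Python sketch, using only the 120-watt and 120-volt values from the question:

power = 120.0    # watts
voltage = 120.0  # volts
current = power / voltage        # P = V * I, so I = P / V = 1 A
resistance = voltage / current   # R = V / I = 120 ohms
print(current, resistance)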


If the voltage is 12 volts and the resistance is 0.5 ohms, how many amps flow in the circuit?

Ohm's law: volts = amps * ohms, or amps = volts / ohms. 12 volts / 0.5 ohms = 24 amps.


How many watts of heat are produced by a 16-ohm heating element connected to 120 volts?

First find the current: 120 volts / 16 ohms = 7.5 amps. The power in watts equals the volts times the amps, so 120 x 7.5 = 900 watts.
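Worked out in a short Python sketch, using only the 16-ohm and 120-volt values from the question:

voltage = 120.0                 # volts
resistance = 16.0               # ohms
current = voltage / resistance  # I = V / R = 7.5 A
power = voltage * current       # P = V * I = 900 W
print(power)                    # 900.0 watts, the same as V**2 / R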


1 gram per square meter is equal to how many micro ohms?

The question mixes unrelated units; it is like asking, "How many liters is 120 volts?"


LED circuit for 120 volts?

There is no single answer without knowing the value of the resistor involved (in ohms).


How is the rating of a light bulb determined, for instance 100 W or 60 W? Is that how many watts per hour it takes to run it?

No. The wattage is determined by the resistance of the filament in the light bulb. The formula is watts = voltage squared / resistance in ohms, so the resistance of a bulb can be found from resistance in ohms = voltage squared / watts. For a 100-watt bulb at 120 volts, the resistance is (120 x 120) / 100 = 14,400 / 100 = 144 ohms. For a 60-watt bulb at 120 volts, the resistance is 14,400 / 60 = 240 ohms. This is consistent with Ohm's law: current is inversely proportional to the resistance of the circuit. The higher the resistance of the load, the less current flows, and less current means less light emitted from the filament. A short calculation follows below.
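A minimal Python sketch of that calculation, using only the wattages and the 120-volt supply mentioned above:

supply_volts = 120.0
for watts in (100.0, 60.0):
    resistance = supply_volts ** 2 / watts          # R = V^2 / P
    print(watts, "W bulb ->", resistance, "ohms")   # 144.0 and 240.0 ohms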


How many volts does it take to operate a DVD player with 550 ohms resistance and a current of 0.2 amps?

Mains voltage in the U.S. is 120 volts, but for the values given, use the equation E = I x R: volts = amps x resistance = 0.2 x 550 = 110 volts.
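The same one-line calculation in Python, using only the figures given in the question:

current = 0.2                  # amps
resistance = 550.0             # ohms
volts = current * resistance   # E = I * R = 110 V
print(volts)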


How many volts is 600 ohms?

None. Ohms measure resistance, not voltage (electrical potential difference). To get volts you also need a current: volts = amps x ohms.


Voltage drop - explanation in terms of electrons?

If you had a simple series circuit at 120 volts made up of one hot wire (10 ohms), a load (100 ohms), and a neutral wire (10 ohms), the total resistance would be 120 ohms. Dividing 120 volts by 120 ohms gives 1 amp of electron flow. Current stays the same in a series circuit, so 1 amp flows through every part of the circuit. 1 amp times 10 ohms equals 10 volts dropped on each wire, which leaves 120 - (10 + 10) = 100 volts for the load; 1 amp through the 100-ohm load confirms this.
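That worked example as a short Python sketch, keeping the same 10-ohm wires and 100-ohm load:

supply = 120.0                               # volts
resistances = [10.0, 100.0, 10.0]            # hot wire, load, neutral wire (ohms)
current = supply / sum(resistances)          # series circuit: I = V / R_total = 1 A
drops = [current * r for r in resistances]   # voltage drop across each element
print(current, drops)                        # 1.0 A and [10.0, 100.0, 10.0] volts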