Assuming DC and resistive loads, resistance equals the voltage across the load divided by the current through it. In this case, 120 V / 10 A = 12 ohms.
By Ohm's law, voltage = current × resistance, so R = V / I = 120 V / 12 A = 10 ohms.
V = I × R = 2 A × 60 Ω = 120 V
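The answers above are all rearrangements of the same relationship. As a minimal sketch (the function names voltage, current, and resistance are invented here for illustration), the three forms of Ohm's law look like this in Python:

```python
# Ohm's law in its three rearrangements: V = I*R, I = V/R, R = V/I.

def voltage(current_a: float, resistance_ohm: float) -> float:
    """V = I * R"""
    return current_a * resistance_ohm

def current(voltage_v: float, resistance_ohm: float) -> float:
    """I = V / R"""
    return voltage_v / resistance_ohm

def resistance(voltage_v: float, current_a: float) -> float:
    """R = V / I"""
    return voltage_v / current_a

print(resistance(120, 10))  # 12.0 ohms (first answer above)
print(resistance(120, 12))  # 10.0 ohms (second answer above)
print(voltage(2, 60))       # 120 V    (third answer above)
```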
By Ohm's law, resistance is voltage divided by current, so the resistance of a light bulb can be measured by observing the voltage across it at the same time as the current through it. Interestingly, the hot resistance is significantly different from the cold resistance, so measuring a bulb with an ohmmeter will not give its operating resistance. This is because the filament's resistance has a positive temperature coefficient. Take a typical 60 W, 120 V light bulb, for instance: its cold resistance is about 16 ohms. Naively calculating current and power at 120 V from that figure gives 7.5 A and 900 W. In reality, at its rated 60 W the hot bulb draws 0.5 A and has a resistance of 240 ohms.
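Here is a minimal sketch of that comparison in Python, using the 16-ohm cold-resistance figure quoted above (the variable names are invented for this example):

```python
# Compare the naive "cold" prediction with the actual hot operating point
# of a 60 W, 120 V incandescent bulb (cold resistance of ~16 ohms, per above).
VOLTS = 120.0
COLD_R = 16.0        # what an ohmmeter reads at room temperature
RATED_W = 60.0       # actual power once the filament is hot

cold_amps = VOLTS / COLD_R       # 7.5 A -- what the cold reading predicts
cold_watts = VOLTS * cold_amps   # 900 W -- clearly wrong for a 60 W bulb

hot_amps = RATED_W / VOLTS       # 0.5 A -- the real operating current
hot_r = VOLTS / hot_amps         # 240 ohms -- the real hot resistance

print(f"cold prediction: {cold_amps} A, {cold_watts} W")
print(f"hot reality:     {hot_amps} A, {hot_r} ohms")
```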
Here is my full question: A typical 120-volt household circuit delivers 350 watts of power to an appliance, and another 10 watts of power are consumed by the circuit. There is no ground fault. a. How much current is carried by the hot wire? b. How much current is carried by the neutral? c. How much current is carried by the grounding conductor? d. Calculate the resistance of the circuit. By "consumed by the circuit" I assume you mean consumed by the wires. Assuming resistive loads only, the total load is 360 watts, so the current is 360 / 120 = 3 amps. That 3 amps flows in both the hot (a) and the neutral (b). With no ground fault, the grounding conductor (c) carries no current. For (d), the total effective resistance is R = V / I = 120 / 3 = 40 ohms; the wiring itself, dissipating 10 W at 3 A, accounts for about 10 / 3² ≈ 1.1 ohms of that.
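A short Python sketch of those four parts, under the same resistive-load assumption (the variable names here are invented for illustration):

```python
# Work the parts of the question: 350 W appliance plus 10 W lost in the
# wiring, on a 120 V circuit with no ground fault. Resistive loads assumed.
VOLTS = 120.0
APPLIANCE_W = 350.0
WIRING_W = 10.0

total_w = APPLIANCE_W + WIRING_W     # 360 W total
hot_amps = total_w / VOLTS           # (a) 3.0 A on the hot wire
neutral_amps = hot_amps              # (b) the same 3.0 A returns on the neutral
ground_amps = 0.0                    # (c) no ground fault -> no ground current
total_r = VOLTS / hot_amps           # (d) 40 ohms total effective resistance
wiring_r = WIRING_W / hot_amps**2    # ~1.11 ohms of that is the wiring (P = I^2 * R)

print(hot_amps, neutral_amps, ground_amps, total_r, round(wiring_r, 2))
```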
12 ohms
Ohm's law states that E = I × R, or voltage equals current times resistance. Therefore current equals voltage divided by resistance: 120 V divided by 16 ohms equals 7.5 amps.
A circuit breaker is designed to "trip" when more than its rated current passes through it. That current is driven by the 120 V across a load of a certain resistance. The wire carrying the current must be sized to the breaker: for a 15-amp breaker you need at least 14-gauge copper wire. The breaker will be labeled, with its current and voltage ratings printed on it.
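A minimal lookup sketch of the standard copper-wire pairings for common household breaker sizes (the table name MIN_COPPER_AWG and function min_gauge are invented here; always defer to your local electrical code):

```python
# Minimum copper wire size (AWG) for common household breaker ratings.
# Standard pairings; local code governs any real installation.
MIN_COPPER_AWG = {15: 14, 20: 12, 30: 10}

def min_gauge(breaker_amps: int) -> int:
    """Return the minimum copper AWG for a given breaker rating."""
    if breaker_amps not in MIN_COPPER_AWG:
        raise ValueError(f"no entry for a {breaker_amps} A breaker")
    return MIN_COPPER_AWG[breaker_amps]

print(min_gauge(15))  # 14 gauge, as noted above
```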
A 40W fluorescent lamp typically draws around 0.33 amperes in a 120V circuit. This is calculated by dividing the power (40W) by the voltage (120V) to get the amperage.
"Volts" is electrical pressure applied to a circuit; whereas, "ohms" is electrical resistance to that pressure. One cannot determine ohms from voltage without knowing either the current (in "amps") or power (in "watts"). A normal 120V household circuit can handle a maximum of 20 amps, so using ohm's law of resistance = voltage / current, the minimum resistance required in a 120V household circuit would be 6 ohms. Any less than 6 ohms will cause the circuit breaker to trip.
To convert 7.5 VA to amperes, you can use the formula: Amperes = VA / Volts. For example, if the voltage is 120V (typical for household circuits), then 7.5 VA / 120V = 0.0625 amperes.
6 ohms
Using the formula Power = Voltage x Current, we can calculate the current: Current = Power / Voltage. Plugging in the values, we get 1500W / 120V = 12.5A. So, a 1500W resistance heater would draw 12.5A of current at 120V.
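All three of the power conversions above (the 40 W lamp, the 7.5 VA load, and the 1500 W heater) are the same I = P / V division. A minimal sketch, treating VA and W interchangeably as those answers do (amps_from_watts is a name invented here):

```python
# Current drawn by a load at a given voltage: I = P / V.
def amps_from_watts(power_w: float, volts: float = 120.0) -> float:
    return power_w / volts

for watts in (40, 7.5, 1500):
    print(f"{watts} W at 120 V -> {amps_from_watts(watts):.4g} A")
# 40 W   -> 0.3333 A
# 7.5 VA -> 0.0625 A
# 1500 W -> 12.5 A
```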
The formula you are looking for is V = I × R, where V = voltage, I = current, and R = resistance. Rearranging and plugging in the numbers: I = 120 V / 9.6 Ω = 12.5 A. The kettle would have 12.5 amps of current running through it.