You have two known values: P and R. Recall the formula for Power:
Power (watts) = I² * R
Basic algebra lets you rearrange the power equation to solve for current:
Step 1: P/R = I²
Step 2: SQRT(P/R) = I
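A minimal Python sketch of those two steps (the 100 W / 25 Ω figures are hypothetical example values, not from the question):

```python
import math

def current_from_power(power_watts: float, resistance_ohms: float) -> float:
    """Solve P = I^2 * R for I: I = sqrt(P / R)."""
    return math.sqrt(power_watts / resistance_ohms)

# Hypothetical example: a 100 W load across 25 ohms draws sqrt(100 / 25) = 2 A.
print(current_from_power(100, 25))  # 2.0
```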
Ohm's law: Volts = Amps * Ohms, or Amps = Volts / Ohms. For example: 12 volts / 0.5 ohms = 24 amps.
To calculate amps in a circuit, use the formula: Amps = Volts / Ohms. This formula determines the current flowing through a circuit from the voltage and resistance present.
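A one-line Python version of that rearrangement, checked against the 12 V / 0.5 Ω example above:

```python
def amps(volts: float, ohms: float) -> float:
    """Ohm's law rearranged for current: I = V / R."""
    return volts / ohms

# The worked example from the previous answer: 12 V across 0.5 ohms.
print(amps(12, 0.5))  # 24.0
```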
A 15 amp circuit breaker should trip at 15 amps regardless of the load voltage or impedance. If you have 277 volts and 7 ohms, the current would be about 39.6 amps, and a 15 amp circuit breaker should trip.
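A small sketch of that trip check in Python, assuming a simple resistive load and the 15 A rating from the answer:

```python
def breaker_should_trip(volts: float, ohms: float, rating_amps: float = 15) -> bool:
    """A breaker should trip once the load current exceeds its rating."""
    return volts / ohms > rating_amps

print(round(277 / 7, 1))            # 39.6 amps through the load
print(breaker_should_trip(277, 7))  # True: far above the 15 A rating
```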
In a 12 VDC circuit with a 1 kΩ load, there will be 12 mA of current. (Ohm's law: Volts = Amps * Ohms, so Amps = Volts / Ohms.)
"Volts" is electrical pressure applied to a circuit; whereas, "ohms" is electrical resistance to that pressure. One cannot determine ohms from voltage without knowing either the current (in "amps") or power (in "watts"). A normal 120V household circuit can handle a maximum of 20 amps, so using ohm's law of resistance = voltage / current, the minimum resistance required in a 120V household circuit would be 6 ohms. Any less than 6 ohms will cause the circuit breaker to trip.
The resistance of the circuit will be 46 ohms.
Ohms are a measure of resistance (R) in a circuit. Watts are a measure of power (P); in this case, let's assume it is the power used by the resistive element (lamp, heater, etc.). Power (watts) = Current (amps) * Current (amps) * Resistance (ohms), or Resistance (ohms) = Power (W) / (current * current).
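A brief Python sketch of that rearrangement (the 1200 W / 5 A heater figures are hypothetical example values):

```python
def resistance_from_power(power_watts: float, current_amps: float) -> float:
    """Rearrange P = I * I * R to get R = P / (I * I)."""
    return power_watts / (current_amps ** 2)

# Hypothetical example: a heating element drawing 5 A while dissipating 1200 W.
print(resistance_from_power(1200, 5))  # 48.0 ohms
```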
Ohm's law states E = I * R. Isolating I, we get I = E/R. I = 60 V / 12 ohms = 5 amps.
Just use Ohm's law: Voltage = Current * Resistance, so Amps = Voltage / Resistance. Amps = 120 / 260 ≈ 0.46 amps.
Ohm's law: Volts = Amps * Ohms, or Amps = Volts / Ohms. 240 volts / 8500 ohms ≈ 28 milliamps.
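Since the last few answers all apply the same Amps = Volts / Ohms division, here is one Python sketch that runs their numbers together and switches to milliamps for small currents:

```python
# (volts, ohms) pairs taken from the answers above
examples = [(12, 1000), (60, 12), (120, 260), (240, 8500)]

for volts, ohms in examples:
    current = volts / ohms
    if current < 1:
        # Report sub-ampere currents in milliamps for readability.
        print(f"{volts} V / {ohms} ohms = {current * 1000:.1f} mA")
    else:
        print(f"{volts} V / {ohms} ohms = {current:.1f} A")
```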
0.9 watts.
The formula you are looking for is Ohms = Volts/Amps. R = E/I.
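A minimal Python sketch of R = E / I (the 120 V / 5 A figures are hypothetical example values):

```python
def ohms(volts: float, amps: float) -> float:
    """Ohm's law rearranged for resistance: R = E / I."""
    return volts / amps

# Hypothetical example: 120 V driving 5 A implies a 24-ohm load.
print(ohms(120, 5))  # 24.0
```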