Use the formula: P = I^2 R (power = current x current x resistance).
Voltage x current = power (watts)
The power used, assuming unity power factor (a resistive load), is the product of the resistance and the square of the current -- here, 1210 watts.
A resistor placed across the power line: I squared R (current x current x resistance) = heat in watts.
The relation is: P = I^2 R

Where:
I is the current (for example, in amperes)
R is the resistance (for example, in ohms)
P is the power (energy per second) converted from electrical energy to heat.

If the current is in amperes and the resistance is in ohms, then the power is in watts (equal to joules/second).
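The I^2 R relation is easy to sanity-check in a few lines of Python. The 11 A / 10 ohm figures below are assumed for illustration only (they are not given in the question); they happen to reproduce the 1210-watt figure mentioned earlier.

```python
def power_dissipated(current_amps: float, resistance_ohms: float) -> float:
    """Power converted to heat in a resistance, in watts (P = I^2 * R)."""
    return current_amps ** 2 * resistance_ohms

# Example: 11 A through a 10-ohm resistive load (assumed values)
print(power_dissipated(11.0, 10.0))  # 1210.0
```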
In a simple circuit, the amount of voltage, and the resistance of the load. Amps = volts / ohms. For a motor, the back EMF when the rotor is turning reduces the effective voltage across the windings, reducing the current. That is why a motor may burn out if it is powered but cannot turn.
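The amps = volts / ohms relation in one line of Python (the example numbers are made up for illustration):

```python
def current_amps(volts: float, ohms: float) -> float:
    """Ohm's law: I = V / R."""
    return volts / ohms

# Example: a 60-ohm load on a 120 V supply (assumed values)
print(current_amps(120.0, 60.0))  # 2.0
```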
0.9 watts.
Ohms is a measure of resistance (R) in a circuit. Watts is a measure of power (P); in this case let's assume it is the power used by the resistive element (lamp, heater, etc.). Power (watts) = current (amps) x current (amps) x resistance (ohms), or resistance (ohms) = power (W) / (current x current).
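Rearranged for resistance, as a small Python sketch (the 1200 W / 10 A heater values are assumed for illustration):

```python
def resistance_ohms(power_watts: float, current_amps: float) -> float:
    """R = P / I^2, rearranged from P = I^2 * R."""
    return power_watts / current_amps ** 2

# A 1200 W heater drawing 10 A (assumed values) has a 12-ohm element
print(resistance_ohms(1200.0, 10.0))  # 12.0
```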
P=I^2*R. No. 8,000 watts.
Power in watts = voltage in volts x current in amps. Or power in watts = (current in amps) squared x resistance in ohms -- note that it is the current that gets squared, not the resistance.
Well, first of all, if the resistance of the circuit is 10 ohms and you connect 10 volts to it, then the current is 1 amp, not 2. So either there's something else in your circuit that you're not telling us about, or else the circuit simply doesn't exist.

-- If you connect some voltage to some resistance, then the resistance heats up and dissipates (voltage)^2 / (resistance) watts of power, and the power supply has to supply it.

-- If there is some current flowing through some resistance, then the resistance heats up and dissipates (current)^2 x (resistance) watts of power, and the power supply has to supply it.

-- If there's a circuit with some voltage connected to it and some current flowing through it, then the resistance of the circuit is (voltage)/(current) ohms, the parts in the circuit heat up and dissipate (voltage) x (current) watts of power, and the power supply has to supply it.

There's no such thing as "the power of a circuit". The power supply supplies the circuit with some amount of power, the circuit either dissipates or radiates some amount of power, and the two amounts are equal.
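The three cases above can be cross-checked in a few lines of Python. With V = 10 V and R = 10 ohms (so I = 1 A, as the answer notes), all three formulas give the same power:

```python
volts, ohms = 10.0, 10.0
amps = volts / ohms              # Ohm's law: 1 A, not 2

p_from_v_r = volts ** 2 / ohms   # V^2 / R
p_from_i_r = amps ** 2 * ohms    # I^2 * R
p_from_v_i = volts * amps        # V * I

print(amps, p_from_v_r, p_from_i_r, p_from_v_i)  # 1.0 10.0 10.0 10.0
```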
Volts = current (in amps) x resistance (in ohms). Watts = volts x current x power factor. Power factor = 1 in a purely resistive circuit.
The power dissipated across a resistor, or any device for that matter, is watts, or voltage times current. If you don't know one of voltage or current, you can calculate it from Ohm's law: voltage equals resistance times current. So; if you know voltage and current, power is voltage times current; if you know voltage and resistance, watts is voltage squared divided by resistance; and if you know current and resistance, watts is current squared times resistance.
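The three cases in that answer can be wrapped in one small Python helper that picks the right formula from whichever two quantities you know (the function name and example values are made up for illustration):

```python
from typing import Optional

def power_watts(volts: Optional[float] = None,
                amps: Optional[float] = None,
                ohms: Optional[float] = None) -> float:
    """Power from any two of voltage, current, and resistance."""
    if volts is not None and amps is not None:
        return volts * amps          # P = V * I
    if volts is not None and ohms is not None:
        return volts ** 2 / ohms     # P = V^2 / R
    if amps is not None and ohms is not None:
        return amps ** 2 * ohms      # P = I^2 * R
    raise ValueError("need at least two of volts, amps, ohms")

# All three routes agree for a 12 V, 6-ohm, 2 A circuit (assumed values)
print(power_watts(volts=12.0, amps=2.0))  # 24.0
print(power_watts(volts=12.0, ohms=6.0))  # 24.0
print(power_watts(amps=2.0, ohms=6.0))    # 24.0
```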
You can't really convert that. If you multiply volts and amperes, you get watts, a unit of power. Watts is equivalent to joules/second. If you multiply volts x amperes x seconds, you get joules.
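The watts-versus-joules distinction in a short Python sketch (the 5 V, 2 A, 60 s values are assumed for illustration):

```python
volts, amps, seconds = 5.0, 2.0, 60.0   # assumed example values

watts = volts * amps       # power: 10.0 W, i.e. joules per second
joules = watts * seconds   # energy: volts x amps x seconds = 600.0 J

print(watts, joules)  # 10.0 600.0
```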
Power = voltage x current. Multiply the current and the voltage, and keep your units in mind: if your voltage is in volts and your current is in amps, your power will be in watts; if you are using milliamps, your power will be in milliwatts. You can also use P = I^2 * R, the current squared multiplied by the resistance of the circuit, or P = V^2 / R, the voltage squared divided by the resistance of the circuit. The last two can be derived from the basic equations V = I * R and P = V * I. Here's a little helper for you too: "Twinkle twinkle little star, power equals I squared R".
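The derivation mentioned there can be checked numerically: substitute V = I * R into P = V * I and the I^2 R form falls out (the 3 A / 4-ohm values are assumed for illustration):

```python
# Check that P = V * I equals P = I^2 * R once V = I * R is substituted
i, r = 3.0, 4.0   # assumed example values
v = i * r         # Ohm's law: 12.0 V

assert v * i == i ** 2 * r  # both routes give the same power
print(v * i)  # 36.0
```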
We can calculate the current in a common electrical circuit with the formula I = V/R, where I is the current flowing in the conductor, R is the resistance, and V is the voltage. The formula is correct, but the term "conductor" deserves a caveat: a conductor is low in resistance, and R stands for resistance, not conductance.
Here is my full question: A typical 120-volt household circuit delivers 350 watts of power to an appliance, and another 10 watts of power are consumed by the circuit. There is no ground fault. a. How much current is carried by the hot wire? b. How much current is carried by the neutral? c. How much current is carried by the grounding conductor? d. Calculate the resistance of the circuit.

By "consumed by the circuit" I assume you mean consumed by the wires. Assuming resistive loads only, the total load is 360 watts, so the current is 360 W / 120 V = 3 amps. That same 3 amps flows in both the hot and the neutral. Since there is no ground fault, the grounding conductor carries no current. The resistance of the circuit is R = V/I = 120 V / 3 A = 40 ohms.
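The worked household-circuit answer above, written out in Python:

```python
supply_volts = 120.0
appliance_watts = 350.0
wiring_watts = 10.0

total_watts = appliance_watts + wiring_watts   # 360 W, resistive loads assumed
hot_amps = total_watts / supply_volts          # a. current in the hot wire
neutral_amps = hot_amps                        # b. same current returns on the neutral
ground_amps = 0.0                              # c. no ground fault, so no grounding current
circuit_ohms = supply_volts / hot_amps         # d. R = V / I

print(hot_amps, neutral_amps, ground_amps, circuit_ohms)  # 3.0 3.0 0.0 40.0
```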