10V
The power lost in a resistor is (the current through the resistor) squared times (the resistance) watts. That's the same thing as (the voltage across the resistor) squared divided by (the resistance) watts.
It means that the resistor will safely dissipate the heat involved in handling that much power without burning up or out. If you exceed that rating, the resistor will become too hot for its own good. Power is always linked with voltage and current, and current is linked with voltage and the resistance of the resistor. You will do well to remember the tandem of laws: Power [watts] = Potential [volts] * Current [amperes], and Current [amperes] = Potential [volts] / Resistance [ohms]. For example, if you have a 100 ohm resistor rated at 0.25 W, then to satisfy that requirement, a voltage of no more than 5 V can be applied to it, because 5 V / 100 ohm = 0.05 A, and 0.05 A * 5 V = 0.25 W.
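The rating check above can be turned around: given a resistance and a power rating, P = V^2 / R fixes the largest safe voltage. A minimal sketch (the function name is my own, not from the answer):

```python
import math

def max_safe_voltage(resistance_ohms, power_rating_watts):
    """Largest voltage that keeps dissipation within the rating, from P = V^2 / R."""
    return math.sqrt(power_rating_watts * resistance_ohms)

print(max_safe_voltage(100, 0.25))  # 5.0 volts, matching the 100 ohm / 0.25 W example
```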
The power dissipated in a resistor, or any device for that matter, is measured in watts: voltage times current. If you don't know one of voltage or current, you can calculate it from Ohm's law: voltage equals resistance times current. So: if you know voltage and current, power is voltage times current; if you know voltage and resistance, power is voltage squared divided by resistance; and if you know current and resistance, power is current squared times resistance.
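The three forms listed above are equivalent; a quick sketch checking them against each other (the 6 V / 12 ohm numbers are an assumed example, where Ohm's law gives 0.5 A):

```python
def power_from_vi(v, i):
    return v * i          # P = V * I

def power_from_vr(v, r):
    return v ** 2 / r     # P = V^2 / R

def power_from_ir(i, r):
    return i ** 2 * r     # P = I^2 * R

# 6 V across 12 ohms drives 0.5 A; all three forms agree:
print(power_from_vi(6, 0.5))   # 3.0 watts
print(power_from_vr(6, 12))    # 3.0 watts
print(power_from_ir(0.5, 12))  # 3.0 watts
```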
P = IV, where P = power in watts, I = current, and V = voltage. Using Ohm's law, V = IR, where V = voltage, I = current, and R = resistance. First solve for I: I = V/R = 12/30 = 0.4 A. Then use the power equation: P = 0.4 * 12 = 4.8 watts.
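The two-step calculation above, written out as a sketch:

```python
V = 12.0         # volts across the resistor
R = 30.0         # ohms
I = V / R        # Ohm's law: 0.4 A
P = I * V        # power equation: 4.8 W
print(I, P)
```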
All a resistor does is use electrical energy, converting it to heat. So a 10 ohm resistor with 5 volts across it will dissipate 2.5 watts. This will come out as heat, i.e., the resistor will get hot.
The power in a resistor (in watts) is simply the product of the current (in amperes) times the voltage (in volts).
Power dissipation of a resistor, or any load, is the amount of power (in watts) that is converted to heat, light, or another form of energy. In a resistor, power dissipation is given by Joule's law: P = I^2 * R. The power dissipated equals the current through the resistor squared times the resistance in ohms. Since the power is converted to heat, a resistor has a maximum dissipation rating set by the manufacturer, above which the resistor will be damaged.
The current I = 0.18257 amperes.
The physical size tells how much power it can dissipate (watts).
A 12 ohm resistor with 6 volts across it will dissipate 3 watts of power. Current = voltage divided by resistance = 6 / 12 = 0.5 amperes. Power = voltage times current = 6 * 0.5 = 3 watts.
Too much current flowing in the circuit. It sounds like a voltage was applied that exceeded the rating of the resistor. Resistors are rated in watts, which is volts times amps. As an example, you might have a resistor rated at 1/2 watt. If you applied 120 VAC across a 10 ohm resistance, the current would be 12 amps. The power would be 1,440 watts, which is well in excess of the rating and would certainly burn out the resistor.
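The burnout condition described above is just a comparison of V^2 / R against the rating. A sketch (the helper name is my own):

```python
def is_safe(voltage, resistance, rating_watts):
    """True if dissipation V^2/R stays within the resistor's power rating."""
    return voltage ** 2 / resistance <= rating_watts

print(is_safe(120, 10, 0.5))   # False: 1440 W is far beyond a 1/2 W rating
print(is_safe(5, 100, 0.25))   # True: 0.25 W, right at the rating
```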
P = V*I, and V = I*R, so P = V^2 / R = 16/30 ≈ 0.53 watts.
R = E/I = (12)/(0.1) = 120 ohms. (Make it a big one. It dissipates I^2 * R = (0.1)^2 * 120 = 1.2 watts.)
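The sizing calculation above as a sketch: Ohm's law picks the resistance, then I^2 * R tells you how much power the part must handle (in practice you would choose a rating with margin, e.g. a 2 W resistor for 1.2 W of dissipation):

```python
E = 12.0            # volts to drop
I = 0.1             # amperes through the resistor
R = E / I           # Ohm's law: 120 ohms
P = I ** 2 * R      # dissipation: 1.2 W, so pick a comfortably larger rating
print(R, P)
```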
A 12 ohm resistor with 12 V across it will dissipate 12 watts. To determine the temperature of the resistor, you need more information. The resistor and 12 V supply could be inside an oven, in which case it would be very hot, or in Antarctica, in which case it would be very cold.