How much power it can dissipate without being damaged.
Resistors are rated in wattage, so the lowest wattage rating among them sets the rating of the series circuit. The string can't safely handle any more power than that, and beyond it the lowest-rated resistor could be damaged and fail.
Another answer:
When two or more resistors are connected in series, the resistor with the highest resistance will operate at the highest power, because the same current flows through each and P = I^2R. If the power developed in a resistor exceeds its rated power, the resistor may burn out.
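A quick numeric check of the point above: in a series string the current is common to every resistor, so each one dissipates P = I^2R and the largest resistance runs hottest. The supply voltage and resistor values below are made-up examples:

```python
# Series string: the same current flows through every resistor, so each
# dissipates P_i = I^2 * R_i. Assumed example: 12 V across 10, 22 and 47 ohms.
supply_v = 12.0
resistors = [10.0, 22.0, 47.0]  # ohms

current = supply_v / sum(resistors)           # Ohm's law for the whole string
powers = [current**2 * r for r in resistors]  # power in each resistor

for r, p in zip(resistors, powers):
    print(f"{r:5.1f} ohm -> {p * 1000:6.1f} mW")
# The largest resistance dissipates the most power in a series circuit.
```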
Because it's bad; replace it with one of a higher wattage rating. Too much current!
LEDs are light-emitting diodes. Diodes have a forward-voltage threshold that must be reached for them to fully "turn on". A series resistor is used to limit the current once the diode conducts. As the voltage across the diode rises above the turn-on voltage (roughly 0.5 - 0.7 volts for an ordinary silicon diode, and typically 1.8 - 3.3 volts for an LED, depending on colour), the diode will emit light. The LED will only brighten so far, so turning the voltage up significantly beyond that gives very limited payback in light output.
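The usual way to set an LED's operating point is a series resistor sized from the supply voltage, the LED's forward drop, and the desired current. A minimal sketch; the 5 V supply, 2.0 V forward drop and 15 mA target are illustrative assumptions, not values from the answer above:

```python
# Series resistor for an LED: the resistor drops whatever supply voltage is
# left over after the LED's forward drop, at the chosen operating current.
def led_resistor(supply_v, forward_v, current_a):
    if supply_v <= forward_v:
        raise ValueError("supply must exceed the LED forward voltage")
    return (supply_v - forward_v) / current_a

# Assumed example: 5 V supply, ~2.0 V LED, 15 mA target current.
r = led_resistor(5.0, 2.0, 0.015)
print(f"series resistor: {r:.0f} ohms")  # 200 ohms; round up to a standard value
```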
The power rating of a resistor is determined by its physical size. The greater its surface area, the better it can dissipate energy, so the higher its power rating. Knowing its power rating and its resistance will determine the maximum voltage that can be applied to it in order to ensure the resulting current doesn't cause the resistor to overheat. This can be determined by manipulating the equation P = U^2/R.
Manipulate the following equation to make I the subject: P = I^2R, where P = power, I = current, and R = resistance. This gives I = sqrt(P/R).
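Both rearrangements can be checked numerically: I = sqrt(P/R) gives the maximum continuous current, and V = sqrt(P*R) the maximum voltage. A sketch using an assumed 0.25 W, 1 kilohm resistor:

```python
import math

# From the power rating:
#   P = I^2 * R  =>  I_max = sqrt(P / R)
#   P = V^2 / R  =>  V_max = sqrt(P * R)
def max_current(power_w, resistance_ohm):
    return math.sqrt(power_w / resistance_ohm)

def max_voltage(power_w, resistance_ohm):
    return math.sqrt(power_w * resistance_ohm)

# Assumed example: a 0.25 W, 1 kohm resistor.
i = max_current(0.25, 1000.0)   # about 16 mA
v = max_voltage(0.25, 1000.0)   # about 15.8 V
print(f"{i * 1000:.1f} mA, {v:.1f} V")
```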
It can't handle more power than its rated wattage; exceed the rating and it burns out.
There might be ways to estimate the power rating by measuring the size of the resistor: as the physical size of a resistor increases, its power rating also increases.
The ohms tell you the resistance; the wattage tells you how much heat energy it can handle. Not that any resistor ever lived up to its specifications -- I always needed some type of heat sink. Bottom line -- yes.
Wattage.
In order to determine this, it is necessary to find which resistor 'maxes out' at the lowest voltage. This can be found using the equation V_i = sqrt(P_i * R_i) for each resistor, where P_i is the power rating of resistor i and R_i is its resistance. Once the lowest such voltage, V_L, has been found, the power dissipated in each of the other resistors at that voltage can be found using P_i = V_L^2 / R_i. The equivalent power rating is then the sum of the power dissipated in each resistor.
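The procedure above can be sketched in code. Note that it assumes every resistor sees the same voltage (i.e. a parallel connection), which is what the equation P_i = V_L^2 / R_i implies; the resistor values and power ratings below are made-up examples:

```python
import math

# Equivalent power rating of parallel resistors, per the method above:
#   1) each resistor maxes out at V_i = sqrt(P_i * R_i);
#   2) the lowest such voltage, V_L, limits the whole combination;
#   3) at V_L each resistor dissipates V_L^2 / R_i; sum these.
def parallel_power_rating(resistors):
    # resistors: list of (resistance_ohm, power_rating_w) pairs
    v_limit = min(math.sqrt(p * r) for r, p in resistors)
    return sum(v_limit**2 / r for r, p in resistors)

# Assumed example: 100 ohm / 0.25 W, 220 ohm / 0.5 W, 470 ohm / 0.25 W.
combo = [(100.0, 0.25), (220.0, 0.5), (470.0, 0.25)]
print(f"{parallel_power_rating(combo):.3f} W")
```

Here the 100 ohm, 0.25 W part maxes out first (at 5 V), so the combination is rated well below the sum of the individual ratings.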
There is no way to make the conversion. The wattage given is only the amount of power the resistor is dissipating; it's the heat the resistor is radiating. The resistor is said to be using or radiating 270.4 watts. (That's a lot of wattage! How hot would a 270-watt incandescent lamp get? Very hot.) For a resistor, watts equal volts times amps, and it's easy to see how that works with an example using the approximately 270 watts set down in the question. If the resistor had 270 volts across it and 1 amp of current flowing through it, it would be a 270-ohm resistor radiating 270 watts. If it had 1 volt across it and 270 amps flowing through it, it would be a 1/270-ohm resistor, also radiating 270 watts. See how that works? An unlimited number of variations on the theme exist. To find the resistance of the resistor, one of two things must be known in addition to the wattage the resistor is running at: either the voltage across the resistor or the current flow through it must be specified to "finish" the problem, giving R = V^2/P or R = P/I^2.
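The "finish the problem" step is a one-liner either way: R = V^2/P if the voltage is known, R = P/I^2 if the current is known. A sketch with the two 270 W cases; the helper names are mine:

```python
# Resistance from dissipated power plus ONE other quantity -- the point above
# is that wattage alone is not enough to pin the resistance down.
def resistance_from_voltage(power_w, voltage_v):
    return voltage_v**2 / power_w     # from P = V^2 / R

def resistance_from_current(power_w, current_a):
    return power_w / current_a**2     # from P = I^2 * R

# Two ways to radiate 270 W: 270 V at 1 A -> 270 ohms; 1 V at 270 A -> 1/270 ohm.
print(resistance_from_voltage(270.0, 270.0))   # 270.0
print(resistance_from_current(270.0, 270.0))   # ~0.0037
```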
A "pull-up" resistor is a resistor used to perform a specific electronic function -- it is not a different type of resistor. A very small current flows through a pull-up resistor, so it does not need to be high wattage (1/8 watt is generally fine). The value of a pull-up resistor depends on the resistance of the sensor. If the sensor is simply on or off (no resistance), then a typical pull-up resistor might be 10k ohms.
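A quick worked check of the 1/8 watt claim, assuming a 5 V logic supply (the supply voltage is my assumption; the 10k value is from the answer above). The worst case is the line pulled hard to ground, putting the full supply across the resistor:

```python
# Worst-case current and power in a pull-up resistor: the line is held at 0 V,
# so the full supply voltage appears across the resistor.
supply_v = 5.0         # assumed logic supply
pullup_ohm = 10_000.0  # the 10k value from the answer

i = supply_v / pullup_ohm     # 0.5 mA
p = supply_v**2 / pullup_ohm  # 2.5 mW, far below a 1/8 W (125 mW) rating
print(f"{i * 1000:.1f} mA, {p * 1000:.1f} mW")
```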
There is no direct relationship. Power ('wattage') is a measure of the rate at which the resistor can dissipate energy; excessive power means that a resistor cannot dissipate energy fast enough to prevent its temperature becoming excessive -- excessive enough to damage the resistor. As the rate at which a resistor can dissipate energy is determined by its physical size, a resistor's power rating (the maximum continuous power it can handle without exceeding its rated temperature) depends on the physical size of the resistor. On the other hand, the resistance of a resistor is not affected by its physical dimensions, as a resistor can be manufactured to any particular value of resistance for whatever physical size is necessary to achieve its rated power. If you know a resistor's rated power and its resistance, then you can calculate the maximum continuous current that resistor can handle without overheating (using the equation: power = current^2 x resistance).
The voltage supplying the circuit will be divided across the series resistors in proportion to their resistance. The wattage of the resistors has no effect on the distribution, but if you put an under-rated resistor in the circuit, it will fail. For example, if you have a 10 V source and a 1 ohm resistor in series with a 3 ohm resistor, the 1 ohm resistor, being only a quarter of the total resistance, will see a quarter of the voltage, or 2.5 volts. The other 7.5 volts will be seen across the 3 ohm resistor. The total power consumed by the circuit is given by P = VI, V^2/R or I^2R; for this circuit the current is 10/4 = 2.5 amps according to Ohm's Law, and 10 x 2.5 gives 25 watts. Hope that helps ItAintMe
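The worked example above, in code form (same 10 V source with 1 ohm and 3 ohm in series):

```python
# Series divider from the example: voltage splits in proportion to resistance.
source_v = 10.0
r1, r2 = 1.0, 3.0

current = source_v / (r1 + r2)    # 2.5 A through both resistors
v1 = current * r1                 # 2.5 V across the 1-ohm resistor
v2 = current * r2                 # 7.5 V across the 3-ohm resistor
total_power = source_v * current  # 25 W, which also equals I^2 * (r1 + r2)

print(v1, v2, total_power)  # 2.5 7.5 25.0
```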