Nowadays you can buy DC-to-DC converters that do the conversion in a single module.
A cheaper solution, however, is a chain of diodes: each silicon diode drops about 0.7 volts, so four diodes drop about 2.8 volts, reducing 12 volts to roughly 9.2 volts.
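If it helps, here is a minimal sketch of that arithmetic in Python, assuming a fixed 0.7 V drop per silicon diode (real drops vary with current and temperature):

```python
# Sketch: estimating the output of a series diode-drop chain.
# Assumes ~0.7 V per silicon diode; real drops vary with current/temperature.
def diode_chain_output(v_in, n_diodes, v_f=0.7):
    """Return the voltage left after n_diodes series diodes."""
    return v_in - n_diodes * v_f

print(diode_chain_output(12, 4))  # 12 - 4 * 0.7 = 9.2 V
```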
A 9 volt, 0.25 watt load draws about 28 milliamps (0.25 / 9). A 3 volt drop at 28 milliamps requires a resistance of about 108 ohms (3 / 0.028), and that resistance dissipates about 83 milliwatts (3 × 0.028).
A suitable choice would be a 100 ohm 1/4 watt resistor.
This assumes a relatively constant load. If the load varies, you could use a 3 volt zener diode in series instead, although 28 milliamps is only about a third of the nominal 75 milliamps that a typical zener diode needs for stable operation - check the data sheet's current-versus-voltage curve.
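For the arithmetic above, a small sketch (the load values are the ones assumed in this answer):

```python
# Sketch: sizing a series dropping resistor for a fixed load.
# Values follow the answer above: a 9 V, 0.25 W load on a 12 V supply.
v_supply, v_load, p_load = 12.0, 9.0, 0.25

i_load = p_load / v_load       # ~0.028 A (28 mA)
v_drop = v_supply - v_load     # 3 V to drop
r_series = v_drop / i_load     # ~108 ohms
p_resistor = v_drop * i_load   # ~83 mW dissipated in the resistor

print(f"R = {r_series:.0f} ohm, P = {p_resistor * 1000:.0f} mW")
# -> R = 108 ohm, P = 83 mW; a 100 ohm, 1/4 W part is a reasonable stock choice.
```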
Given source voltage: 13.9 V.
Now consider two resistances, R1 and R2, in series across the source.
Let the drop across R2 be V2 = 12 V;
then the drop across R1 is V1 = 13.9 - 12 = 1.9 V.
The same current i = 13.9 / (R1 + R2) flows through both, so:
V2 = i × R2 = 12 V
V1 = i × R1 = 1.9 V
Dividing, V2/V1 = R2/R1,
so R2/R1 = 12 / 1.9 = 6.32 (approx.).
Taking R1 = 100 ohms gives R2 of about 630 ohms.
The voltage across R2 will then act as a 12 V source, provided the load draws far less current than the divider itself.
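As a minimal sketch, the same ratio calculation in code (the 13.9 V source and R1 = 100 ohms are the values assumed above):

```python
# Sketch: picking divider resistors by ratio, per the derivation above.
v_in, v_out = 13.9, 12.0
r1 = 100.0                      # chosen value for the top resistor

ratio = v_out / (v_in - v_out)  # R2/R1 = V2/V1 = 12/1.9, about 6.32
r2 = ratio * r1                 # ~632 ohm; 630 ohm is the nearest round value

i = v_in / (r1 + r2)            # divider current, ~19 mA
print(f"R2 = {r2:.0f} ohm, divider current = {i * 1000:.1f} mA")
```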
Series resistor or voltage divider (using two resistors).
This is often done in fixed circuits using a series resistor with the proper resistance and power ratings for the load. For instance, to drop two volts with a 10-amp load you would need a 0.20 ohm resistance, dissipating 2 × 10 = 20 watts. Consider, however, what happens if the voltage or load fluctuates. If the load current triples for a few seconds, the voltage drop across the resistor also triples and the power dissipated is 9 times normal, which could be enough to fry an undersized resistor.
Stock (inexpensive) resistors are usually supplied in sizes of 10 ohms or more, so it may be more practical to use a voltage-divider circuit: two resistors in series across the 12 V input, with the 10 volts taken from the point between the two resistors and ground. The resistors are selected in proportion to the required voltage drop: R2 = R1 / (Vin/Vout - 1).
In your case, dropping 2 volts to get 10 means the resistor on the ground side must be 5 times larger than the one on the positive side: R2 = 5 × R1.
For instance, R1 = 500 ohms and R2 = 2500 ohms. Note, for illustration, that if the resistors were the same size the output voltage would be half the input, and if the resistor to ground were twice as big as the other the output would be 2/3 of the input voltage - as long as the load draws far less current than the divider itself.
With about 4 mA flowing through the divider (12 / 3000), the 500-ohm resistor dissipates only about 8 milliwatts (P = I² × R) and the 2500-ohm resistor about 40 milliwatts, both far below a quarter of a watt. The catch is that the actual load sits in parallel with the 2500-ohm resistor, so the hypothetical 10-amp (100 W) load would drag the output far below the calculated 10 volts.
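A sketch tying the divider formula to those power figures, and showing why a heavy load breaks the divider (component values are the ones from this example; the 1-ohm load resistance is derived from the hypothetical 100 W load):

```python
# Sketch: two-resistor divider per the formula above, plus dissipation
# and the effect of a load in parallel with the bottom resistor.
v_in, v_out = 12.0, 10.0
r1 = 500.0                                # top resistor (example value)
r2 = r1 / (v_in / v_out - 1)              # bottom resistor: 2500 ohm

i = v_in / (r1 + r2)                      # 4 mA through the unloaded divider
p1, p2 = i**2 * r1, i**2 * r2             # ~8 mW and ~40 mW (P = I^2 * R)

# A load across R2 is in parallel with it and pulls the output down:
r_load = v_out**2 / 100.0                 # the hypothetical 100 W load ~ 1 ohm
r2_eff = 1 / (1 / r2 + 1 / r_load)
v_loaded = v_in * r2_eff / (r1 + r2_eff)  # collapses to ~24 mV, not 10 V
print(f"P1={p1*1000:.0f} mW, P2={p2*1000:.0f} mW, loaded Vout={v_loaded:.3f} V")
```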
A series resistor can do that, but it is not very satisfactory because the drop depends closely on the current drawn. A 5-volt regulator can reduce 12 V to 5 V, and then two series silicon diodes will do the rest (5 - 2 × 0.7 is about 3.6 V).
With a step-down transformer that has multiple tappings/outputs, or by using a series resistor at the output.
No. Give it more volts and it will burn something out!
No. You need 12 volt AC to run a 12 volt AC motor, not 12 volt DC.
Depends on the LED's forward bias threshold. Taking 0.7 volts per LED (that figure is really a plain silicon-diode drop; most LEDs drop more like 1.8 to 3.3 volts, so check the data sheet), 0.7 × 6 = 4.2 V, so pick a resistor that will drop the remaining roughly 7 volts. What is the current? Then just do V = I × R: 7 = I × R.
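As a sketch of that calculation - the 12 V supply, 0.7 V per LED, and 20 mA target current are assumed figures for illustration; take the real values from the LED data sheet:

```python
# Sketch: sizing a series resistor for a string of LEDs.
# Forward voltage and current here are assumptions; use data-sheet values.
v_supply = 12.0    # assumed supply voltage
v_f = 0.7          # per-LED drop as stated above; most LEDs are 1.8-3.3 V
n_leds = 6
i_led = 0.020      # assumed 20 mA target current

v_resistor = v_supply - n_leds * v_f   # voltage the resistor must drop
r = v_resistor / i_led                 # V = I * R
print(f"drop {v_resistor:.1f} V -> R = {r:.0f} ohm")  # 7.8 V -> 390 ohm
```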
This seems like a question from an electrical course, and is probably best answered by your course materials. It's your test question, not ours, and there won't always be someone to ask for the answer. Earn your diploma.
No. You need an AC-to-DC converter to reduce your house voltage to 12 volts.
12 volts DC - the voltage in a normal car battery is nominally a constant 12 volts.
You have your own answer. It is 1.5 amps.
You will have 24 volts DC.