Q: How do you reduce 13.9 dc to 12 volts dc?


Best Answer

Nowadays you can buy DC-to-DC converters that do the conversion in a single module.

A cheaper solution, however, is to use a chain of diodes - each silicon diode drops about 0.7 volts, so four diodes will drop 2.8 volts, reducing 12 volts to 9.2 volts.
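Here is a minimal Python sketch of that diode arithmetic, applied to the question's 13.9 V supply and 12 V target (the 0.7 V per-diode drop is a typical silicon figure and varies with current and temperature):

# Rough diode-count arithmetic for a series diode voltage dropper.
V_IN = 13.9      # supply voltage
V_TARGET = 12.0  # desired output
V_DIODE = 0.7    # assumed forward drop per silicon diode

drop_needed = V_IN - V_TARGET            # 1.9 V to shed
n_diodes = round(drop_needed / V_DIODE)  # 3 diodes
v_out = V_IN - n_diodes * V_DIODE
print(f"{n_diodes} diodes -> about {v_out:.1f} V")  # 3 diodes -> about 11.8 V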

Wiki User · 14y ago

More answers
Wiki User · 14y ago

A 9 volt 0.25 watt load consumes 28 milliamps. (0.25 / 9) A 3 volt drop at 28 milliamps requires a resistance of 108 ohms (3 / 0.028) and that resistance dissipates 83 milliwatts.

A suitable choice would be a 100 ohm 1/4 watt resistor.

This assumes a relatively constant load. If the load varies, you could use a 3-volt zener diode instead, although 28 milliamps is only about a third of the nominal 75 milliamps that a typical zener diode requires for stable operation - check the data sheet for the current-to-voltage curve.
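A short Python sketch of the sizing arithmetic above, using this answer's figures:

# Series-resistor sizing for a 9 V, 0.25 W load on a 12 V supply.
V_SUPPLY = 12.0
V_LOAD = 9.0
P_LOAD = 0.25

i_load = P_LOAD / V_LOAD      # ~0.028 A (28 mA)
v_drop = V_SUPPLY - V_LOAD    # 3 V across the resistor
r_series = v_drop / i_load    # ~108 ohms
p_resistor = v_drop * i_load  # ~0.083 W (83 mW)
print(f"R = {r_series:.0f} ohm, dissipating {p_resistor * 1000:.0f} mW")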

Wiki User · 11y ago

The given source voltage is 13.9 V.

Consider two resistances R1 and R2 in series with the source voltage.

Let the drop across R2 (V2) be 12 V; the drop across R1 (V1) is then 13.9 - 12 = 1.9 V.

V2 = i x R2 = 12 V
V1 = i x R1 = 1.9 V
where i = 13.9 / (R1 + R2)

Dividing the two equations gives V2/V1 = R2/R1, so R2/R1 = 12/1.9 = 6.32 (approx.).

Taking R1 = 100 ohms makes R2 about 630 ohms, and the voltage across R2 then acts as a 12 V source.
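A quick Python check of that divider (unloaded - any current drawn from the tap will pull the 12 V down):

# Verify the unloaded divider: 13.9 V across R1 = 100 ohm and R2 = 630 ohm.
V_IN = 13.9
R1, R2 = 100.0, 630.0

i = V_IN / (R1 + R2)  # ~19 mA through the divider
v2 = i * R2           # voltage across R2
print(f"i = {i * 1000:.1f} mA, V2 = {v2:.2f} V")  # i = 19.0 mA, V2 = 12.00 V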

Wiki User · 11y ago

Series resistor or voltage divider (using two resistors).

This is often done in fixed circuits using a series resistor with the proper resistance and power ratings for the load. For instance, to drop two volts with a 10-amp load you would need a 0.20-ohm resistance (already dissipating 20 watts). Consider, however, what happens if the voltage or load fluctuates. If the load current triples for a few seconds, the voltage drop across the resistor also triples and the power dissipated is 9 times normal, which could be enough to fry an undersized resistor.
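A minimal Python sketch of that hazard - power in a series resistor scales with the square of the load current (P = I^2 x R):

# Power dissipation in a 0.20 ohm series resistor at normal and tripled load.
R = 0.20
for i_load in (10.0, 30.0):
    v_drop = i_load * R
    p = i_load ** 2 * R
    print(f"{i_load:.0f} A -> drop {v_drop:.0f} V, dissipate {p:.0f} W")
# 10 A -> drop 2 V, dissipate 20 W
# 30 A -> drop 6 V, dissipate 180 W (9x normal)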

Stock (inexpensive) resistors are usually supplied in sizes of 10 ohms or more, so it may be more practical to use a voltage-divider circuit: two resistors in series across the 12 V input, with the 10 volts taken from the point between the two resistors and ground. The resistors are selected in proportion to the required voltage drop: R2 = R1 / (Vin/Vout - 1).

In your case, dropping 2 volts to get 10 means that the resistor to the ground side must be 5 times larger than the one to the positive side. R2 = 5 x R1.

For instance, R2 = 2500 ohms and R1 = 500 ohms. Note, for illustration, that if the resistors were the same size, the output voltage would be half the input, and if the resistor to ground were twice as big as the other, the output would be 2/3 of the input voltage - provided the load draws negligible current compared with the divider itself.

Unloaded, the 500-ohm resistor has 2 volts across it and dissipates only 8 milliwatts (P = V^2 / R), and the 2500-ohm resistor, with 10 volts across it, dissipates about 40 milliwatts - both far below a quarter of a watt. Note that the 2500-ohm resistor ends up "in parallel" with the actual load (100 W in the 10-amp example).
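The divider numbers above, worked in a short Python sketch (unloaded divider; a heavy load would pull the tap voltage well below 10 V):

# 12 V -> 10 V divider: R1 on the positive side, R2 to ground.
V_IN, V_OUT = 12.0, 10.0
R1 = 500.0
R2 = R1 / (V_IN / V_OUT - 1)   # = 5 * R1 = 2500 ohm

i = V_IN / (R1 + R2)           # 4 mA through the unloaded divider
p1 = (V_IN - V_OUT) ** 2 / R1  # 2 V across R1 -> 8 mW
p2 = V_OUT ** 2 / R2           # 10 V across R2 -> 40 mW
print(f"R2 = {R2:.0f} ohm, P(R1) = {p1 * 1000:.0f} mW, P(R2) = {p2 * 1000:.0f} mW")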

Wiki User · 9y ago

A series resistor does that, but it is not very satisfactory because the output depends closely on the current drawn. A 5-volt regulator can reduce 12 V to 5 V, and two series diodes will then do the rest.

Wiki User · 13y ago

With a step-down transformer that has multiple tappings/outputs (note that a transformer works only on AC, so DC would first have to be inverted).

Wiki User · 12y ago

By using a resistor in series with the load at the output.
