
In general terms, the voltage across a resistor is given by Ohm's law:

V = I . R

That is, the voltage across the resistor equals the current through it multiplied by its resistance.

So, if the resistor has a value of 100 ohms and the current flowing through the resistor is 10mA then the voltage across the resistor will be 100 x 0.01 = 1 volt.
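The arithmetic above can be sketched in a few lines of Python (variable names are my own, just for illustration):

```python
# Ohm's law: V = I * R
# Worked example from above: a 100-ohm resistor carrying 10 mA
R = 100      # resistance in ohms
I = 0.010    # current in amps (10 mA)
V = I * R    # voltage across the resistor
print(V)     # ~1 volt
```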

If the current flows through two resistors connected in series, the total voltage is divided between them in proportion to their resistances.

V = V1+V2 = I . R1 + I . R2

You want V2/(V1+V2)=3V/12V

Using Ohm's law for each voltage:

V2/(V1+V2) = I . R2 / (I . R1 + I . R2)

= R2 / (R1+R2)

So the answer to your question is: any combination of resistors will do the job, provided that

R2/(R1+R2) = 3V/12V

i.e. R2/(R1+R2) = 1/4, which means R1 must be three times R2.

e.g. R2 = 3 ohms and R1 = 9 ohms

or R2 = 3000 ohms and R1 = 9000 ohms

Depending on your choice, you will get a different current flowing.
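A quick sketch of that trade-off in Python: both resistor pairs produce the same output voltage, but draw very different currents from the 12 V supply (resistor values are the examples above):

```python
# Voltage divider: Vout = Vin * R2 / (R1 + R2)
# Both pairs satisfy R2/(R1+R2) = 3/12 = 1/4, but the divider current differs.
Vin = 12.0
for R1, R2 in [(9, 3), (9000, 3000)]:
    Vout = Vin * R2 / (R1 + R2)   # output voltage, 3 V in both cases
    I = Vin / (R1 + R2)           # current flowing through the divider
    print(f"R1={R1}, R2={R2}: Vout={Vout} V, I={I} A")
```

The first pair wastes 12 W heating the resistors; the second draws only 1 mA.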

In the real world you might need to control the current flowing into your circuit.

For example if the device to be powered is a 3V LED and the required current is 15mA then we can calculate as follows:

12V - 3V = 9V.

9V = 0.015A x R

Therefore R = 9V / 0.015A

Therefore R = 600 ohms.
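The same LED calculation, sketched in Python with the figures from above:

```python
# Series resistor for an LED: the resistor must drop the excess voltage
# (supply minus LED forward voltage) at the desired current.
Vsupply = 12.0   # supply voltage in volts
Vled = 3.0       # LED forward voltage in volts
I = 0.015        # desired LED current in amps (15 mA)
R = (Vsupply - Vled) / I
print(R)         # ~600 ohms
```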

However, in real life this is usually a poor way to control voltage, for several reasons. First, the current in most circuits is not constant (it is roughly constant for an LED, but not for most gadgets). Second, this method wastes energy heating the resistor. Third, a resistor of exactly the right value is often not available.

So, in real life this problem is usually solved with a "voltage regulator", a small integrated circuit (often with just three pins).


Wiki User

15y ago
