The size of the resistor will depend on the load. Let's look at this a bit to see if we can make sense of it. You want to drop the voltage applied to a device from 12 volts AC to 11 volts AC. That means you want to drop 1/12th of the applied voltage (which is 1 volt) across the resistor so that the remaining 11/12ths (which is 11 volts) appears across the load. The only way this is possible is if the resistor has 1/11th of the resistance of the load.

Here's some simple math. If you have an 11 ohm load and a 1 ohm resistor in series, you'll have 12 ohms total resistance (because series resistances add). If 12 volts is applied, the 1 ohm resistor will drop 1 volt, and the 11 ohm load will drop the other 11 volts. A ratio is set up in this example: each ohm of resistance will drop (will "feel") one volt across it. See how that works? If the resistance of the load is 22 ohms and the resistance of the (series) resistor is 2 ohms, each ohm of resistance will drop 1/2 volt, or, if you prefer, each 2 ohms of resistance will drop 1 volt. The result is the same: the load drops 11 volts and the series resistance drops 1 volt. That's just the math, but that's the way things work. You'll need to know something about the load to select a series resistance that drops 1/12th of the applied voltage (1 volt) so your load can have the 11 volts you want it to have.

There is one more bit of news, and it isn't good. If your load is a "dynamic" one, that is, if its resistance changes (it uses more or less power over the time that it is "on"), then a simple series resistor won't allow you to provide a constant 11 volts to that load. What is happening is that the effective resistance of the load is changing over time, and your resistor can't "keep up" with the changes. (The resistor, in point of fact, can't change its resistance at all.) You've got your work cut out for you figuring this one out.
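The divider arithmetic above can be checked with a short Python sketch (the resistor and load values are just the example numbers from the answer):

```python
# Series voltage-divider check: a series resistor at 1/11 of the load
# resistance drops 1/12 of the applied voltage, leaving 11/12 for the load.
def divider_drops(v_in, r_series, r_load):
    """Return (volts across series resistor, volts across load)."""
    i = v_in / (r_series + r_load)   # Ohm's law for the whole loop
    return i * r_series, i * r_load

# 11-ohm load with a 1-ohm series resistor on 12 V
print(divider_drops(12.0, 1.0, 11.0))   # (1.0, 11.0)
# 22-ohm load with a 2-ohm series resistor: same ratio, same split
print(divider_drops(12.0, 2.0, 22.0))   # (1.0, 11.0)
```

Note that the same 1 V / 11 V split comes out either way, because only the ratio of series resistance to load resistance matters.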
You want a resistor which drops three volts. It sounds like you are trying to reduce a 12 VDC source to 9 VDC for a particular device. You need to figure current into the equation. What needs to be stated is how much current the power supply is capable of providing, and how much current your device needs.
Calculating voltage drop across a resistor requires that you know the current flowing through the circuit. The resistance needed is the voltage drop divided by the current.
The resistor would need to be 1/3rd the resistance of the load (that is, 1/4 of the total circuit resistance), so that it drops 3 of the 12 volts.
1000
You will need to know the amount of current flowing through the coil when 220 volts is applied across it. A resistor in series with the coil will limit the current so that the coil only sees 220 volts. The resistor will need to drop 57 volts, so 57 volts divided by the current in amps will give you your required resistance. You will need a resistor with a high power dissipation rating with 57 volts across it; it will probably need to dissipate several watts. For example, a 220 volt coil drawing 300 milliamps (0.3 amps) will require a resistor of 57/0.3 = 190 ohms, and the power dissipation of that resistor would need to be 17.1 watts! You might try using a light bulb as a series resistor. Ensure that it can handle 57 volts. To complicate matters, is that AC or DC you are using? AC relays have inductance built in, which raises the coil's effective "AC resistance" (impedance), so the same coil could draw as little as 0.001 A and you would need a very low-value resistor. Anyway, if any 220 V relay used as much as 300 mA, I doubt you would be able to pick it up with one hand; such a relay coil would draw about 66 W of power! I have a relay with 16 A rated contacts and a 230 V coil. Its current is 0.0015 A, which is equivalent to 0.33 W at 220 V.
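To check the arithmetic, here is a short Python sketch using the 57 V drop from the answer and an assumed coil current of 0.3 A (measure your own relay's actual draw):

```python
# Sizing a series dropping resistor for a relay coil.
# Assumed figures: 57 V must be dropped, coil current 0.3 A (DC).
v_drop = 57.0      # volts the resistor must drop
i_coil = 0.3       # amps through the coil (assumed; measure yours)

r = v_drop / i_coil   # required resistance, from Ohm's law
p = v_drop * i_coil   # power the resistor must dissipate

print(f"R = {r:.0f} ohms, P = {p:.1f} W")   # R = 190 ohms, P = 17.1 W
```

The dissipation figure is the important one: 17 W is far beyond an ordinary 1/4 W resistor, which is why the answer suggests a power resistor or even a light bulb.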
5 volts
Often we want to easily change a resistor value, so we use a variable resistor. For example, we may want to change the resistor that controls the power sent to an LED, so we can easily make it brighter or dimmer. Often, if we use a variable resistor, only a very narrow part of its range is useful. Continuing our example, sometimes we use several LEDs and use the variable resistor to set them all to the same brightness. In that case, the resistance range that makes one LED twice as bright as the others is not very useful, and the resistance range that sends so much power to the LED that it is permanently destroyed is even less useful. So we add a fixed resistor in series with the variable resistor; the fixed resistor sets the minimum net resistance, no matter how we turn the knob on the variable resistor. In our example, the added fixed resistor allows us to turn the variable resistor through its whole range while the LED simply gets brighter and dimmer; without that resistor, a certain range of the knob would allow so much power to go to the LED that it would be destroyed.
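A small numeric sketch shows why the fixed resistor protects the LED. All values here are illustrative assumptions (5 V supply, 2 V LED forward drop, 20 mA maximum safe current, 0-1 kΩ pot), not from the original answer:

```python
# Why a fixed resistor in series with the pot protects the LED.
v_supply, v_led, i_max = 5.0, 2.0, 0.020   # assumed example values

# Minimum total resistance that keeps current at or below i_max:
r_min = (v_supply - v_led) / i_max         # 150 ohms

# With the pot alone (0-1000 ohms), turning it to 0 would let the
# current run away.  A 150-ohm fixed resistor in series caps it:
for pot in (0, 500, 1000):
    i = (v_supply - v_led) / (r_min + pot)
    print(f"pot={pot:4d} ohms -> {i * 1000:.1f} mA")
```

At the pot's zero end the current tops out at exactly the 20 mA limit; without the fixed 150 Ω, that same knob position would be a short through the LED.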
20000 volts
An automobile battery is nominally 12 volts. If it is fully charged, you should actually read about 12.6 volts.
A silicon diode's forward voltage drop is about 0.7 volts, so you need at least that much to turn it on. The current is controlled by a resistor in series.
I don't follow what a "2200 watt resistor" is. A resistor's value is measured in ohms. Ohm's Law is expressed as: voltage drop = current x resistance, and the wattage dissipated in the resistor is volts dropped x current. You have to decide whether your resistor is 2200 ohms or is dissipating 2200 watts; these two alternatives give different results for the current. If it is 2200 watts at 110 volts, the current is 20 amps. If it is 2200 ohms at 110 volts, the current will be 50 milliamps (0.05 amps).
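The two readings give very different currents, as this quick check of P = V x I versus I = V / R shows:

```python
# "2200" read as watts vs. read as ohms, both at 110 V.
v = 110.0
i_if_watts = 2200.0 / v   # P = V*I  ->  I = P/V:  20 A
i_if_ohms  = v / 2200.0   # V = I*R  ->  I = V/R:  0.05 A (50 mA)
print(i_if_watts, i_if_ohms)   # 20.0 0.05
```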
The reactance of the capacitor is 0.339 ohms, therefore the total impedance is sqrt(400^2 + 0.339^2) ≈ 400.0001 ohms. So the resistor drops very nearly 20 volts, very slightly less.
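The impedance magnitude and the resistor's share of the 20 V can be verified numerically (assuming, as the answer does, a 400 Ω resistor in series with the 0.339 Ω reactance):

```python
import math

# Series RC: impedance magnitude and the resistor's share of 20 V.
r, x_c = 400.0, 0.339
z = math.hypot(r, x_c)   # sqrt(400^2 + 0.339^2)
v_r = 20.0 * r / z       # resistor's drop out of 20 V applied
print(f"Z = {z:.4f} ohms, V_R = {v_r:.5f} V")
```

The resistor's drop works out to about 19.99999 V: indistinguishable from 20 V in practice, but slightly less, just as stated.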
To drop a 12 volt source to 6 volts with a resistor, you have to drop 6 volts. The value of the resistor you need is 6 divided by the current the device pulls in amps. For example, if the device pulls half an amp, the resistor has to be 6/0.5, or 12 ohms. Since the resistor drops 6 volts at 1/2 amp, its dissipation is 3 watts (volts x amps). Common practice is to at least double this rating, or the resistor will probably get too hot and may open. I'd use a 10 watt resistor to maintain a good margin of safety, and they're readily available. Use a 12 ohm, 10 watt resistor.
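The sizing steps from that example, written out as a Python sketch:

```python
# Sizing the dropping resistor: 12 V source, 6 V device drawing 0.5 A.
v_drop = 12.0 - 6.0   # volts the resistor must lose
i = 0.5               # device current in amps

r = v_drop / i        # required resistance: 12 ohms
p = v_drop * i        # power actually dissipated: 3 W
p_rated = 2 * p       # doubled for safety margin: 6 W minimum,
                      # so a standard 10 W part is comfortable
print(r, p, p_rated)  # 12.0 3.0 6.0
```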
The information given is inadequate to answer the question.
How much current? Volts/Amps = Ohms. In your case Volts = 1.5
To find the resistance necessary, one would need to know how much current the bulb draws. Knowing that, one would subtract the 14 volts from 120 volts, then divide the difference by the current the bulb draws to find the resistance needed. Once this has been done, one would multiply the current drawn by the voltage drop to get the wattage rating necessary. Another important detail to note is that the power dissipated by the resistor will be much greater than the power consumed by the bulb itself. Finally, if the bulb burns out, the voltage across its contacts will be the full 120 V. I would not recommend using this method to drop the voltage for the bulb.
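Those steps can be sketched in Python. The 0.25 A bulb current here is purely an assumed example (the actual draw must be measured, as the answer says):

```python
# Resistor sizing for a 14 V bulb run from a 120 V line.
v_line, v_bulb, i = 120.0, 14.0, 0.25   # 0.25 A is an assumed figure

r = (v_line - v_bulb) / i           # resistance needed: 424 ohms
p_resistor = (v_line - v_bulb) * i  # 26.5 W wasted in the resistor
p_bulb = v_bulb * i                 # only 3.5 W reaches the bulb
print(r, p_resistor, p_bulb)        # 424.0 26.5 3.5
```

Note how lopsided the dissipation is: the resistor burns roughly 7.5 times the power the bulb uses, which is one reason this approach is not recommended.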
Voltage / Resistance = Current; you do the math.
1,175 watts, which isn't very feasible, as it implies a voltage of 2350 volts across the resistor. Please recheck your numbers and resubmit.
Ohms are the unit of resistance found in Ohm's Law, which says volts = amps x ohms. You can get a voltage drop across a resistance, but you would have to know what current is flowing, and you would have a voltage divider, in effect. You are not "converting" 12 V to 10 V; you are essentially losing two volts through a resistor.
Use Ohm's Law: V = IR, where V = voltage, I = current, and R = resistance.