Q: What size resistor do you need to drop 18 volts dc to 16 volts dc?
Continue Learning about Engineering

What size resistor needed to drop 3 volts dc to 2.5 volts dc?

You will need to take the resistance of the load into account if you are going to design a voltage divider, because the load resistance can completely change the voltage ratio if it is not factored into the calculation. For a simple series dropping resistor, the same current flows through the resistor and the load, so the resistances split in the same ratio as the voltages: the resistor drops 0.5 V while the load gets 2.5 V, giving R(needed) = R(load) × 0.5 / 2.5 = 0.2 R(load). Measure or read R(load), then use R(needed) = 0.2 R(load).
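That ratio calculation can be sketched as follows (the 100 Ω load is just an assumed example value, and the function name is illustrative):

```python
def series_dropping_resistor(r_load, v_supply, v_load):
    """Series resistor that drops v_supply down to v_load across a
    resistive load (the same current flows through both parts)."""
    v_drop = v_supply - v_load
    return r_load * v_drop / v_load

# Drop 3 V to 2.5 V: the resistor is 0.2x the load resistance.
print(series_dropping_resistor(100.0, 3.0, 2.5))  # 20.0 ohms for a 100-ohm load
```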


Is it true that the higher a resistance value the higher the voltage drop?

In a series circuit, yes: the same current flows through every resistor, so by V = IR the larger resistance drops the larger voltage. The physical size of the resistor is set by the power it must dissipate (P = I²R), and the body size needed for a given wattage depends on the type of resistor (the material it is made of).


How do you run a 208 volt AC motor on a 230 volt supply?

If your motor is rated for only 208 volts you may not be able to use 230 volts for it. It will run, but doing so may shorten the life of the motor. Motors are rated with a 10% tolerance for voltage, which means a 208 volt motor has a maximum voltage rating of 228.8 volts. So measure your supply voltage with a good RMS voltmeter and see if it is below 228.8 volts. If it is, you are good to go. If it is not, there are two ways to make it work.

1. Put a high wattage ballast resistor in series with the supply to drop the voltage at the motor into the range of 187.2 to 228.8 volts. To size the resistor, take the horsepower of the motor and multiply it by 746; that gives the wattage the resistor must handle, so use one rated at least 20% larger. Next take the difference between 230 volts (supply) and 208 volts (rated motor voltage) and divide it by the Rated Load Amps (RLA) or Full Load Amps (FLA) of the motor; this will get you close to the resistance value of the ballast resistor you need. So a 1/2 HP motor with a 1.2 A RLA will require about an 18 ohm, 500 watt ballast resistor. This is not the recommended method, but it will work.

2. Install a buck-boost transformer, rated for the HP of the motor, that will buck the supply voltage down to 208 volts. This is the recommended way and the only way it should be done. Any good commercial electrical supply house can help you properly size the transformer that you will need.
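The arithmetic in method 1 can be sketched like this (the function name and the 20% wattage margin are illustrative choices, not part of any standard):

```python
def ballast_resistor(hp, rla, v_supply=230.0, v_motor=208.0, margin=1.2):
    """Rough ballast-resistor sizing as described above: resistance from
    the voltage to be dropped divided by the motor's rated load amps,
    wattage from the motor's horsepower (746 W/HP) plus a margin."""
    ohms = (v_supply - v_motor) / rla
    watts = hp * 746.0 * margin
    return ohms, watts

ohms, watts = ballast_resistor(hp=0.5, rla=1.2)
print(round(ohms, 1), round(watts))  # about 18.3 ohms, 448 W
```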


When resistors are connected in series in a circuit, what are the relationships between the voltage drops across the resistors and the currents through the resistors?

When resistors are connected in series, the same current flows through every resistor, and by V = IR the voltage drop across each resistor is proportional to its resistance. The sum of all the voltage drops equals the source voltage. (In a parallel circuit, by contrast, the current divides among the paths, and the voltage across each branch equals the source voltage.)
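A minimal sketch of the series case, using made-up example values:

```python
def series_drops(v_source, resistances):
    """Voltage drop across each resistor in a series string:
    one common current, drops proportional to resistance."""
    r_total = sum(resistances)
    current = v_source / r_total
    return [current * r for r in resistances]

drops = series_drops(12.0, [1.0, 2.0, 3.0])
print(drops)       # [2.0, 4.0, 6.0]
print(sum(drops))  # 12.0 -- the drops add up to the source voltage
```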


How do you convert a 200 micro-amp DC meter to a 10 volt DC voltmeter?

Start with Ohm's law: 10 volts at 200 microamps requires a total of 50,000 ohms. From that, subtract the resistance of the meter movement itself, and place the resulting multiplier resistor in series with the meter. The series resistor is sized so that 10 volts across the combination drives exactly 200 microamps (full scale) through the meter.
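As a sketch of that calculation (the 1 kΩ coil resistance is an assumed example; real movements vary):

```python
def multiplier_resistor(v_full_scale, i_full_scale, r_meter):
    """Series (multiplier) resistor that turns a current meter into a
    voltmeter: total resistance from Ohm's law, minus the meter's own."""
    return v_full_scale / i_full_scale - r_meter

# 10 V full scale on a 200 uA movement with an assumed 1 kohm coil:
print(round(multiplier_resistor(10.0, 200e-6, 1000.0)))  # 49000 ohms
```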

Related questions

What is the volt drop across a 10k resistor with a 5 volt supply?

If there is nothing else in the circuit, then the voltage drop across the resistor will be the full supply voltage of 5 volts. The size of the resistor does not matter in this case - it will always be 5 volts.


What size resistor in parallel to reduce from 120 volts to 100 volts?

A resistor in parallel with a voltage source will not cause the voltage to drop, theoretically. To get a 20 volt drop you need a resistance in series, and the number of ohms is 20 divided by the current in amps. If the current is unknown or variable, the voltage can't be dropped reliably by using a resistor.


What size resistor is needed to drop from 5vdc to 3vdc?

Use Ohm's law. You need to drop 2 volts: V = I × R, so 2 = I × R and R = 2 / I. You need to know the current before you can pick the resistor.
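In code form, with an assumed example current (the 100 mA figure is purely illustrative):

```python
def dropping_resistor(v_drop, current):
    """Ohm's law: resistance needed to drop v_drop volts at the given current."""
    return v_drop / current

# Dropping 5 V to 3 V (a 2 V drop) at an assumed 100 mA load:
print(round(dropping_resistor(2.0, 0.1)))  # 20 ohms
```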


What size resistor is needed to drop 14 volts dc to 12 volts dc?

This cannot be answered without knowing the current: the resistor must drop 2 volts, so R = 2 volts divided by the load current in amps.




What size resistor is needed to drop 24 volts dc to 12volts dc?

It depends on the rated power (current draw) of the device: the resistor must drop 12 volts, so R = 12 volts divided by the load current in amps.


What size resistor to change 12 volts to 3 volts for a led?

Assuming that you're talking about 12 V DC, the resistor must drop the remaining 9 volts, so R = 9 volts divided by the LED current in amps; for a typical 20 mA LED that works out to 450 ohms. If you mean AC, then you would need a step-down transformer with a 4:1 ratio.
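The standard LED current-limiting calculation, sketched with an assumed 20 mA operating current:

```python
def led_resistor(v_supply, v_led, i_led):
    """Current-limiting resistor for an LED: drop the excess supply
    voltage across the resistor at the LED's operating current."""
    return (v_supply - v_led) / i_led

# 12 V supply, 3 V LED, assumed 20 mA operating current:
print(round(led_resistor(12.0, 3.0, 0.020)))  # 450 ohms
```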


How much ohms resistor is needed to drop 12 volts to 5 volts?

The size of the resistor will depend on the load. Let's look at this a bit to see if we can make sense of it.

You want to drop the applied voltage to a device from 12 volts to 5 volts. That means you want 7 of the 12 applied volts to drop across the resistor so that the remaining 5 volts appear across the load. Since the same current flows through both, this is only possible if the resistor has 7/5ths the resistance of the load.

Here's some simple math. If you have a 5 ohm load and a 7 ohm resistor in series, you'll have 12 ohms total resistance (because series resistances add). If 12 volts is applied, the 7 ohm resistor will drop 7 volts, and the 5 ohm load will drop the other 5 volts. A ratio is set up in this example: each ohm of resistance drops (or "feels") one volt. See how that works? If the load were 10 ohms and the series resistor 14 ohms, each ohm would drop half a volt, and the result would be the same split of 7 volts across the resistor and 5 volts across the load. That's the math, and that's the way things work. You'll need to know something about the load to select a series resistance that drops 7 of the 12 applied volts so your load can have the 5 volts you want it to have.

There is one more bit of news, and it isn't good. If your load is a "dynamic" one, that is, if its resistance changes (it uses more or less power over the time that it is "on"), then a simple series resistor won't allow you to provide a constant 5 volts to that load. The effective resistance of the load changes over time, and your resistor can't "keep up" with the changes. (The resistor, in point of fact, can't change its resistance at all.) You've got your work cut out for you figuring this one out.
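The worked example above can be checked numerically (the 5 Ω load is the assumed example value from the text):

```python
def series_resistor_and_check(r_load, v_supply, v_load_target):
    """Size the series resistor, then verify what the load actually sees.
    Same current in both parts, so resistances split like the voltages."""
    r_series = r_load * (v_supply - v_load_target) / v_load_target
    current = v_supply / (r_series + r_load)
    v_load_actual = current * r_load
    return r_series, v_load_actual

# 12 V down to 5 V across an assumed 5-ohm load:
print(series_resistor_and_check(5.0, 12.0, 5.0))  # (7.0, 5.0)
```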


What size resistor is needed to allow a 220 volt coil in a relay operate on 277 volts?

You will need to know the amount of current flowing through the coil when 220 volts is applied across it. A resistor in series with the coil will limit the current so that the coil only sees 220 volts; the resistor has to drop the other 57 volts. So 57 volts divided by the coil current in amps gives the required resistance. The resistor will also need a high power dissipation rating, probably several watts. For example, a coil drawing 300 milliamps (0.3 amps) would require a resistor of 57 / 0.3 = 190 ohms, and the power dissipated in it would be 57 × 0.3 = 17.1 watts! You might try using a light bulb as a series resistor, ensuring that it can handle 57 volts.

To complicate matters: is that AC or DC? AC relay coils rely on their built-in inductance to raise the coil's AC impedance, so the same coil could draw as little as 0.001 A, in which case you would need a much higher value resistor. Anyway, if a 220 V relay really drew 300 mA, its coil would dissipate about 66 W and you would hardly be able to pick it up with one hand! A real 230 V relay with 16 A rated contacts draws about 0.0015 A, equivalent to 0.33 W at 220 V.
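The sizing arithmetic, sketched with the 300 mA figure from the example (a real coil current must be measured):

```python
def series_coil_resistor(v_supply, v_coil, i_coil):
    """Resistor to run a relay coil from a higher supply: drop the
    excess voltage at the coil's operating current.
    Returns (resistance in ohms, power dissipated in watts)."""
    v_drop = v_supply - v_coil
    return v_drop / i_coil, v_drop * i_coil

# 220 V coil on a 277 V supply, assumed 300 mA coil current:
ohms, watts = series_coil_resistor(277.0, 220.0, 0.3)
print(round(ohms), round(watts, 1))  # 190 ohms, 17.1 W
```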


What size resistor to drop 9 dc to 3 dc at 1 amp?

6 ohms. The resistor must drop 9 − 3 = 6 volts at 1 amp, so R = 6 / 1 = 6 ohms, and it will dissipate 6 watts.


What size resistor do you need to power a 14 volt bulb from a 120 outlet?

To find the resistance necessary, you need to know how much current the bulb draws. Subtract 14 volts from 120 volts, then divide the difference by the bulb's current to get the resistance needed. Then multiply the current by that voltage drop to get the wattage rating necessary. Another important detail is that the power dissipated by the resistor will be much greater than the power consumed by the bulb itself. Finally, if the bulb burns out, the full 120 V will appear across its contacts. I would not recommend using this method to drop the voltage for the bulb.
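As a sketch with an assumed bulb current (the 0.5 A figure is purely illustrative), which also shows how lopsided the power split is:

```python
def bulb_dropper(v_supply, v_bulb, i_bulb):
    """Series resistor (and its wattage) to run a low-voltage bulb
    from a higher supply -- not recommended, as noted above.
    Returns (resistance in ohms, resistor dissipation in watts)."""
    v_drop = v_supply - v_bulb
    return v_drop / i_bulb, v_drop * i_bulb

# Assumed 0.5 A bulb current for illustration:
ohms, watts = bulb_dropper(120.0, 14.0, 0.5)
print(ohms, watts)  # 212.0 ohms, 53.0 W (vs. only 7 W in the bulb)
```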


What size wire do I need for 45 amps at 200 feet?

Assuming 120 volts, you would need at least 4 AWG, which would give you a 5.6 percent voltage drop at 56.25 amps (i.e., 45 amps continuous, derated per the 80 percent design rule). At 240 volts you would only need 6 AWG, giving you a 4.5 percent voltage drop at 45 amps.