Best Answer

If there is nothing else in the circuit, the resistor drops the full supply voltage of 5 volts. The resistor's value does not matter here: a lone element across an ideal source always sees the whole supply voltage, so it will always be 5 volts. Only the current through the resistor (5 V / 10 kΩ = 0.5 mA) depends on the resistance.
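As a quick sanity check, here is the arithmetic in Python (a minimal sketch, assuming an ideal 5 V source with only the 10 kΩ resistor across it):

```python
# With a lone resistor across an ideal source, the resistor drops the
# full supply voltage; only the current depends on the resistance.
V = 5.0          # supply voltage, volts
R = 10_000.0     # resistor value, ohms

I = V / R        # current through the resistor: 0.5 mA
P = V * I        # power dissipated: 2.5 mW
```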


Wiki User

13y ago
Q: What is the volt drop across a 10k resistor with a 5 volt supply?

Why 250 ohm resistance using for transmitter calibration?

In this situation, to calibrate a transmitter you need a power circuit and a communicator circuit. The HART communicator used in the calibration process is connected in parallel with the power circuit, which consists of an ammeter, a 250 ohm resistor, and the power source all in series. As the transmitter varies its output current in milliamps, that current creates a voltage drop across the 250 ohm resistor; for example, 4 mA produces a 1 volt drop. Now, back to the HART communicator: since it is in parallel with the power circuit, it is also in parallel with the resistor, so it sees exactly the same 1 volt. Parallel elements always share one voltage, regardless of their own resistance. The absolute value does not even matter to the communicator; it reads the relative changes in voltage across the resistor to measure the changes in the transmitter's output.
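The sense-resistor arithmetic can be sketched as follows (a standard 4-20 mA loop is assumed; the 250 Ω figure comes from the answer above):

```python
r_sense = 250.0                            # HART sense resistor, ohms
# Voltage across the resistor at a few transmitter output currents.
drops = {i_ma: (i_ma / 1000.0) * r_sense   # convert mA to A, then V = I * R
         for i_ma in (4.0, 12.0, 20.0)}    # 4-20 mA span maps to 1-5 V
```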


What is the result in voltage of applying a 2.7 ohm resistor to a 12 volt DC circuit?

The question has already stated clearly that the applied voltage is 12 volts DC. Provided that the power supply can maintain its output voltage while supplying the current (i.e., its effective internal resistance is small), and that the 2.7 ohm resistor is the only external element connected across the supply's output, the voltage across the resistor is exactly 12 volts DC. The current through the resistor is 12 / 2.7 = 4.44 amperes (rounded), and the power dissipated by the resistor is 12² / 2.7 = 53.3 watts.
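The same figures in Python (assuming the ideal supply described above):

```python
V = 12.0         # applied DC voltage, volts
R = 2.7          # resistance, ohms

I = V / R        # current: 12 / 2.7 ≈ 4.44 A
P = V**2 / R     # power: 144 / 2.7 ≈ 53.3 W
```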


How much ohms resistor is needed to drop 12 volts to 5 volts?

The size of the resistor will depend on the load. Let's look at this a bit to see if we can make sense of it. You want the device to see 5 volts of the 12 volts applied, which means the series resistor must drop the other 7 volts. Since the resistor and the load carry the same current, their voltage drops are in proportion to their resistances, so the resistor must have 7/5 of the load's resistance. Here's some simple math. If you have a 5 ohm load and a 7 ohm resistor in series, you'll have 12 ohms total resistance ('cause series resistances add). With 12 volts applied, 1 amp flows, and each ohm of resistance drops ("feels") 1 volt: 7 volts across the resistor and 5 volts across the load. See how that works? The same ratio holds at any scale: a 50 ohm load would need a 70 ohm resistor, and so on. So you need to know something about the load, its resistance or its current draw, before you can pick the resistor: R = 7 V divided by the load current. There is one more bit of news, and it isn't good. If your load is a "dynamic" one, that is, if its resistance changes (it uses more or less power over the time that it is on), then a simple series resistor won't hold the load at a constant 5 volts. The divider ratio shifts as the load's effective resistance changes, and the resistor can't "keep up"; in point of fact, it can't change its resistance at all. For a varying load you'd want a voltage regulator instead of a plain resistor.
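The series-divider reasoning can be sketched as a small helper (illustrative only; the 5 Ω load is a hypothetical value, since the question does not specify one):

```python
def divider_drops(v_supply, r_series, r_load):
    """Voltage across each element of a two-resistor series divider."""
    i = v_supply / (r_series + r_load)   # series elements share one current
    return i * r_series, i * r_load      # (drop on resistor, drop on load)

# For the question's numbers: 12 V in, 5 V wanted at the load.
# Assuming (hypothetically) a 5-ohm load, the resistor must be 7 ohms.
v_r, v_load = divider_drops(12.0, 7.0, 5.0)   # 7 V dropped, 5 V delivered
```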


What is the amps In a simple electric circuit with a 12 volt supply and a 24 ohm resistor?

0.5 amps (I = V/R = 12 V / 24 Ω).


60 ohm and 40 ohm and 20 ohm resistors are all 3 connected in series across a 120VAC source what is the total voltage across 40 ohms resistance?

Total resistance is 120 ohms. The 120VAC will be split evenly over this 120 ohm load, so every ohm of resistance gets a volt. So there will be a 40 volt drop across the 40 ohm resistor.
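Checking the three-resistor string in code:

```python
resistors = [60.0, 40.0, 20.0]      # ohms, in series
v_supply = 120.0

i = v_supply / sum(resistors)       # 120 V / 120 ohms = 1 A everywhere
drops = [i * r for r in resistors]  # 60 V, 40 V, 20 V respectively
```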

Related questions

The voltage drop across a resistor is 1.0 V for a current of 3.0 A in the resistor what is the current that will produce a voltage drop of 9.0 V across the resistor?

The resistor is 1/3 of an ohm (1.0 V ÷ 3.0 A). A 9 volt drop across that resistor would drive 27 amps through it, so the resistor would need a power rating of at least 9 V × 27 A = 243 watts.
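The same steps in Python:

```python
V1, I1 = 1.0, 3.0
R = V1 / I1          # resistance: 1/3 ohm

V2 = 9.0
I2 = V2 / R          # current at a 9 V drop: 27 A
P = V2 * I2          # minimum power rating: 243 W
```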


What is the rule for voltage across each resistor?

Voltage across a resistor = resistance × current through the resistor (Ohm's law, V = IR). In a series circuit the individual drops add up to the supply voltage; in a parallel circuit every resistor sees the same voltage.


Why thermocouple voltmeter is connected across shunt resistance and electronic voltmeter across capacitor?

There is no such thing as a dedicated "thermocouple voltmeter" in this context. An ordinary analogue or digital millivoltmeter is connected across (in parallel with) the shunt to measure the current through it. Say the shunt is 1 ohm and the meter reads a 1.5 volt drop; by Ohm's law the current is I = V / R = 1.5 V / 1 Ω = 1.5 amps. Any type of DC voltmeter, analogue or digital, can likewise be used to measure the voltage across a capacitor, provided the meter's input resistance is high enough that it does not significantly discharge the capacitor while the reading is taken.
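The shunt calculation, as corrected, is just Ohm's law:

```python
r_shunt = 1.0            # shunt resistance, ohms
v_meter = 1.5            # voltage read across the shunt, volts

i = v_meter / r_shunt    # current through the shunt: 1.5 A
```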


What is the amount of power consumed by a 60 watt 220 volt lamp when it is connected across 110 volt supply?

Assuming the lamp's resistance stays the same, power scales with the square of the voltage, so at half the rated voltage the lamp draws one quarter of its rated power: 60 W × (110/220)² = 15 watts. (In practice a filament's resistance falls when it runs cooler, so the real figure is somewhat higher.)
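A quick sketch of the arithmetic, assuming the lamp behaves as a fixed resistance:

```python
P_rated, V_rated = 60.0, 220.0
R = V_rated**2 / P_rated    # filament resistance: about 807 ohms

V_applied = 110.0
P = V_applied**2 / R        # quarter power at half voltage: 15 W
```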


What is the voltage drop per led in a 25 led series circuit with 120 volt supply and leds are 3.0 volts 20 ma with a resistor of 5000 ohms?

There is not enough information to answer this question exactly. As the current through a diode increases, the voltage dropped across it increases, and the relationship is not linear like a resistance, so a datasheet for the diode or a curve tracer would be needed to obtain the correct function. The numbers as given are also inconsistent: if all 25 series LEDs dropped 3 volts each, 75 volts would appear across the diodes and 45 volts across the 5000 ohm resistor. Ohm's law then gives 45 V / 5 kΩ = 9 mA, which contradicts the 20 mA stated in the question.
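The consistency check from the answer, in code:

```python
n_leds, v_led = 25, 3.0
v_supply, r = 120.0, 5000.0

v_string = n_leds * v_led          # 75 V across the LED string
v_resistor = v_supply - v_string   # 45 V left for the resistor
i = v_resistor / r                 # 9 mA, not the stated 20 mA
```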


In order for a 30 volt 90 watt lamp to work properly in a 120 volt supply the required series resister in ohm is?

A 30 volt, 90 watt lamp draws 3 amps. The series resistor carries the same 3 amps, since elements in series share one current (Kirchhoff's current law). The resistor must drop the remaining 120 − 30 = 90 volts, so its resistance is 90 V / 3 A = 30 ohms. (By the way: the resistor dissipates 90 V × 3 A = 270 watts. That is a lot of power for a resistor.)
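Walking the numbers through in code:

```python
P_lamp, V_lamp, V_supply = 90.0, 30.0, 120.0

i = P_lamp / V_lamp          # lamp current: 3 A, same through the resistor
v_r = V_supply - V_lamp      # voltage the resistor must drop: 90 V
r = v_r / i                  # required resistance: 30 ohms
p_r = v_r * i                # resistor dissipation: 270 W
```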


A resistor Connected to a 9 Volt supply dissipates 3 watts calculate the value of the resistance?

27 ohms (R = V²/P = 9² / 3 = 81 / 3).
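Verifying with P = V²/R rearranged to R = V²/P:

```python
V, P = 9.0, 3.0
R = V**2 / P     # 81 / 3 = 27 ohms
```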


How you get 5.6 volts 12 ma from 230 volts supply in current transfomer?

You would not connect a current transformer to a 230 V supply for this. To get 5.6 V at 12 mA, use a 230-to-6 volt transformer, then drop the 6 volts to 5.6 with a series resistor: R = (6 − 5.6) V / 0.012 A ≈ 33 ohms.
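The dropping-resistor value follows from Ohm's law applied to the 0.4 V difference:

```python
v_in, v_out = 6.0, 5.6
i_load = 0.012               # 12 mA load current

r = (v_in - v_out) / i_load  # about 33.3 ohms; 33 is the nearest standard value
```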