If there is nothing else in the circuit, then the voltage drop across the resistor will be the full supply voltage of 5 volts. The resistance value does not matter in this case; the drop will always be 5 volts.
In this situation, to calibrate a transmitter you need a power circuit and a communicator circuit. The HART communicator used in the calibration process is connected in parallel with the power-source circuit. The power-source circuit is the one that has an ammeter, a 250 ohm resistor, and a power source all connected in series. As the transmitter sends its output in mA, it creates a voltage drop across the 250 ohm resistor. Let's say the drop across the resistor is 1 volt. Now, back to the HART communicator. Since it is in parallel with the power circuit, it is also in parallel with the resistor, so it sees exactly the same 1 volt across its terminals; parallel elements always share the same voltage, regardless of their individual resistances. The communicator simply reads the changes in this voltage to measure the changes in transmitter output.
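Assuming the standard 4-20 mA loop described above, the voltage seen across the 250 ohm resistor (and therefore by the HART communicator wired in parallel with it) can be sketched as:

```python
# Voltage across the 250-ohm sense resistor for a 4-20 mA loop (V = I * R).
# The HART communicator, wired in parallel, sees this same voltage.
R_SENSE = 250.0  # ohms

def loop_voltage(current_ma):
    """Voltage drop across the sense resistor for a given loop current in mA."""
    return (current_ma / 1000.0) * R_SENSE

print(loop_voltage(4))   # 1.0 V at 4 mA (0 % of span)
print(loop_voltage(20))  # 5.0 V at 20 mA (100 % of span)
```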
The question has just stated clearly that the applied voltage is 12 volts DC. Provided that the power supply is capable of maintaining its output voltage while supplying some current (i.e., that the effective internal resistance of the power supply is small), and that the 2.7 ohm resistor is the only external element connected to the power supply's output, the voltage across the resistor is exactly 12 volts DC. The current through the resistor, supplied by the 12 volt DC supply, is 12 / 2.7 = 4.44 amperes (rounded). The power dissipated by the resistor is 12^2 / 2.7 = 53.3 watts!
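The two steps above, Ohm's law for the current and V²/R for the power, work out as:

```python
# Ohm's law and power for a 2.7-ohm resistor across a stiff 12 V DC supply.
V = 12.0  # volts
R = 2.7   # ohms

I = V / R        # current through the resistor
P = V**2 / R     # power dissipated by the resistor

print(f"I = {I:.2f} A")  # about 4.44 A
print(f"P = {P:.1f} W")  # about 53.3 W
```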
The size of the resistor will depend on the load. Let's look at this a bit to see if we can make sense of it. You want to drop the applied voltage to a device from 12 volts AC to 11 volts AC. That means you want to drop 1/12th of the applied voltage (which is 1 volt) across the resistor so that the remaining 11/12ths (which is 11 volts) will appear across the load. The only way this is possible is if the resistor has 1/11th of the resistance of the load.

Here's some simple math. If you have an 11 ohm load and a 1 ohm resistor in series, you'll have 12 ohms of total resistance (because series resistances add). If 12 volts is applied, the 1 ohm resistor will drop 1 volt, and the 11 ohm load will drop the other 11 volts. A ratio is set up in this example, and each ohm of resistance will drop (will "feel") a volt across it. See how that works? If the resistance of the load is 22 ohms and the resistance of the series resistor is 2 ohms, each ohm of resistance will drop 1/2 volt, or, if you prefer, each 2 ohms of resistance will drop 1 volt. The result is the same: the load drops 11 volts and the series resistor drops 1 volt. That's the math, and that's the way things work. You'll need to know something about the load to select a series resistance that drops 1/12th of the applied voltage (1 volt) so your load can have the 11 volts you want it to have.

There is one more bit of news, and it isn't good. If your load is a "dynamic" one, that is, if its resistance changes (it uses more or less power over the time that it is "on"), then a simple series resistor won't allow you to provide a constant 11 volts to that load. What is happening is that the effective resistance of the load is changing over time, and your resistor can't "keep up" with the changes. (The resistor, in point of fact, can't change its resistance at all.) You've got your work cut out for you figuring this one out.
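The ratio reasoning above can be turned into a small calculation. Assuming a purely resistive, constant (non-dynamic) load, the series resistor is sized from the load current it must share:

```python
# Size a series dropper resistor so a resistive load sees a target voltage.
# Assumes a constant (non-dynamic) load, as the answer above cautions.
def series_dropper(v_supply, v_load, r_load):
    """Resistance that drops (v_supply - v_load), given the load resistance."""
    i = v_load / r_load          # load current, shared by the series resistor
    return (v_supply - v_load) / i

print(series_dropper(12.0, 11.0, 11.0))  # 1.0 ohm for an 11-ohm load
print(series_dropper(12.0, 11.0, 22.0))  # 2.0 ohms for a 22-ohm load
```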
0.5 amps
Total resistance is 120 ohms. The 120 VAC will be divided across this 120 ohms in proportion to resistance, so every ohm of resistance drops a volt. So there will be a 40 volt drop across the 40 ohm resistor.
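The proportional-drop rule used above is just the series voltage divider; a quick check, assuming the 40 ohm resistor sits in a 120 ohm series string on 120 V:

```python
# Series voltage divider: each resistor drops a share of the supply
# proportional to its resistance.
def divider_drop(v_supply, r_this, r_total):
    return v_supply * r_this / r_total

print(divider_drop(120.0, 40.0, 120.0))  # 40.0 V across the 40-ohm resistor
```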
The resistor is 1/3 of an ohm. A 9 volt drop across the resistor would cause a draw of 27 amps through it. The resistor would need a power rating of at least 243 watts.
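Under the figures above (a 1/3 ohm resistor with a 9 V drop), the current and required wattage work out as:

```python
# Current and power for a 1/3-ohm resistor dropping 9 V.
V = 9.0
R = 1.0 / 3.0

I = V / R   # Ohm's law: about 27 A
P = V * I   # power the resistor must dissipate: about 243 W

print(I, P)
```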
Voltage across a resistor = resistance × current through the resistor (V = I × R).
There is no such thing as a thermocouple voltmeter. An analogue or digital millivolt meter or voltmeter is connected across a shunt, i.e. in parallel with it, to measure the current through the resistor. Say the shunt value is 1 ohm; then Ohm's law gives the current as I = V / R. If the voltage drop read on the meter is 1.5 volts, the current is 1.5 V / 1 Ω = 1.5 amps. Any type of DC voltmeter, analogue or digital, can be used to measure the voltage across a capacitor, provided the capacitor is large enough that the reading stays steady, and as long as the supply can source more current than the load draws.
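The shunt measurement described above, with Ohm's law applied the right way round (current = voltage / resistance), looks like:

```python
# Reading current via a shunt resistor: I = V / R.
def shunt_current(v_measured, r_shunt):
    """Current through the shunt, from the voltage measured across it."""
    return v_measured / r_shunt

print(shunt_current(1.5, 1.0))  # 1.5 A through a 1-ohm shunt at 1.5 V
```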
What is the amount of power consumed by a 60 watt 220 volt lamp when it is connected across 110 volt supply?
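One common way to work this out, assuming the filament resistance stays at its rated value (in reality a tungsten filament's resistance falls when it runs cooler, so the true figure is somewhat higher):

```python
# Power of a 60 W / 220 V lamp on a 110 V supply,
# assuming constant filament resistance (a simplification).
P_RATED = 60.0
V_RATED = 220.0
V_APPLIED = 110.0

r_lamp = V_RATED**2 / P_RATED      # about 806.7 ohms at rated conditions
p_actual = V_APPLIED**2 / r_lamp   # at half voltage: one quarter of rated power

print(p_actual)  # about 15 W
```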
There is not enough information to answer this question correctly. As the current through a diode increases, the voltage dropped across it increases. This relationship is not a linear function like a resistance, so a datasheet for the diode or a curve tracer would be necessary to obtain the correct function. To explain further: if there were 25 series LEDs, each with a 3 volt drop, a 75 volt potential would be measured across the diodes, and 45 volts would be measured across the 5000 ohm resistor. Ohm's law shows that 45 volts across 5000 ohms equals 9 mA. This violates the 20 mA criterion in the question.
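The worked figures above (25 LEDs at an assumed fixed 3 V each, on the 120 V supply implied by 75 V + 45 V, with a 5000 ohm resistor) can be checked like this:

```python
# Check the series-LED example: supply voltage minus total LED drop
# appears across the resistor, which sets the loop current.
N_LEDS = 25
V_LED = 3.0        # assumed fixed forward drop per LED (a simplification)
V_SUPPLY = 120.0   # implied by the 75 V + 45 V figures in the example
R = 5000.0

v_leds = N_LEDS * V_LED         # 75 V across the diodes
v_resistor = V_SUPPLY - v_leds  # 45 V across the resistor
i = v_resistor / R              # 9 mA, well under the 20 mA in the question

print(v_leds, v_resistor, i * 1000)
```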
A 30 volt 90 watt lamp has 3 amps going through it. The series resistor also has 3 amps going through it, by Kirchhoff's current law. The voltage across the resistor is 90 volts. With 3 amps, that is 30 ohms. (By the way, the resistor must be rated to carry 270 watts. That is a lot of power for a resistor.)
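The arithmetic above can be laid out step by step, assuming the 120 V supply implied by the 90 V resistor drop:

```python
# Series resistor for a 30 V / 90 W lamp on a 120 V supply.
P_LAMP = 90.0
V_LAMP = 30.0
V_SUPPLY = 120.0  # implied by the 90 V drop quoted in the answer

i = P_LAMP / V_LAMP             # 3 A through the lamp, and the resistor too
v_resistor = V_SUPPLY - V_LAMP  # 90 V across the resistor
r = v_resistor / i              # 30 ohms
p_resistor = v_resistor * i     # 270 W the resistor must handle

print(r, p_resistor)
```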
About 27 ohms.
You would not connect a current transformer to a 230 V supply. To get 5.6 V at 12 mA you could use a 230-to-6-volt transformer, then drop the supply from 6 V to 5.6 V using a 33 ohm series resistor.
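The resistor value quoted above follows from Ohm's law applied to the 0.4 V difference:

```python
# Series resistor to drop a 6 V supply to 5.6 V at a 12 mA load.
v_drop = 6.0 - 5.6   # 0.4 V to get rid of
i = 0.012            # 12 mA load current

r = v_drop / i       # about 33 ohms
p = v_drop * i       # a few milliwatts, so any small resistor will do

print(r, p)
```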