I'm pretty sure it doesn't matter: the total resistance of the circuit is what counts. The diagrams I have seen, however, predominantly put the resistor on the + (anode) side of the LED.
A resistor in parallel with a voltage source will not cause the voltage to drop, theoretically. To get a 20 volt drop you need a resistance in series, and the number of ohms is 20 divided by the current in amps. If the current is unknown or variable, the voltage can't be dropped reliably by using a resistor.
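A minimal sketch of that series-drop calculation, assuming a known load current (the 50 mA figure below is an illustrative assumption, not from the answer):

```python
# Series dropping resistor: R = V_drop / I (Ohm's law).
v_drop = 20.0   # volts to drop across the series resistor (from the answer above)
i_load = 0.05   # assumed load current in amps (example value)

r_series = v_drop / i_load    # 400 ohms
p_resistor = v_drop * i_load  # power the resistor must dissipate: 1.0 W

print(f"R = {r_series:.0f} ohms, P = {p_resistor:.1f} W")
```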
Two hundred.
Of course it depends entirely on the resistance of the resistor, in ohms. The higher the resistance, the less the circuit behaves like a short circuit.
Yes. You can use a voltage divider. Say, for instance, one 1KOhm resistor in series with a 3KOhm resistor. Connect the 3k resistor to the 48 volts and connect the 1k resistor to ground. The 1k resistor will have 12 volts across it. These resistors should be rated at least 1 watt, as together they will dissipate 0.576 watts (0.432 W in the 3k, 0.144 W in the 1k) and get warm. Now, if you attempt to pull power from the 1k resistor, note that regulation will be poor because the impedance of the load will go in parallel with the 1k resistor and change its value.
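A quick sketch of that divider, including the loading effect described above (the 1k load resistance is an assumed example, not from the answer):

```python
# 48 V across a 3k (top) + 1k (bottom) voltage divider.
v_in, r_top, r_bottom = 48.0, 3000.0, 1000.0

# Unloaded output: 48 * 1k / 4k = 12 V
v_out = v_in * r_bottom / (r_top + r_bottom)

# Power check: I = 48 / 4000 = 12 mA
i = v_in / (r_top + r_bottom)
p_top, p_bottom = i**2 * r_top, i**2 * r_bottom  # 0.432 W and 0.144 W

# Loading: an assumed 1k load in parallel with the bottom resistor
r_load = 1000.0  # hypothetical load
r_parallel = r_bottom * r_load / (r_bottom + r_load)  # 500 ohms
v_loaded = v_in * r_parallel / (r_top + r_parallel)   # ~6.9 V, regulation is poor

print(v_out, p_top + p_bottom, v_loaded)  # 12.0, ~0.576, ~6.86
```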
To drop 18 volts DC to 16 volts DC, you need a resistor that can handle the current flowing through the circuit. The voltage drop required is 2 volts (18V - 16V). To calculate the resistor value, use Ohm's Law (V = I × R); rearranging gives R = V/I. The specific resistor value depends on the current (I) in the circuit. For example, if the current is 1 amp, you would need a 2-ohm resistor (2V/1A = 2Ω).
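As a sketch, the same R = V/I calculation at a few assumed currents (the 0.1 A and 0.5 A cases are illustrative additions; the 1 A case matches the example above):

```python
# Dropping 18 V to 16 V: the resistor must drop 2 V at the load current.
v_drop = 18.0 - 16.0  # 2 V

for i_load in (0.1, 0.5, 1.0):  # assumed example currents in amps
    r = v_drop / i_load          # Ohm's law: R = V / I
    p = v_drop * i_load          # power the resistor must dissipate
    print(f"I = {i_load} A -> R = {r} ohms, P = {p} W")
# I = 1.0 A -> R = 2.0 ohms, P = 2.0 W (matches the example above)
```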
If there's nothing else between the ends of the resistor and the power supply, then the voltage across the resistor is 24 volts, and the current through it is 2 amperes.
If there is nothing else in the circuit, then the voltage drop across the resistor will be the full supply voltage of 5 volts. The value of the resistor does not matter in this case - it will always drop 5 volts.
9 V.
In its simplest use, a resistor in a circuit limits the amount of current flow or decreases the voltage applied to a device. For example, if you had a 12 volt battery and you needed to connect it to a device that ran on 9 volts, a resistor could be chosen to reduce the 12 volts to the 9 volts required.
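A small sketch of picking that dropping resistor, assuming the 9 volt device draws a known, steady current (the 100 mA figure is an assumption for illustration):

```python
# Drop 12 V to 9 V for a device drawing an assumed steady 100 mA.
v_supply, v_device = 12.0, 9.0
i_device = 0.1  # assumed load current in amps

r = (v_supply - v_device) / i_device  # 3 V / 0.1 A = 30 ohms
p = (v_supply - v_device) * i_device  # 0.3 W, so use at least a 0.5 W part

print(f"R = {r} ohms, rated at least {p} W")
```

Note this only holds while the device draws that steady current; if the current changes, so does the voltage drop, as the earlier answer about unknown or variable currents points out.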
600 VDC.
Depends on the LED forward bias threshold. If each LED drops 0.7 volts (though note that 0.7 V is really the figure for an ordinary silicon diode; most LEDs drop roughly 1.8 to 3.3 V), then 0.7 × 6 = 4.2 V, so pick a resistor that will drop around 7 volts. What is the current? Then just use V = IR: 7 = IR, so R = 7/I.
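A sketch of that LED string calculation using the answer's own 0.7 V figure (the 11.2 V supply and 20 mA current are assumed example values):

```python
# Six LEDs in series, using the 0.7 V per-LED figure from the answer above.
v_supply = 11.2       # assumed supply: 4.2 V for the LEDs plus ~7 V for the resistor
v_led, n_leds = 0.7, 6
i_led = 0.020         # assumed LED current, 20 mA

v_resistor = v_supply - v_led * n_leds  # ~7 V to drop
r = v_resistor / i_led                  # V = IR -> R = V/I = 350 ohms

print(f"R = {r:.0f} ohms")
```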
The question is a bit ambiguous, but I will try to address it. If the 6 ohm resistance is in series with another resistance then some of the 5 volts would be dropped across the 6 ohm resistance and the remainder of the voltage would be dropped across the other resistance. To calculate the voltage, use the 'resistor voltage divider equation' (Google it). If the 5 volts is applied across only a 6 ohm resistance, then the top of the resistor is at 5 volts and the bottom of the resistor would be at 0 volts. The resistor would drop all of the voltage.
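For reference, a minimal sketch of the resistor voltage divider equation mentioned above (the 4 ohm companion resistor is an illustrative assumption):

```python
# Voltage divider: V_out = V_in * R2 / (R1 + R2)
def divider(v_in, r1, r2):
    """Voltage across R2 when R1 (top) and R2 (bottom) are in series across v_in."""
    return v_in * r2 / (r1 + r2)

# If the 6-ohm resistor is in series with, say, another 4 ohms across 5 V:
print(divider(5.0, 4.0, 6.0))  # 3.0 V dropped across the 6-ohm resistor

# With only the 6-ohm resistor across the 5 V source, it drops all 5 V:
print(divider(5.0, 0.0, 6.0))  # 5.0
```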