This question cannot be answered because you did not specify the current.
With no idea of the current drawn by the load, no answer is possible.
It depends on the rated power of the device (and therefore the current it draws).
Use Ohm's law. To drop 2 volts: V = I x R, so 2 = current x resistance, which gives resistance = 2 / current. So you need to know the current.
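For example, a minimal sketch of that calculation (the 0.5 A load current below is only an assumed figure; substitute the current your load actually draws):

```python
# Ohm's law sizing for a series dropping resistor.
# load_current is an assumed example value; use your real load current.
drop_volts = 2.0        # desired drop across the resistor
load_current = 0.5      # amps (assumption for illustration)

resistance = drop_volts / load_current   # R = V / I
power = drop_volts * load_current        # P = V * I, sets the needed wattage rating

print(f"Series resistor: {resistance:.1f} ohms, dissipating {power:.2f} W")
```

The power figure matters as much as the resistance: it tells you what wattage rating the resistor must have.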
There is no relation between a resistor's resistance in ohms and its physical size. The physical size indicates the resistor's power rating. If the power rating is too small for the power it must dissipate, the resistor can be destroyed.
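As a rough illustration (the current, resistance, and 0.25 W rating below are all assumed example values), you can compare the power a resistor actually dissipates with its rating:

```python
# Check a resistor's dissipation against its power rating.
# All three figures are assumed example values.
current = 0.1            # amps through the resistor
resistance = 470.0       # ohms
rating_watts = 0.25      # marked power rating of the part

dissipated = current ** 2 * resistance   # P = I^2 * R
if dissipated > rating_watts:
    print(f"{dissipated:.2f} W exceeds the {rating_watts} W rating; use a physically bigger resistor")
else:
    print(f"{dissipated:.2f} W is within the {rating_watts} W rating")
```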
4 AWG is the initial answer without considering the length. For 210 feet of 4 AWG the resistance is about 0.052 ohms. Using Ohm's law, the voltage drop would be 60 x 0.052 = 3.12 volts. Usually the drop should be less than 10% of the supply voltage, so for a 120 volt supply you would be allowed 12 volts of drop, and for 240 volts, 24 volts. So you would be okay using either 120 or 240 volts, and 4 AWG would support both the current and the length. There may be other factors in your application, but from what you specified, use 4 AWG.
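A quick sketch of that check (the per-1000-foot resistance for 4 AWG copper and the 60 amp load are the approximate figures used above):

```python
# Voltage-drop check for a cable run, using approximate figures from above.
length_ft = 210
ohms_per_kft = 0.25          # approx. resistance of 4 AWG copper per 1000 ft
current = 60.0               # amps
supply_volts = 120.0         # try 240.0 as well

wire_resistance = ohms_per_kft * length_ft / 1000.0
drop = current * wire_resistance
limit = 0.10 * supply_volts  # rule of thumb: keep the drop under 10% of supply

print(f"Drop: {drop:.2f} V (limit {limit:.1f} V) -> {'OK' if drop <= limit else 'too much'}")
```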
A potentiometer.
If there is nothing else in the circuit, then the voltage drop across the resistor will be the full supply voltage of 5 volts. The value of the resistor does not matter in this case; it will always be 5 volts.
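A quick illustration (the resistor values are arbitrary): whatever the resistance, the drop across a lone resistor on a 5 volt source is the full 5 volts; only the current changes.

```python
# The drop across a single resistor on a 5 V source is always 5 V;
# the resistance only determines how much current flows.
supply = 5.0
for resistance in (100.0, 1_000.0, 10_000.0):    # arbitrary example values
    current = supply / resistance
    drop = current * resistance                  # always equals the supply voltage
    print(f"R = {resistance:>7.0f} ohms: I = {current * 1000:.2f} mA, drop = {drop:.1f} V")
```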
You will need to take the resistance of the load into account if you are going to design a voltage divider. The resistance of the load can completely change the voltage ratio of a voltage divider if it is not factored into the calculation. You can measure or read R(load); then R(needed) = 0.8 x R(load).
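As a sketch of that rule of thumb (the 9 volt supply and 100 ohm load below are assumed example values; with R(needed) = 0.8 x R(load), the load ends up with 1/1.8 of the supply):

```python
# Loaded-divider sketch: series resistor = 0.8 * R(load), per the answer above.
# The supply voltage and load resistance are assumed example values.
supply = 9.0
r_load = 100.0
r_series = 0.8 * r_load

v_load = supply * r_load / (r_series + r_load)   # plain voltage-divider formula
print(f"Load sees {v_load:.2f} V with a {r_series:.0f} ohm series resistor")
```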
A resistor in parallel with a voltage source will not cause the voltage to drop, theoretically. To get a 20 volt drop you need a resistance in series, and the number of ohms is 20 divided by the current in amps. If the current is unknown or variable, the voltage can't be reliably dropped by using a resistor.
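To see why a fixed resistor only works for a known, steady current, here is a small illustration with arbitrary example currents:

```python
# The resistance needed for a fixed 20 V drop changes with the load current,
# so a single resistor can't hold the drop if the current varies.
drop = 20.0
for current in (0.1, 0.5, 1.0):              # arbitrary example currents, in amps
    print(f"At {current} A you would need {drop / current:.0f} ohms")
```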
The value of the resistor will depend on the load. Let's look at this a bit to see if we can make sense of it. You want to drop the applied voltage to a device from 12 volts AC to 11 volts AC. That means you want to drop 1/12th of the applied voltage (which is 1 volt) across the resistor so that the remaining 11/12ths of the applied voltage (which is 11 volts) will appear across the load. The only way this is possible is if the resistor has 1/11th of the resistance of the load.

Here's some simple math. If you have an 11 ohm load and a 1 ohm resistor in series, you'll have 12 ohms total resistance ('cause they add). If 12 volts is applied, the 1 ohm resistor will drop 1 volt, and the 11 ohm load will drop the other 11 volts. A ratio is set up in this example, and each ohm of resistance will drop a volt (will "feel" a volt) across it. See how that works? If the resistance of the load is 22 ohms and the resistance of the (series) resistor is 2 ohms, each ohm of resistance will drop 1/2 volt, or, if you prefer, each 2 ohms of resistance will drop 1 volt. The same thing results: the load will drop 11 volts and the series resistance will drop 1 volt. That's the math, and that's how things work. You'll need to know something about the load to select a series resistance that drops 1/12th of the applied voltage (which is 1 volt) so your load can have the 11 volts you want it to have.

There is one more bit of news, and it isn't good. If your load is a "dynamic" one, that is, if its resistance changes (it uses more or less power over the time that it is "on"), then a simple series resistor won't allow you to provide a constant 11 volts to that load. What is happening is that the effective resistance of the load is changing over time, and your resistor can't "keep up" with the changes. (The resistor, in point of fact, can't change its resistance at all.) You've got your work cut out for you figuring this one out.
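Here is the same ratio worked as a short sketch; the 22 ohm load is just an example value, and the series resistor is set to 1/11 of it:

```python
# Series resistor = 1/11 of the load, so it drops 1/12 of a 12 V supply.
supply = 12.0
r_load = 22.0                      # example load resistance
r_series = r_load / 11.0

total = r_series + r_load
v_series = supply * r_series / total
v_load = supply * r_load / total
print(f"Series resistor drops {v_series:.2f} V, load gets {v_load:.2f} V")
```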
LEDs are devices that need a certain voltage and current, typically about 1.8 V and 20 mA.
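The usual series-resistor calculation follows from that. The 5 volt supply below is an assumed example; the 1.8 V and 20 mA figures are the typical values mentioned above:

```python
# Series resistor for an LED: drop the difference between supply and LED
# forward voltage at the desired current. The supply voltage is an assumption.
supply = 5.0            # volts (assumed example)
v_led = 1.8             # typical forward voltage
i_led = 0.020           # 20 mA

r = (supply - v_led) / i_led
print(f"Series resistor: about {r:.0f} ohms")
```

That works out to 160 ohms here; round up to the next standard value you have on hand.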
8 gauge will be sufficient, with less than a half-volt drop.
Assuming that you're talking about 12V DC you would use a 4 ohm resistor. If you mean AC then you would need a step-down transformer with a 4:1 ratio.
You will need to know the amount of current flowing through the coil when 220 volts is applied across it. A resistor in series with the coil will limit the current so that the coil only sees 220 volts. The resistor will need to drop 57 volts, so 57 volts divided by the current in amps will give you your required resistance. You will need a resistor with a high power dissipation rating; with 57 volts across it, your resistor will probably need to dissipate several watts. For example, a 220 volt coil drawing 300 milliamps (0.3 amps) would require a resistor of 190 ohms (57 / 0.3), and the power dissipation of the resistor would need to be 17.1 watts! You might try using a light bulb as a series resistor; ensure that it can handle 57 volts.

To complicate matters: is that AC or DC you are using? AC relay coils have inductance built in, which raises their effective AC impedance, so the same coil could draw as little as 0.001 A, and you would need a very low value resistor. Anyway, if any 220 V relay drew as much as 300 mA, I doubt you would be able to pick it up with one hand; such a relay coil would draw about 66 W! I have a relay with 16 A rated contacts and a 230 V coil; its current is 0.0015 A, which is equivalent to 0.33 W at 220 V.
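As a sketch of that sizing, using the 57 volt drop and the 0.3 amp example figure from above (a real 220 volt relay coil usually draws far less, so measure yours first):

```python
# Series dropping resistor for a relay coil, per the example figures above.
# The 0.3 A coil current is illustrative only; measure the real coil current.
drop_volts = 57.0
coil_current = 0.3

resistance = drop_volts / coil_current    # R = V / I
power = drop_volts * coil_current         # P = V * I, so the resistor runs hot

print(f"Series resistor: about {resistance:.0f} ohms, dissipating {power:.1f} W")
```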
330 milliohms (0.33 ohms)
#8 copper