The voltage drop depends on the current through the cable.
For DC current in a cable of 16 mm diameter, at 68 °F, the voltage drop is
(0.00857) × (current in amperes) volts.
It depends on the voltage being used. That is because the size of the cable is determined by its resistance, and to calculate the allowable resistance you need to know the allowable voltage drop. Lower resistance means a thicker cable. Normally the allowable voltage drop is a percentage of the supply voltage, 5% for example. On a 120 V system this would allow a 6 V drop, while on a 240 V system the voltage drop could be 12 V. So for a given load current, the cable for a 120 V system would need half the resistance, and therefore twice the cross-section area, of a cable for 240 V. But for a given amount of power, the current on the 240 V system would be halved, so the cable resistance could be four times higher and its cross-section one quarter of that needed for 120 V. That is why higher voltages are used to transmit power over long distances.
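The scaling argument above can be sketched in a few lines of Python. The 5% limit, the 20 A and 2400 W example figures, and the purely resistive load are all assumptions for illustration:

```python
def max_cable_resistance(supply_volts, load_amps, drop_fraction=0.05):
    """Largest cable resistance that keeps the drop within the limit."""
    return supply_volts * drop_fraction / load_amps

# Same load current on both systems: 240 V tolerates twice the resistance.
print(max_cable_resistance(120, 20))  # 0.3 ohm
print(max_cable_resistance(240, 20))  # 0.6 ohm

# Same load power: the 240 V system draws half the current,
# so its allowable resistance is four times higher (quarter the copper).
power = 2400  # watts
print(max_cable_resistance(120, power / 120))  # 0.3 ohm
print(max_cable_resistance(240, power / 240))  # 1.2 ohm
```

The two factors compound: doubling the voltage doubles the allowable drop *and* halves the current, hence the factor of four.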
Your question cannot be answered as asked, because we do not know what type of wire the copper is coated on, nor how long the wire is. But let's assume it is solid copper #10 wire, 100 feet long. The voltage drop is zero when no electricity is flowing through it (that is, no 'current', measured in amperes, or just amps). If there is, say, 1 ampere of current, the voltage drop is close to 1/10 volt over the 100 feet. If there is, say, 15 amperes of current, the voltage drop is a little over 1-1/2 volts. If you double the wire's length, the voltage drop also doubles, and if the current doubles, the voltage drop also doubles. Or, as in my example above, if the current rises by 15 times, the voltage drop rises by 15 times.
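These figures follow directly from V = I × R. A minimal sketch, assuming the round handbook value of about 1.0 ohm per 1000 feet for solid #10 copper (the tabulated value is 0.9989):

```python
OHMS_PER_1000_FT_10AWG = 1.0  # approximate, solid #10 copper at room temp

def voltage_drop(length_ft, current_amps, ohms_per_1000ft=OHMS_PER_1000_FT_10AWG):
    """V = I * R: the drop scales linearly with both length and current."""
    return current_amps * ohms_per_1000ft * length_ft / 1000.0

print(voltage_drop(100, 1))   # ~0.1 V, as stated above
print(voltage_drop(100, 15))  # ~1.5 V
print(voltage_drop(200, 1))   # doubling the length doubles the drop
```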
Over long distances, cable loss happens because of the cable's resistance: the more current you carry, the more heat is generated and lost in the cable itself. It's all basic Ohm's law. Thus, to transport electricity from the power generating plant over a national grid (over long distances), you reduce the current by stepping up the voltage with step-up transformers. When you step up the voltage, you lower the current for the same power, which reduces the line losses caused by the cable's resistance. At the far end of the transmission cable you then use a step-down transformer to bring the voltage back down to the national domestic standard.
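The loss reduction is quadratic, since P_loss = I²R and I = P/V. A quick sketch with assumed numbers (1 MW delivered, 5 ohms of line resistance, unity power factor):

```python
def line_loss_watts(power_w, volts, line_resistance_ohms):
    current = power_w / volts                 # I = P / V (unity power factor assumed)
    return current ** 2 * line_resistance_ohms  # P_loss = I^2 * R

power = 1_000_000  # 1 MW delivered
r_line = 5.0       # ohms of line resistance (assumed)

print(line_loss_watts(power, 10_000, r_line))   # 50,000 W lost at 10 kV
print(line_loss_watts(power, 100_000, r_line))  # 500 W lost at 100 kV
```

Stepping the voltage up tenfold cuts the current tenfold and the I²R loss a hundredfold, which is the whole case for high-voltage transmission.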
Regarding the question "Why isn't there voltage drop in a 240 volt system?": the smart-aleck answer is "because no current is flowing." Only if no current is flowing can there be zero voltage drop in any circuit. All ordinary conductors have resistance (superconductors are the one exception), and this resistance produces a voltage drop whenever current flows through the conductor(s). 240 volts is used over 120 (or 12) volts because it reduces the current required for a fixed amount of power (watts). The conductor size can then be reduced to cut cost, size, and weight, or to improve flexibility.
I would recommend no smaller than #8 AWG copper. This is derived from the #8 AWG copper ampacity of 40 amps multiplied by an 80% load rating to get 32 amps. Calculating the voltage drop over this distance then gives 5.9 volts, or 2.5%, which is small enough not to require further correction. So, as I said, no smaller than #8 AWG copper.
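The arithmetic above can be checked in a few lines. The 0.6282 ohm/1000 ft figure is the standard handbook value for #8 AWG copper; the 145 ft one-way run and the 240 V system are assumptions, since the original question's distance is not shown here:

```python
OHMS_PER_1000_FT_8AWG = 0.6282  # handbook value for #8 AWG copper

def continuous_load_limit(ampacity, load_factor=0.80):
    """Apply the 80% continuous-load derating to a conductor's ampacity."""
    return ampacity * load_factor

def two_way_drop(current_amps, one_way_ft, ohms_per_1000ft):
    # Current flows out and back, so the circuit length is twice the run.
    return current_amps * ohms_per_1000ft * 2 * one_way_ft / 1000.0

amps = continuous_load_limit(40)  # 32 A, as in the answer above
drop = two_way_drop(amps, 145, OHMS_PER_1000_FT_8AWG)
print(amps, round(drop, 1), f"{drop / 240:.1%}")  # close to the 5.9 V / 2.5% quoted
```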
If the voltage is supplying any current through the cable, i.e. if there is any 'load' at the end, then the voltage will drop through the cable.
There is a voltage drop across any system that does not have infinitely low resistance, but with a reasonable cable size there would be very little drop over 200 metres. The number of phases makes little difference.
It depends on the voltage that the motor needs, because a higher voltage requires less current for a given amount of power. Also a higher voltage can tolerate a higher voltage drop. So there are two things that lead to a thinner wire when the voltage is higher.
This is a voltage drop question. To answer this question a voltage must be given.
A 2/0 AWG copper conductor will limit the voltage drop to 3% or less when supplying 65 amps for 150 metres on a 240 volt system.
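This claim is easy to sanity-check. Assuming roughly 0.26 ohm/km for 2/0 AWG copper (handbook tables vary slightly):

```python
OHM_PER_KM_2_0 = 0.26  # approximate resistance of 2/0 AWG copper

def drop_percent(current_amps, one_way_m, ohm_per_km, system_volts):
    """Percentage voltage drop over an out-and-back run."""
    loop_ohms = ohm_per_km * 2 * one_way_m / 1000.0
    return 100.0 * current_amps * loop_ohms / system_volts

pct = drop_percent(65, 150, OHM_PER_KM_2_0, 240)
print(f"{pct:.2f}%")  # comfortably under the 3% limit
```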
The AC card has over/under-voltage fault protection; if the voltage drops into that range it will trip a fault. The unit also has a voltage adjustment at the bottom right. The voltage coming out of the cable and the voltage under load can be different.
Unanswerable as asked: you need to specify the core size and material, and the current being carried. As an example, with two 10 AWG copper wires (2.6 mm diameter each), the voltage drop over 1500 metres would be less than 1 volt only at very light currents, on the order of 0.1 amp.
In a series circuit, the supply voltage divides across the resistances in the loop: each element's voltage drop is proportional to its share of the total resistance, and the drops sum back to the supply voltage.
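This is the classic voltage divider. A minimal sketch (the 12 V supply and the resistor values are arbitrary examples):

```python
def series_drops(supply_volts, resistances):
    """Drop across each resistor in a series string: V_i = V * R_i / R_total."""
    total = sum(resistances)
    return [supply_volts * r / total for r in resistances]

drops = series_drops(12.0, [1.0, 2.0, 3.0])
print(drops)       # [2.0, 4.0, 6.0]
print(sum(drops))  # 12.0 -- the drops always add back up to the supply
```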
A wire size of 250 MCM will limit the voltage drop to 3% over a distance of 200 feet.
A voltmeter would measure the voltage. If you measure the voltage drop across a known low resistance, you get a rough idea of the current, and from that the power available.
60