Because a higher voltage can carry power further.
That answer is too simplistic. The actual reason is as follows: for any given load, the higher the supply voltage, the lower the resulting current. Lower currents mean that smaller-diameter transmission and distribution conductors can be used, and the line losses (I²R) are lower.
It is halved, because voltage = current × resistance.
In the short-circuit test, the high-voltage side is fed with 2-5% of its rated voltage, which is enough to circulate approximately full-load current in the low-voltage winding when that winding is short-circuited. The low-voltage side is generally the one short-circuited to facilitate measurement, because it is more difficult to measure the quantities at high voltages.
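As a rough illustration of how the short-circuit test readings are typically used (a minimal sketch with assumed example readings, not values taken from the answer above), the equivalent series impedance referred to the HV side can be worked out like this:

```python
import math

# Assumed example readings from a short-circuit test (HV side energised at a
# few % of rated voltage, LV side shorted). These numbers are illustrative.
V_sc = 200.0   # applied HV-side voltage, volts
I_sc = 25.0    # measured HV-side current, amps (roughly full-load current)
P_sc = 1500.0  # wattmeter reading, watts (roughly full-load copper loss)

# Equivalent series parameters referred to the HV side
Z_eq = V_sc / I_sc                   # equivalent impedance, ohms
R_eq = P_sc / I_sc**2                # equivalent resistance, ohms
X_eq = math.sqrt(Z_eq**2 - R_eq**2)  # equivalent leakage reactance, ohms

print(f"Z_eq = {Z_eq:.3f} ohm, R_eq = {R_eq:.3f} ohm, X_eq = {X_eq:.3f} ohm")
```

The point of applying only 2-5% of rated voltage is that this is already enough to drive roughly full-load current through the shorted winding, so the wattmeter reading approximates the full-load copper loss.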
If the resistance of the load is kept more-or-less constant, then the current also becomes larger. On the other hand, if the power of the load is kept more-or-less constant, then the current becomes smaller.
This is commonly a problem called "voltage drop". Simply put, if you have any power-supplying unit connected to an electrical load, the load draws an amount of current based on its value. The voltage at the supply terminals is higher than at the load terminals because of the resistance (impedance) of the connecting wires (cables). The greater the connection resistance (more resistance means less cross-sectional area), the lower the load voltage (if the source voltage and load current are kept constant). Another thing that affects the load voltage is the load current itself: if the load current increases, the load voltage decreases (if the source voltage and connection resistance are kept constant). In other words, any increase in load current or connection resistance means the load voltage will decrease.

This is exactly what happens in summer. In summer the ambient air temperature is high, so all the network's connection elements (wires, cables, transmission lines...) have higher resistance. In addition, most air-conditioning units operate in summer (higher load current), and they represent a bulk load on the network.
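To make the summer effect concrete, here is a minimal Python sketch under simple assumptions (purely resistive single-phase line, assumed example numbers; the copper temperature coefficient is an approximate textbook value):

```python
# Rough sketch of the voltage-drop effect described above.
ALPHA_CU = 0.00393     # approximate temperature coefficient of copper, per degC

def line_resistance(r_20C: float, temp_C: float) -> float:
    """Conductor resistance corrected from 20 degC to the given temperature."""
    return r_20C * (1 + ALPHA_CU * (temp_C - 20.0))

def load_voltage(v_source: float, r_line: float, i_load: float) -> float:
    """Voltage left at the load after the drop across the line resistance."""
    return v_source - i_load * r_line

v_source = 230.0       # source voltage, volts
r_line_20C = 0.5       # total line resistance at 20 degC, ohms

# Mild weather: cooler conductor, lighter load
print(load_voltage(v_source, line_resistance(r_line_20C, 20.0), 10.0))  # 225.0 V
# Summer: hotter conductor, heavier (air-conditioning) load
print(load_voltage(v_source, line_resistance(r_line_20C, 45.0), 20.0))  # ~219.0 V
```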
Power in a line remains constant when you pass it through a transformer: P = V × I, so when V goes up, I goes down. When you distribute electricity, the resistance of the wire carrying it causes losses, following the equation P_loss = I² × R. So you can see that a very high voltage results in a very low current, which in turn means lower power losses. The reason the voltage is not kept that high at all times is that it is unsafe to have voltages at that level in people's houses: the higher the voltage, the farther it is able to arc across the air and cause shorts.
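Here is a small example of that arithmetic in Python; the delivered power, line resistance, and the two voltages are assumed illustrative values, and unity power factor is assumed so that P = V × I:

```python
# Compare line losses when the same power is sent at two different voltages.
def line_loss(power_w: float, voltage_v: float, r_line_ohm: float) -> float:
    """I^2 * R loss in the line, assuming P = V * I (unity power factor)."""
    current = power_w / voltage_v
    return current**2 * r_line_ohm

P = 1_000_000.0   # 1 MW delivered
R = 5.0           # total line resistance, ohms

print(line_loss(P, 11_000.0, R))    # ~41.3 kW lost at 11 kV
print(line_loss(P, 132_000.0, R))   # ~0.29 kW lost at 132 kV
```

Sending the same 1 MW at 132 kV instead of 11 kV cuts the current by a factor of 12, so the I²R loss drops by a factor of 144.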
To prevent the appearance of a dangerously high secondary voltage across its terminals.
Because the line conductors are uninsulated, and must be kept at a safe distance above ground level.
A couple of reasons I can think of include:
- High-voltage overhead cable is much cheaper than underground cable
- Easier to reconductor (upgrade current-carrying capacity)
- Easier to find faults
Ohm's Law: voltage = current * resistance. If resistance is a constant, then voltage is directly proportional to current.
In a circuit with constant voltage, the relationship between current and resistance is inversely proportional. This means that as resistance increases, the current flowing through the circuit decreases, and vice versa.
You should answer this question yourself by doing a couple of examples using Ohm's law, I = E/R (current equals voltage divided by resistance). Here they are: Base circuit: a 10 volt supply feeding a 10 ohm resistor; calculate the current. New circuit: a 10 volt supply (voltage kept constant) feeding a 20 ohm resistor (increased resistance); calculate the current. Did the current increase, or decrease? This way you can prove the answer to yourself!
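If you'd rather let a few lines of code do the arithmetic, here is a minimal Python version of those two circuits (same values as above):

```python
def current(voltage: float, resistance: float) -> float:
    """Ohm's law: I = E / R."""
    return voltage / resistance

# Base circuit: 10 V supply feeding a 10 ohm resistor
print(current(10.0, 10.0))   # 1.0 A

# New circuit: same 10 V supply, resistance increased to 20 ohm
print(current(10.0, 20.0))   # 0.5 A -> the current has halved
```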
Decreasing the resistance, or increasing the voltage, will always increase the current in a circuit.
A current transformer (CT) is used in high-voltage circuits where it is not possible to measure the current directly. In terms of voltage, a CT acts as a step-up transformer, with only one turn (often a single bar conductor) as its primary, and it may have several cores for different purposes, such as metering and protection. The secondary of a CT should never be left open-circuited, because the core flux then rises to a very high value and a dangerously high voltage is induced across the open secondary, which may damage the CT.