Decrease, because P = I (current) x V (voltage): at constant power, if one increases, the other must decrease in proportion. Ohm's Law states that current is directly proportional to the applied voltage and inversely proportional to the resistance of the circuit.
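The constant-power trade-off can be sketched numerically; the 60 W load and the two supply voltages below are hypothetical values chosen only for illustration:

```python
P = 60.0                 # assumed constant power, in watts
V1, V2 = 120.0, 240.0    # two hypothetical supply voltages
I1, I2 = P / V1, P / V2  # from P = I x V, so I = P / V

# Doubling the voltage at constant power halves the current.
print(I1, I2)  # 0.5 0.25
```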
If the current is held constant, the voltage will decrease.
No. Power is constant. Transformers neither increase nor decrease power, except for minor losses. They increase or decrease voltage, and they decrease or increase current, but the product of voltage and current, i.e. power, remains the same.
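A minimal sketch of this power balance for an ideal transformer (losses ignored; the 10:1 turns ratio and the input values are assumptions for illustration):

```python
def secondary_side(v_primary, i_primary, turns_ratio):
    """Ideal step-down transformer: voltage is divided by the turns
    ratio, current is multiplied by it, so their product is unchanged."""
    return v_primary / turns_ratio, i_primary * turns_ratio

v_s, i_s = secondary_side(240.0, 1.0, 10.0)
print(v_s, i_s)                  # 24.0 10.0
print(v_s * i_s == 240.0 * 1.0)  # power in equals power out: True
```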
If all environmental conditions remain constant then the resistance will not change appreciably with applied voltage, but the current will increase. An increase in current will raise the temperature of the conductor which will increase the resistance somewhat.
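The self-heating effect on resistance can be approximated with the linear model R = R20 * (1 + alpha * (T - 20)); the copper coefficient and the example values here are assumptions for illustration, not part of the original answer:

```python
ALPHA_CU = 0.00393  # approx. temperature coefficient of copper, per deg C

def resistance_at(r_20, temp_c):
    """Linear model: resistance at temp_c, given resistance r_20 at 20 C."""
    return r_20 * (1 + ALPHA_CU * (temp_c - 20.0))

V = 12.0
r_cold = 10.0
r_hot = resistance_at(r_cold, 60.0)  # assume a 40 C rise from self-heating
print(r_hot)                 # resistance rises by roughly 16 %
print(V / r_hot < V / r_cold)  # so the current falls somewhat: True
```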
Their relationship depends only on the voltage dropped across that resistor: voltage equals resistance times current (V = I x R), so increasing the current for a given voltage requires a decrease in the resistance, and vice versa.
Both technicians are right, and both are wrong, because neither the question nor their statements provide enough information. Given constant impedance, current should decrease as voltage decreases; given constant power, current should increase as voltage decreases.
increasing resistance and keeping current constant
We should also keep in mind how voltage and current behave. If voltage is held constant, increasing the temperature also increases the attenuation; if current is held constant, the attenuation drops as temperature increases.
If you're talking about an electric motor, increasing the frequency increases the speed of rotation, and decreasing the frequency decreases it. The other way of controlling a motor is to control the current: increasing the current increases the speed, and decreasing the current decreases the speed.
An increase in electrical current causes the magnetism to increase, while a decrease in electrical current causes the magnetism to decrease.
At least to a certain extent, by increasing the field current. Alternatively, in a real power plant the power factor can be decreased to increase the voltage.
If the ratio of voltage to current is constant, then the circuit is obeying Ohm's Law. If the ratio changes for variations in voltage, then the circuit does not obey Ohm's Law.
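This constant-ratio test can be expressed directly as code; the measurement pairs below are hypothetical examples:

```python
def is_ohmic(v_i_pairs, tol=1e-6):
    """True when V / I is the same for every measurement, i.e. the
    device obeys Ohm's Law over the measured range."""
    ratios = [v / i for v, i in v_i_pairs]
    return max(ratios) - min(ratios) <= tol

# Fixed 100-ohm resistor: ratio is constant, so the device is ohmic.
print(is_ohmic([(1.0, 0.01), (5.0, 0.05), (10.0, 0.10)]))  # True
# Lamp filament whose resistance rises with voltage: not ohmic.
print(is_ohmic([(1.0, 0.01), (5.0, 0.04), (10.0, 0.06)]))  # False
```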
If the load resistance is constant, then increasing the voltage will increase the current by the same proportion; i.e., doubling the voltage will double the current.
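That proportionality follows directly from I = V / R with R fixed; the 10-ohm load is an arbitrary value chosen for the sketch:

```python
R = 10.0                 # load resistance, held constant (assumed value)
for v in (5.0, 10.0, 20.0):
    print(v, v / R)      # current scales in direct proportion to voltage
# 5.0 0.5
# 10.0 1.0
# 20.0 2.0
```

Doubling the voltage from 10 V to 20 V doubles the current from 1 A to 2 A, as the answer states.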
Resistance increases as temperature increases. If voltage is held constant, then according to Ohm's Law (Voltage = Current x Resistance) the current will decrease as the resistance increases.