Yes, as long as the load stays the same. Voltage equals the resistance of the load times the current (V = I x R). For example, if a load draws 10 amps at 240 volts, its resistance is 240/10 = 24 ohms; at 120 volts, that same load draws only 5 amps. With simple algebra, the equation can be rearranged to solve for any of the three quantities. (The current would rise to 20 amps at 120 volts only if the power, not the resistance, were held constant.)
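The arithmetic above can be checked with a short Python sketch (the 24-ohm load is just the illustrative value worked out from 240 V and 10 A):

```python
# Ohm's law: V = I * R, rearranged as I = V / R.
def current(voltage, resistance):
    """Current in amps through a fixed resistive load."""
    return voltage / resistance

# A load drawing 10 A at 240 V has resistance R = V / I = 24 ohms.
r_load = 240 / 10

print(current(240, r_load))  # 10.0 A
print(current(120, r_load))  # 5.0 A -- halving V halves I for the same load
```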
Current does not decrease when voltage increases; current also increases.
You may be thinking of a particular case where the power is held constant, but you did not say that in the question.
The general answer is that, all other things being equal, if the voltage increases, then the current will also increase, along with the power.
Another answer: High voltages are used in electricity transmission/distribution systems in order to minimise the load currents in the transmission/distribution lines. This is because, for a given load, the higher the supply voltage, the lower the resulting load current (power being the product of voltage and current).
We can see a similar effect with a simple transformer. The higher the voltage induced into one or other of the windings, the lower the corresponding current.
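For an ideal transformer, the power in equals the power out, so voltage and current scale inversely with the turns ratio. A minimal sketch, assuming an ideal (lossless) transformer with illustrative values:

```python
def secondary(v_primary, i_primary, turns_ratio):
    """Ideal transformer: voltage scales with the turns ratio (N2/N1),
    current scales inversely, so V1*I1 == V2*I2 (power is conserved)."""
    v2 = v_primary * turns_ratio
    i2 = i_primary / turns_ratio
    return v2, i2

# Stepping 240 V / 10 A up through a 10:1 ratio:
v2, i2 = secondary(240, 10, 10)
print(v2, i2)  # 2400 V, 1.0 A -- higher voltage, lower current
```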
Electricity transmission/distribution systems operate at very high voltages in order to minimise the value of current flowing in the transmission lines. This is because, for a given load, the higher the voltage, the lower the resulting load current (power being the product of voltage and current). If the value of voltage were allowed to fall, then the load current would then increase to compensate. In transmission lines, this would be undesirable, as the increased current would result in greater voltage drops along the length of the line, and greater line (I²R) losses.
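The effect on line losses can be quantified: for a fixed delivered power, the I²R loss falls with the square of the supply voltage. A rough sketch, where the 100 kW load and the 1-ohm line resistance are assumed figures for illustration:

```python
def line_loss(power_w, supply_v, line_r_ohms):
    """I^2 * R loss in the line for a given delivered power."""
    i = power_w / supply_v          # load current, from P = V * I
    return i ** 2 * line_r_ohms

# Same 100 kW load, same 1-ohm line, two supply voltages:
print(line_loss(100_000, 400, 1.0))     # 62500.0 W lost at 400 V
print(line_loss(100_000, 11_000, 1.0))  # about 83 W lost at 11 kV
```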
But, for an ordinary, d.c. circuit in which a supply voltage, such as that provided by a battery, supplies, say, a resistor, a decrease in voltage would result in a decrease in the load current.
P = V * I. If power remains constant, then as voltage decreases, current will increase.
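That inverse relationship at constant power can be shown directly (the 2400 W figure is illustrative, chosen to match the 240 V / 10 A example earlier):

```python
def current_at_constant_power(power_w, voltage_v):
    """From P = V * I: I = P / V."""
    return power_w / voltage_v

p = 2400  # watts, held constant
print(current_at_constant_power(p, 240))  # 10.0 A
print(current_at_constant_power(p, 120))  # 20.0 A -- halving V doubles I
```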
Voltage is like a force on a conductor's charged particles: the greater the force, the greater the current (provided the resistance of the conductor remains constant).
If the power remains constant, then the current increases as the voltage decreases.
With a constant resistive load when the voltage decreases so will the current.
This is Ohm's law at its finest.
The current rises rapidly.
A: That will happen any time the voltage source is not able to provide the power needed by the load. If the load exceeds the power available from the source, the voltage will be reduced by the IR drop across the source.
If voltage increases when current remains constant then resistance must also increase. Ohm's Law: Voltage = Current times Resistance.
If the voltage applied across the resistor remains constant, then as the resistance of the resistor decreases, the current through it will increase. Consider Ohm's Law: E = IR. In this formula, in order for 'E' to remain constant as 'R' decreases, 'I' must increase. Another form of Ohm's Law is I = E/R. If 'E' remains constant, then the value of the fraction increases as its denominator 'R' decreases.
increases
If you have a simple circuit, for example one voltage source and one resistor, the voltage across the circuit will always remain the same; if the resistor's value is increased, the current will decrease, following Ohm's Law V = I*R. If we have a current source instead of a voltage source, we are forcing the current to a certain value, so if we increase the resistor value the current will remain the same but the voltage will increase.
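The contrast between the two source types can be sketched as follows, assuming ideal sources and illustrative values:

```python
def with_voltage_source(v_source, r):
    """Ideal voltage source: V is fixed, the current follows I = V / R."""
    return v_source / r   # current in amps

def with_current_source(i_source, r):
    """Ideal current source: I is fixed, the voltage follows V = I * R."""
    return i_source * r   # voltage in volts

# Doubling R with a 12 V source halves the current:
print(with_voltage_source(12, 4))   # 3.0 A
print(with_voltage_source(12, 8))   # 1.5 A
# Doubling R with a 2 A source doubles the voltage:
print(with_current_source(2, 4))    # 8 V
print(with_current_source(2, 8))    # 16 V
```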
Nothing, but the current will increase.
If the resistance increases, while the voltage stays the same, current will decrease. Current = voltage divided by resistance
it dies
it also increases
If resistance is increased, current decreases. Ohm's Law: current equals voltage divided by resistance.
According to Ohm's Law, V = IR, where V = voltage, I = current, and R = resistance. The formula can also be written as I = V/R; here, current is inversely proportional to resistance. In other words, as resistance increases, current decreases.
increases
As pressure increases, temperature increases and volume decreases.
The current decreases due to I=V/R. The ammeter reading will decrease as R is increased.