If voltage increases while current remains constant, then resistance must also increase. Ohm's Law: voltage = current times resistance.
Ohm's Law says voltage = current x resistance. Hence, if voltage rises while resistance stays the same, so will current.
You have it backwards: the resistance controls the current, not the other way around. I = E/R. Your question should read, "If the voltage is constant and the resistance in the circuit is increased, what happens to the current?" Say the voltage is 120 volts and the resistance is 30 ohms: I = 120/30 = 4 amps. Now we double the resistance to 60 ohms, so I = 120/60 = 2 amps. So now you can see that if you increase the resistance, the current drops.
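The worked example above can be sketched in a few lines of Python; the function name is just illustrative, and the numbers are the ones used in the answer.

```python
def current(voltage, resistance):
    """Ohm's law: I = V / R (volts / ohms -> amps)."""
    return voltage / resistance

# 120 V across 30 ohms:
i1 = current(120, 30)   # 4.0 A
# Double the resistance to 60 ohms and the current halves:
i2 = current(120, 60)   # 2.0 A
print(i1, i2)
```

Running it confirms that doubling the resistance at constant voltage halves the current.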
If you have a simple circuit, e.g. one voltage source and one resistor, then the voltage across the circuit will always remain the same; the current, however, will decrease, following Ohm's Law, V = I*R. If we have a current source instead of a voltage source, we are forcing the current to be a certain value, so if we increase the resistor value the current will remain the same but the voltage will increase.
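That contrast between the two source types can be illustrated with a small sketch; the function names and example values are assumptions, chosen only to show the two cases side by side.

```python
def with_voltage_source(v, r):
    # Ideal voltage source: V is fixed, current follows I = V / R.
    return {"V": v, "I": v / r}

def with_current_source(i, r):
    # Ideal current source: I is fixed, voltage follows V = I * R.
    return {"V": i * r, "I": i}

# Doubling R with a fixed 10 V source halves the current:
a = with_voltage_source(10, 5)    # I = 2.0 A
b = with_voltage_source(10, 10)   # I = 1.0 A

# Doubling R with a fixed 2 A source doubles the voltage:
c = with_current_source(2, 5)     # V = 10 V
d = with_current_source(2, 10)    # V = 20 V
```

Which quantity "gives" when the resistance changes depends entirely on which quantity the source holds constant.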
A: That will happen any time the voltage source is not able to provide the power needed by the load. If the load exceeds the power available from the source, the voltage will be reduced by the IR drop within the source.
If the resistance increases while the voltage stays the same, the current will decrease: current = voltage divided by resistance.
If resistance is increased, current decreases. Ohm's Law: current equals voltage divided by resistance.
Nothing happens to the resistance, but the current will increase.
By Ohm's law we can clearly say that the current is the ratio of voltage to resistance. As the resistance is doubled, the current is halved.
Voltage is equal to the current multiplied by the resistance. Without changing the resistance, increasing the applied voltage in a circuit will increase the current flow. There is a simple, direct relationship between voltage and current: double the voltage and twice the current will flow; triple the voltage and the current will triple. Since voltage (E) equals current (I) times resistance (R), when resistance is fixed, whatever happens to voltage will happen to current.
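The direct proportionality described above can be sketched as follows; the fixed resistance of 10 ohms is an arbitrary example value.

```python
R = 10  # ohms; fixed, arbitrary example resistance

# With R fixed, I = V / R scales linearly with V:
# doubling V (5 -> 10) doubles I, tripling V (5 -> 15) triples I.
for v in (5, 10, 15):
    print(f"V = {v} V  ->  I = {v / R} A")
```

The printed currents (0.5 A, 1.0 A, 1.5 A) rise in exact proportion to the applied voltage.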
According to Ohm's law, V = IR, where V = voltage, I = current, and R = resistance. The formula can also be written as I = V/R; here, current is inversely proportional to resistance. In other words, as resistance increases, current decreases.
Ohm's law applies: current = voltage / resistance. As such, if you double the resistance of the light bulb, you end up with half as much current.
v = i*R. If i goes down, then R must go up (assuming v remains the same). Answer: Completely impractical question. Resistance is not directly affected by voltage or current, so what you describe won't happen!
The current is greater than or equal to (6) divided by (the effective resistance of the circuit).
Current would go to a maximum if voltage were present; with no voltage, no current would flow. The only thing that would limit the current flow (if voltage is present) is the small resistance of the cables. If there were no resistance, it would be like a short circuit: maximum current would flow the instant voltage is applied. That is why RCDs work as they should; you want the maximum current to flow at once, because if the current were limited the device would not trip in time to stop someone getting electrocuted.