Q: Is a voltage and current source the same thing?

Best Answer

Voltage source inverters use a DC voltage (e.g. a capacitor in parallel) as the source, while current source inverters use a DC current (an inductor in series) as the source. Note that the voltage across a capacitor cannot change abruptly, just as the current through an inductor cannot change abruptly.
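To illustrate that last point, here is a minimal sketch in plain Python (the step voltage, resistance and capacitance are assumed example values, not taken from the answer) showing that a capacitor charged through a resistor ramps up over the RC time constant rather than jumping:

```python
import math

# Assumed example values: 10 V step applied through 1 kohm to 100 uF.
V_step = 10.0      # applied DC voltage (V)
R = 1_000.0        # series resistance (ohms)
C = 100e-6         # capacitance (farads)
tau = R * C        # time constant = 0.1 s

# v(t) = V_step * (1 - e^(-t/tau)): the voltage rises smoothly, it cannot jump.
for t in (0.0, 0.5 * tau, tau, 3 * tau, 5 * tau):
    v = V_step * (1 - math.exp(-t / tau))
    print(f"t = {t:.3f} s  ->  v_C = {v:.2f} V")
```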

Wiki User

13y ago
More answers
Wiki User

14y ago

Voltage charging is when you set a specific voltage to charge a battery. Regulate the voltage and the system will draw whatever current it requires.

Current charging is when you force a specific current through a battery, without being concerned about what voltage is required to maintain that current.
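As a rough numeric sketch of the difference, assuming a simplified battery model (an EMF behind an internal resistance; all numbers are made-up example values):

```python
# Simplified battery model: terminal behaviour = EMF plus internal resistance.
emf = 12.0        # battery open-circuit voltage (V), assumed
r_int = 0.05      # internal resistance (ohms), assumed

# Voltage charging: the charger regulates a fixed voltage; the battery
# draws whatever current follows from Ohm's law.
v_set = 14.4
i_drawn = (v_set - emf) / r_int
print(f"Constant voltage {v_set} V -> battery draws {i_drawn:.1f} A")

# Current charging: the charger forces a fixed current; the terminal
# voltage becomes whatever is needed to maintain that current.
i_set = 10.0
v_required = emf + i_set * r_int
print(f"Constant current {i_set} A -> charger must supply {v_required:.2f} V")
```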

Wiki User

12y ago

Yes they are the same thing!

***********************

They are certainly not the same thing.

Voltage, or e.m.f. (electromotive force), is the cause of current.

A common analogy is that voltage is like pressure in a hydraulic system, while current is the rate of flow that results from a given voltage. In a hydraulic system the flow might be measured, for instance, in gallons or litres per second. In electricity it is measured in coulombs per second, a unit given the name ampere and abbreviated to the upper-case A.
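A one-line worked example of that definition, with arbitrary numbers:

```python
# Current is charge per unit time: 1 ampere = 1 coulomb per second.
charge = 6.0    # coulombs passing a point (assumed)
time = 2.0      # seconds (assumed)
print(f"I = {charge} C / {time} s = {charge / time} A")
```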

Wiki User

9y ago

An ideal current source presents an infinite impedance to the outside network, so none of its current is diverted internally -- no energy loss.

An ideal voltage source presents zero impedance to the outside network, so none of its voltage is dropped internally -- no energy loss.
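A small sketch of why those impedances matter, comparing practical sources (with an assumed series or parallel internal resistance) against the ideal case:

```python
# A practical voltage source = ideal EMF in series with a source resistance.
# The lower that resistance, the less the terminal voltage sags under load.
emf, r_series = 10.0, 1.0          # volts, ohms (assumed)
for r_load in (1000.0, 10.0, 1.0):
    v_out = emf * r_load / (r_series + r_load)
    print(f"R_load={r_load:>6} ohm -> V_out={v_out:.2f} V (ideal: 10.00 V)")

# A practical current source = ideal current source with a parallel resistance.
# The higher that resistance, the less current is diverted away from the load.
i_src, r_parallel = 1.0, 1000.0    # amps, ohms (assumed)
for r_load in (1.0, 100.0, 1000.0):
    i_out = i_src * r_parallel / (r_parallel + r_load)
    print(f"R_load={r_load:>6} ohm -> I_out={i_out:.3f} A (ideal: 1.000 A)")
```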

Continue Learning about Engineering

How will you convert an open current circuit to an open voltage circuit?

The first thing you need to know is the internal (shunt) resistance of the current source; the equivalent voltage source will have the same internal resistance, placed in series. Then compute the open-circuit voltage of the current source (the source current times the shunt resistance); this becomes the voltage of the voltage source. You are now done.
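A minimal numeric sketch of that conversion, with assumed example values:

```python
# Norton-to-Thevenin conversion: a current source I_s with shunt resistance R_s
# becomes a voltage source V_oc = I_s * R_s in series with the same R_s.
i_s = 2.0      # source current (A), assumed
r_s = 50.0     # internal (shunt) resistance (ohms), assumed

v_oc = i_s * r_s
print(f"Equivalent voltage source: {v_oc} V in series with {r_s} ohm")
```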


How do you change current into voltage?

Compute the open-circuit voltage of the current source across its shunt resistance. This voltage becomes the voltage source's voltage. Move the current source's shunt resistance into series with the new voltage source. Insert the new voltage source into the original circuit in place of the current source.
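A quick check of the procedure above, with assumed example values: the original current-source form and the converted voltage-source form deliver the same voltage to any load.

```python
# Current source I_s with shunt R_s versus voltage source V = I_s*R_s in series with R_s.
i_s, r_s = 2.0, 50.0        # assumed example values
v_th = i_s * r_s            # converted (voltage-source) equivalent

for r_load in (10.0, 50.0, 500.0):
    # Current-source form: the source current splits between R_s and R_load.
    v_current_form = i_s * (r_s * r_load) / (r_s + r_load)
    # Voltage-source form: a simple voltage divider.
    v_voltage_form = v_th * r_load / (r_s + r_load)
    print(f"R_load={r_load:>5} ohm: {v_current_form:.3f} V vs {v_voltage_form:.3f} V")
```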


If the resistance increases what will happen to voltage and current?

In a simple circuit, for example one voltage source and one resistor, the voltage across the circuit always stays the same, but the current decreases, following Ohm's law V = I*R. If we have a current source instead of a voltage source, the current is forced to a set value, so if we increase the resistance the current stays the same but the voltage increases.
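A short numeric comparison of the two cases, with assumed example values:

```python
# Doubling the resistance with a 10 V voltage source vs a 1 A current source.
for r in (10.0, 20.0):
    i = 10.0 / r          # voltage source: V fixed, current falls (I = V/R)
    v = 1.0 * r           # current source: I fixed, voltage rises (V = I*R)
    print(f"R={r:>4} ohm: voltage source -> I={i:.2f} A; current source -> V={v:.1f} V")
```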


What happens to current in a circuit if the voltage is halved and the resistance stays the same?

The current halves. Explanation: V = IR, hence I = V/R, which means that when the resistance is constant the current is directly proportional to the supply voltage; thus halving the voltage halves the current. This assumes the current comes from a pure voltage source (a voltage source with zero internal resistance), so the full supply voltage appears across the load. At the other extreme you could have a current source (such as a very large voltage source in series with a very large resistance), in which case the current is set by the source and is practically independent of changes in the external circuit. With appropriate circuitry it is possible to devise a supply whose current is practically independent of such changes.
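The arithmetic for the ideal-voltage-source case, with assumed example values:

```python
# Ideal voltage source: I = V / R, so halving V halves I when R is fixed.
r = 100.0                      # ohms, assumed
for v in (12.0, 6.0):
    print(f"V={v:>4} V, R={r} ohm -> I={v / r:.3f} A")
```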


Why do you short voltage source and open current source while using superposition theorem?

When applying superposition you consider one independent source at a time and set every other source to zero. Setting a voltage source to zero volts means it maintains no potential difference across its terminals, which is exactly what a short circuit does, so you replace it with a short. Setting a current source to zero amps means no current flows through it, which is exactly what an open circuit does, so you replace it with an open. This is a simple explanation; a more exhaustive, technical one could be made if this is not sufficient.
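A small worked example of the procedure, using an assumed two-source circuit: a voltage source Vs feeds a node through R1, a current source Is injects current into the same node, and R2 runs from the node to ground (all names and values are illustrative).

```python
# Assumed example circuit: Vs -- R1 -- node -- R2 -- ground, with a current
# source Is also injecting current into the node.
vs, is_, r1, r2 = 10.0, 0.5, 100.0, 200.0   # volts, amps, ohms (assumed)

# Voltage source acting alone: the current source is set to 0 A (open circuit),
# leaving a plain voltage divider.
v_from_vs = vs * r2 / (r1 + r2)

# Current source acting alone: the voltage source is set to 0 V (short circuit),
# so Is drives the parallel combination of R1 and R2.
v_from_is = is_ * (r1 * r2) / (r1 + r2)

# Superposition: the total node voltage is the sum of the two contributions.
v_total = v_from_vs + v_from_is

# Cross-check against direct nodal analysis of the full circuit.
v_direct = (vs / r1 + is_) / (1 / r1 + 1 / r2)
print(v_total, v_direct)   # both come out to 40.0 V
```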

Related questions

If the voltage is increased and current remains the same, what happens to the resistance?

According to Ohm's law, R = V/I. If the voltage increases while the current remains the same, the resistance must have increased in proportion. For example, if the voltage (V) becomes 2 times greater while the current (I) stays the same, the resistance (R) also becomes 2 times greater.
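The same example in numbers:

```python
# R = V / I: doubling the voltage at constant current means the resistance doubled.
i = 2.0                      # amps, held constant (assumed)
for v in (10.0, 20.0):
    print(f"V={v} V, I={i} A -> R={v / i} ohm")
```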


Is emf current greater than current?

EMF is electromotive force. It is another name for voltage. Voltage is electric potential in joules per coulomb; current is electric flow, in amperes, and amperes are coulombs per second. Voltage and current are not the same thing, so "EMF current" or "voltage current" does not make sense.


Can I use a laptop ac adapter with the same volts but higher amps?

Yes. The current rating listed is the _maximum_ current that the power supply can provide without a drop in voltage.


Is a resistor and a relay the same thing?

No, they are not the same. A resistor limits (reduces) current, while a relay is essentially an electrically operated switch (using a low-voltage circuit to switch a higher-voltage circuit on and off).


What happens to the current in a circuit as a capacitor charges?

What happens to the current in a circuit as a capacitor charges depends on the circuit. As a capacitor charges, the voltage drop across it increases. In a typical circuit with a constant voltage source charging the capacitor through a resistor, the current decreases exponentially over time as the capacitor charges, with the end result that the current approaches zero and the voltage across the capacitor approaches the source voltage.
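A short sketch of that charging current, i(t) = (V/R)·e^(-t/RC), with assumed example values:

```python
import math

# Assumed example: 10 V source charging a 100 uF capacitor through 1 kohm.
V, R, C = 10.0, 1_000.0, 100e-6
tau = R * C

# The charging current starts at V/R and decays exponentially toward zero.
for t in (0.0, tau, 3 * tau, 5 * tau):
    i = (V / R) * math.exp(-t / tau)
    print(f"t={t:.2f} s -> i={i * 1000:.3f} mA")
```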


What happens to voltage in a series circuit?

The total voltage equals the source voltage. That voltage is divided around the circuit in proportion to each of the resistances in line, and the current equals the source voltage divided by the sum of all the resistances.
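A quick numeric illustration with assumed values:

```python
# Series circuit: one current everywhere, source voltage divided in proportion to R.
v_source = 12.0
resistors = [1.0, 2.0, 3.0]           # ohms, assumed
i = v_source / sum(resistors)         # I = V / (R1 + R2 + R3) = 2 A
for r in resistors:
    print(f"{r} ohm drops {i * r:.1f} V")   # 2 V, 4 V, 6 V -> sums to 12 V
```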


What is the relationship between the size of your source and an electric current?

If the source you're talking about is an ideal voltage source, then the amount of current depends on the size of the source and the total resistance of the circuit connected to it. Ohm's law tells us that the current, I, is directly proportional to the voltage, V, and inversely proportional to the resistance, R: I = V/R. So increasing the voltage increases the current, and decreasing the resistance does the same. There are practical limitations to that, however. In the real world, reducing the resistance to zero does not produce infinite current, as the formula suggests. Infinite current would be produced only by an "ideal" voltage source, which doesn't exist.
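A small sketch of that practical limitation, modelling the source with an assumed internal resistance so the current stays finite even as the external resistance approaches zero:

```python
# Ideal formula: I = V / R_external would blow up as R_external -> 0.
# A real source has some internal resistance, which caps the current.
v, r_internal = 10.0, 0.2            # volts, ohms (assumed)
for r_external in (10.0, 1.0, 0.1, 0.0):
    i = v / (r_internal + r_external)
    print(f"R_ext={r_external:>5} ohm -> I={i:.1f} A")
```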