Q: Can you use a 19V DC power supply for a 6V DC load?


Best Answer

No. You can oversupply current - amperes (A) or milliamperes (mA) - because a device will only use the current it requires, but voltage should match exactly.

Voltage measures the electric potential difference between two points in a circuit - loosely, the "pressure" that pushes charge through it. If you oversupply your device's rated voltage you can burn it out. If you undersupply it, you'll either fail to power the device, or you can cause it to function incorrectly, which might or might not have permanent effects, depending on the device. The precise tolerance will vary from one device to another; try to match ratings exactly.

An overly simple metaphor for electric current is your home water supply. Voltage is like water pressure: it's a measure of the force with which current is provided. Current (amperage) is the rate of flow. Multiplied together, they determine total power (watts), which is like the amount of water provided.

As you open more faucets in your house, the total amount of water available to any single tap drops, and so does the rate of flow; but the pressure within the system can be externally regulated and remain the same. Likewise with electricity: as you plug more devices into the wall, you consume more power, but the voltage remains constant because it's regulated. If you somehow raise the water pressure in your plumbing, you can overflow your bathtub, but providing more water at the same pressure just means that you can open more taps before any of them stop providing water.

Your household electricity is probably rated either from 110 VAC to 120 VAC, or from 220 VAC to 240 VAC. This remains the same no matter how many appliances you plug into the wall, and all of them must be designed for that voltage regardless of how much current they actually consume, whether it's an alarm clock or a microwave oven. The individual circuits in your house are usually rated at 15 or 20 amperes (A). This is a measure of the maximum current they can supply, not of how much current they must supply. Each device has its own current needs, but all devices on the circuit run on the same voltage.
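
To put rough numbers on that, here's a minimal Python sketch of a single 120 V, 15 A circuit; the appliance wattages are assumed for illustration, not taken from the answer:

    # On one circuit the voltage is fixed; each device draws its own current (I = P / V).
    VOLTAGE = 120.0        # volts, held constant by the utility
    BREAKER_LIMIT = 15.0   # amps, the most this circuit can supply

    appliance_watts = {"alarm clock": 5, "lamp": 60, "microwave": 1100}

    total_amps = sum(watts / VOLTAGE for watts in appliance_watts.values())
    print(f"Total draw: {total_amps:.1f} A of {BREAKER_LIMIT} A available")
    # About 9.7 A here; plug in too much more and the breaker trips.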

Whether it's AC or DC, any device has a certain current requirement at any given moment. If the supplied voltage increases from 12 V to 19 V, the power delivered at that same current increases by the same ratio, about 58%. It's as if you started forcing water through your pipes more than half again as fast: with nearly 60% more water arriving, your bathtub would have trouble keeping up. And so will your electronics.
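
Putting numbers on that, in a sketch that assumes the device draws the same current at either voltage (itself a simplification):

    # At a fixed current I, power scales directly with voltage: P = V * I.
    v_rated, v_applied = 12.0, 19.0
    current = 1.0   # amps, assumed unchanged between the two cases

    p_before = v_rated * current    # 12 W
    p_after = v_applied * current   # 19 W
    increase = (p_after - p_before) / p_before
    print(f"Power delivery rises by about {increase:.0%}")   # ~58%
    # A purely resistive load fares worse: its current also rises with voltage,
    # so power grows as V squared - roughly (19/12)**2, or about 2.5x, here.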

ANSWER: Yes you can, provided that an active device is installed in series to limit the voltage to 6 V, such as a linear regulator (the LM7812 does this job for 12 V loads; the LM7806 is the 6 V part in the same family). The reverse doesn't work, however: a 6 V source cannot be used on a 19 V circuit, because the undervoltage may keep the circuitry from operating correctly.
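
One caveat worth checking before building the series-regulator version: a linear regulator turns the dropped voltage into heat. A minimal sketch, with the 0.5 A load current assumed for illustration:

    # A linear regulator dropping V_in to V_out dissipates (V_in - V_out) * I as heat.
    V_IN, V_OUT = 19.0, 6.0   # volts
    i_load = 0.5              # amps, assumed load draw

    p_load = V_OUT * i_load            # 3.0 W actually reaches the load
    p_heat = (V_IN - V_OUT) * i_load   # 6.5 W wasted in the regulator

    print(f"Load gets {p_load:.1f} W; regulator dissipates {p_heat:.1f} W")
    # The regulator wastes more power than the load uses, so plan on a heat sink.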

Wiki User (13y ago)
More answers
Wiki User (13y ago)

A: Absolutely; however, the voltage must be limited to 6 volts, either by another regulator or by a series resistor sized to drop the excess voltage at the load's operating current.
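
For the series-resistor option, the value depends entirely on the load current, so it only suits loads with a steady, known draw. A sizing sketch, assuming a 100 mA load for illustration:

    # Size a series dropper resistor: R = (V_supply - V_load) / I_load.
    v_supply, v_load = 19.0, 6.0
    i_load = 0.100   # amps, assumed constant draw

    r_drop = (v_supply - v_load) / i_load      # 130 ohms
    p_resistor = (v_supply - v_load) * i_load  # 1.3 W dissipated in the resistor

    print(f"Use about {r_drop:.0f} ohms, rated comfortably above {p_resistor:.1f} W")
    # If the load current varies, the voltage at the load varies with it - a
    # regulator is the safer choice for anything but a fixed load.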


Continue Learning about Engineering

What will happen if I connect a power supply of higher current rating to a lower current load?

The supply won't have to work as hard. It is perfectly acceptable, for example, to use a 1 A, 12 V supply to power a 12 V, 0.5 A load. The current rating indicates the supply's ability to dissipate the heat caused by the current flowing through it. If the load current is above the power supply's current rating, the power supply will overheat.
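
That rule - the voltage must match, and the supply's current rating must meet or exceed the load's draw - can be written down directly. A hypothetical Python helper, with an assumed 5% voltage tolerance:

    def supply_ok(supply_v, supply_a, load_v, load_a, v_tol=0.05):
        """True if the supply can power the load: the voltage matches within
        an assumed tolerance, and the current rating covers the draw."""
        voltage_match = abs(supply_v - load_v) <= v_tol * load_v
        current_headroom = supply_a >= load_a
        return voltage_match and current_headroom

    print(supply_ok(12, 1.0, 12, 0.5))   # True: spare current capacity is fine
    print(supply_ok(12, 0.5, 12, 1.0))   # False: the supply would be overloaded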


Can you use a 36 V 16.6 A power supply with a 38 V 15 A load?

Maybe. The power supply has a nominal rating of 36 V and the load a nominal requirement of 38 V, but both values have tolerances. The power supply may in reality supply 36 V +/- 5%, and the load may accept a supply voltage in the 32-42 V range, for example. However, tolerances and real minimum requirements may also work the other way. To find out whether this combination works, you should study the respective technical data sheets. It is also fairly safe to simply try: since the power supply's nominal voltage is less than the load's nominal input voltage, you're unlikely to cause any harm by trying - at your own risk, of course. Note that a simple experiment may not be conclusive. The load may work fine under idle or light-use conditions, but when it draws more current, the power supply may collapse.
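
One way to frame that data-sheet check: compute the supply's worst-case output range and see whether it sits entirely inside the range the load accepts. A sketch using the example tolerances from the answer above:

    # Supply: 36 V +/- 5%. Load: accepts 32-42 V (example figures from above).
    supply_nominal, tol = 36.0, 0.05
    supply_min = supply_nominal * (1 - tol)   # 34.2 V
    supply_max = supply_nominal * (1 + tol)   # 37.8 V

    load_min, load_max = 32.0, 42.0

    # Compatible only if the supply's whole output range is acceptable to the load.
    compatible = load_min <= supply_min and supply_max <= load_max
    print(compatible)   # True for these example numbers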


Can you use a 12 volt 3 amp power supply on a 12 volt 500 milliamp device?

Yes. The voltage matches, and the 3 A rating is only the maximum the supply can deliver; the device will draw just the 500 mA it needs, so the supply has plenty of headroom.


Why use kVA?

The kVA (kilovolt-ampere) measures apparent power: the vector combination of the real and reactive power in an AC circuit. The kW (kilowatt) is a measure of the real power alone. Inherently, a circuit requires not only real power but also reactive power, so kVA is the more meaningful value when sizing equipment (such as transformers, bus work, breakers, etc.), because this equipment must be sized for the total current drawn, not just the real power usage.
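
The relationship behind that is the standard power triangle: apparent power is the vector sum of real and reactive power. A quick worked example with made-up figures:

    import math

    # Power triangle: S (kVA) = sqrt(P^2 + Q^2), with P in kW and Q in kVAR.
    p_kw, q_kvar = 80.0, 60.0          # illustrative values
    s_kva = math.hypot(p_kw, q_kvar)   # sqrt(80**2 + 60**2) = 100 kVA

    print(f"Apparent power: {s_kva:.0f} kVA, power factor {p_kw / s_kva:.2f}")
    # Equipment must carry the current behind the full 100 kVA, not just the 80 kW.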


Why do we use 250 ohms with a power supply for instrument calibration?

A 250 ohm resistor is placed in the loop because the internal resistance of a DC power supply is too low to develop a usable voltage drop on its own. Across 250 ohms, a standard 4-20 mA instrument-loop current develops a convenient 1-5 V signal that a calibrator or meter can read.

Related questions

Hi, can you use a power supply of 19V and 7A with a laptop which requires 19V and 5A?

Yes. The voltage matches, and the 7 A figure is only the maximum the supply can deliver; the laptop will draw just the 5 A it needs.


What is the use of a power supply?

A power supply is a device that supplies electric power to an electrical load.


Can you use a power supply of 19V and 7A with a laptop which requires 19V and 5A?

Yes. The 7 A rating is the maximum amount of current available; it is not the same as the amount actually used by the laptop. This is like asking whether you could use a fire hose for a drink of water: yes, it would allow you to take more than a cup, but you could still take just a cup.


Is a motor considered to be a load?

Yes. A motor is considered a load on the power supply it runs from.


Can you use a train transformer to power a CB radio?

For any power-supply application you have to match the supply's specifications to the load's. If the supply is AC, the load should be designed to use AC, and the same goes for DC to DC. The voltages should match. The current the load requires must be less than or equal to the current the supply can provide. Lastly, the wires connecting the supply to the load must be sized to carry the current required by the load.


Will a 15V DC 1000mA power supply work for a device needing 15V DC at 1500mA?

No. The rule is straightforward: the supply's rated output voltage must match that of the intended load (here both are 15 V DC), but its rated current must meet or exceed the load's requirement. A 1000 mA supply cannot safely deliver the 1500 mA the device needs, so it would be overloaded.


Can you use a 12V DC 1.6A power supply on devices that require 12V DC at 50 mA?

You have to be careful, because many small power supplies are poorly regulated, which means they deliver excessive voltage under a light current load. A nominal 12 V supply might put out 16-17 V when the load is only 50 mA, so unless you can check this, it's best not to use it.
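
A quick way to apply that advice, assuming you can put a meter on the supply under a light load (the 16.5 V reading is a made-up example):

    # Measure the output at (or near) your actual light load before trusting the label.
    rated_v = 12.0
    measured_v = 16.5   # assumed multimeter reading at a 50 mA load

    overshoot = (measured_v - rated_v) / rated_v
    print(f"{overshoot:.0%} over nominal")   # ~38%: far too high for a 12 V device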


How do you calculate load factor when plant machinery is connected to two supply sources, i.e. one from the government electricity board and another from a diesel generator set?

Load factor = average load ÷ peak load, where average load = kWh (energy) ÷ number of operating hours. If the local public electric company cannot supply a certain plant during peak hours, the solution is for the end user to run a secondary prime power source to meet the demand load. If the arrangement is 12 hours on normal (utility) power and 12 hours on the genset, the load factor works out the same, provided the average load is clearly defined.
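
A quick worked example of those two formulas; the energy, hours, and peak figures are made up for illustration:

    # Load factor = average load / peak load; average load = energy / hours.
    energy_kwh = 96_000.0   # assumed energy consumed over the period
    hours = 24 * 30         # 30 days of continuous operation
    peak_kw = 250.0         # assumed peak demand

    average_kw = energy_kwh / hours      # 96000 / 720 = ~133.3 kW
    load_factor = average_kw / peak_kw   # ~0.53

    print(f"Average load {average_kw:.1f} kW, load factor {load_factor:.2f}")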



What is the minimum power supply rating for a PC that uses 225 watts while booting but 175 watts when running?

Personally, I wouldn't use less than a 300 Watt power supply in that situation. But then, I never use less than a 500 Watt power supply when replacing a power supply or building a computer. The advantage is, the larger power supply can easily handle the load and will not run as hot. Since electronic components typically fail more rapidly when they get hot, the larger power supply will usually last much longer. But that's just a suggestion.
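
The sizing logic in that answer can be made explicit; in this sketch the 30% headroom margin is an assumed rule of thumb, not something from the answer:

    # Size the PSU from the worst-case draw plus headroom, rounded up to a
    # standard rating.
    boot_watts, run_watts = 225, 175
    margin = 1.30   # assumed 30% headroom

    minimum = max(boot_watts, run_watts) * margin   # 292.5 W
    print(f"Pick at least {minimum:.0f} W - a standard 300 W unit or larger")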


What are the 3 basic types of connectors on a PC power supply?

AT power supply - still in use in older PCs.
ATX power supply - commonly in use today.
ATX-2 power supply - a relatively new standard.