
Current is determined by the load. If your load device requires 450 mA, then it will draw that amount of current from the supply.
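The point above can be sketched numerically: for a simple resistive load, Ohm's law fixes how much current the load pulls from the supply. The helper name and example values below are illustrative assumptions, not from the answer.

```python
# Sketch: the load, not the supply, determines the current drawn.
# (Hypothetical helper; assumes a plain resistive load.)
def current_drawn_a(supply_voltage_v: float, load_resistance_ohm: float) -> float:
    """Ohm's law: I = V / R."""
    return supply_voltage_v / load_resistance_ohm

# A load that needs 450 mA from a 12 V supply looks like roughly 26.7 ohms:
print(round(current_drawn_a(12.0, 12.0 / 0.450), 3))  # ~0.45 A
```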

Wiki User · 12y ago


Related Questions

Can a person replace a 12V 350 mA adapter with a 12V 450 mA adapter?

Yes. The adapter's mA nameplate rating is the maximum current that can be drawn from it; the load plugged into the adapter governs the actual draw. If the adapter with the 350 mA maximum worked, then the 450 mA adapter has more than enough capacity for the connected device.

Just make sure the replacement adapter has the same output voltage and output type. Some are rated at 12 VAC at 450 mA, while others are 12 VDC at 450 mA. The output voltages and types must match for the adapters to be interchangeable.
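The rule above can be sketched as a small compatibility check. The helper and its parameter names are hypothetical, introduced only to illustrate the answer's rule: matching voltage and AC/DC type, with the replacement's current rating at least as large as the original's.

```python
# Sketch (hypothetical helper): is a replacement adapter a safe swap?
# Rule from the answer: voltage and output type (AC/DC) must match,
# and the replacement's mA rating must be >= the original's.
def adapter_ok(orig_v, orig_ma, orig_type, new_v, new_ma, new_type) -> bool:
    return new_v == orig_v and new_type == orig_type and new_ma >= orig_ma

print(adapter_ok(12, 350, "DC", 12, 450, "DC"))  # True: safe replacement
print(adapter_ok(12, 350, "DC", 12, 450, "AC"))  # False: AC vs DC mismatch
```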


How long for first charge of a Ni-Cd 7.2V 700mAh?

It depends on the charging rate, in amps, of your charger. Very simply, if the charger current is rated at 450 mA, then a 450 mAh battery would require 1 hour to charge: take the battery rating (450 mAh) divided by the charger current rating (450 mA), which equals 1 hour. If the charger is rated at only 100 mA, it would take 4.5 hours to charge the same battery. For the 700 mAh pack asked about, 700 mAh / 450 mA is roughly 1.6 hours; real Ni-Cd charging typically takes somewhat longer to allow for charging losses.
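The arithmetic above reduces to one division. A minimal sketch, ignoring charging losses (the helper name is an assumption, not from the answer):

```python
# Sketch: charge time = capacity (mAh) / charge current (mA).
# Ignores charging inefficiency; real Ni-Cd charging runs longer.
def charge_time_hours(capacity_mah: float, charge_current_ma: float) -> float:
    return capacity_mah / charge_current_ma

print(charge_time_hours(450, 450))           # 1.0 hour
print(charge_time_hours(450, 100))           # 4.5 hours
print(round(charge_time_hours(700, 450), 2)) # ~1.56 hours for the 700 mAh pack
```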


What is the voltage of Nintendo ds?

The rating label reads 5.2V, 2.3W; the battery is 3.7V; the adapter input is 5.2V at 450mA. Not sure exactly what all of that means, but it may answer your question.


What harm is there if you use 5 v - 850 mA when 5v - 450 mA is the standard?

It really depends on what you are using. If I understand correctly, you have an electronic device which usually requires 5V and 450mA, and a power supply that provides 5V with up to 850mA available.

With a lot of electronics these days it's important to make sure that the VOLTAGE is correct. If the voltage is correct, chips work the way they're supposed to; variations in voltage can lead to over- or under-performing electronics, and potential damage if the difference is too great. Judging from your question the voltage is fine, so on to the issue of amperage.

Many electronic devices (such as mobile/cellular phones and notebook computers) include charging circuits that govern how the device takes in its power. Take an HP notebook: many run on 19V, 4.74A chargers. The notebook requires the 19V for all the chips inside to function optimally, but only draws as much current (amps) as it needs. So if you're doing something low-power that only requires 2A, you can run it off a 19V, 2A charger and it'll still be fine. On the flip side, if you have a bigger charger, say 7A, the notebook will still only draw the 2A, but the other 5A are available if the notebook needs them. In portable electronics, extra amperage capacity generally means you'll be able to charge your battery faster.

HOWEVER, some simple devices (such as electric motors without control circuits) are limited only by what the supply can deliver. A motor sized around a 1A supply, fed from a 2A supply, can draw more current than intended under heavy load and run harder than it was designed to, which could potentially be dangerous. Outside of motors I can't think of many examples of devices that will automatically use all the amperage available, so I think you'll be OK. Just take a look at your device: if the power is essentially plugged directly into a motor, then perhaps be a little careful; outside that, you'll be fine.
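The compatibility rule running through this answer can be sketched as a check: voltage must match exactly, while the supply's current rating only needs to meet or exceed the device's draw, since extra rated amps are headroom rather than current forced into the device. The helper below is a hypothetical illustration, not from the answer.

```python
# Sketch (hypothetical helper): a supply is compatible when its voltage
# matches the device and its current rating >= the device's draw.
# Extra rated amps are headroom, not forced into the device.
def supply_ok(device_v, device_draw_a, supply_v, supply_rating_a) -> bool:
    return supply_v == device_v and supply_rating_a >= device_draw_a

print(supply_ok(5, 0.450, 5, 0.850))  # True: 850 mA supply, 450 mA draw
print(supply_ok(5, 0.450, 5, 0.300))  # False: supply rating too small
print(supply_ok(5, 0.450, 9, 0.850))  # False: wrong voltage
```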